What’s So Great About HTTP 2.0?

Posted by Rachel Gillevet on January 20, 2014

The way we use the web has changed radically over the last few years. Where once a web page was composed of a few static resources loaded from a small number of servers, many pages are now complex combinations of resources loaded from multiple servers, with extensive interactive components.

HTTP, the Hypertext Transfer Protocol, was designed for an older web and was not built with today’s complex interactive sites in mind. While HTTP 1.1, the current version, has given great service, it’s time for something new. HTTP’s old-fashioned assumptions about the nature of the web cause sites to be slower and less responsive than they should be, forcing browser manufacturers and web developers to implement hacky solutions to work around its deficiencies.

The creation of the next version of HTTP, HTTP 2.0, is well underway, and even if the protocol doesn’t see widespread adoption in 2014, it’ll certainly be on the minds of many developers and web hosting companies.

HTTP 2.0 is based on Google’s SPDY protocol, which was designed to help mitigate some of the less helpful features of HTTP 1.1.

So, what can we expect from HTTP 2.0?


Stream Multiplexing

One of the reasons HTTP 1.1 is slow and resource intensive is the way it handles streams. As we’ve already said, web pages used to be much simpler, and HTTP 1.1 was not designed to handle the multiple streams of data that modern sites require. HTTP 2.0 includes the ability to multiplex streams, which means that servers and clients no longer have to open multiple TCP connections. Multiple requests can be sent over the same connection at the same time, and responses can be sent in an intelligent order (prioritization) rather than strictly in the order the requests arrived. The result should be lower resource usage on both ends of the connection and significantly reduced latency.
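To make the idea concrete, here is a toy sketch of multiplexing, not the real HTTP 2.0 framing layer: several responses are split into frames tagged with a stream ID, interleaved over one “connection,” and reassembled on the other end. The stream IDs and payloads are invented for illustration.

```python
from itertools import zip_longest

def frames(stream_id, payload, size=4):
    """Split a payload into fixed-size frames tagged with a stream ID."""
    return [(stream_id, payload[i:i + size]) for i in range(0, len(payload), size)]

# Three hypothetical responses that would share one connection.
responses = {
    1: b"<html>...</html>",     # HTML
    3: b"body { color: red }",  # CSS
    5: b"console.log('hi')",    # JavaScript
}

# Interleave frames from all streams onto a single "wire".
wire = [f for group in zip_longest(*(frames(s, p) for s, p in responses.items()))
        for f in group if f is not None]

# The receiver reassembles each stream from its frames by ID; per-stream
# frame order is preserved even though the streams are interleaved.
reassembled = {}
for stream_id, chunk in wire:
    reassembled[stream_id] = reassembled.get(stream_id, b"") + chunk

assert reassembled == responses
```

Because each frame carries its stream ID, a slow response never blocks the others; the frames simply interleave on the one connection.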

Header Compression

Previously, only the body of an HTTP message could be compressed. In HTTP 2.0, the headers, which carry metadata about each request and response, can also be compressed, reducing the amount of data that needs to be sent.
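Headers are repetitive text, so they compress well. HTTP 2.0’s header compression scheme is purpose-built rather than plain deflate, but running zlib over a typical (invented) request header block gives a rough feel for how much redundancy is there:

```python
import zlib

# A hypothetical, but typical, HTTP 1.1 request header block.
headers = (
    b"GET /style.css HTTP/1.1\r\n"
    b"Host: example.com\r\n"
    b"User-Agent: Mozilla/5.0 (X11; Linux x86_64) Gecko/20100101 Firefox/26.0\r\n"
    b"Accept: text/css,*/*;q=0.1\r\n"
    b"Accept-Language: en-US,en;q=0.5\r\n"
    b"Accept-Encoding: gzip, deflate\r\n"
    b"Cookie: session=abc123; theme=dark\r\n"
    b"\r\n"
)

# Compress at the highest level and compare sizes.
compressed = zlib.compress(headers, 9)
print(len(headers), "->", len(compressed), "bytes")
assert len(compressed) < len(headers)
```

Since nearly identical headers are re-sent with every request on a page, compressing them saves bandwidth on every round trip, not just once.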

Server Push

Server push allows the server to send multiple parallel responses to a single client request.

Consider how a web server currently handles loading a web page. It receives multiple connections for each resource: HTML, CSS, JavaScript, etc. Each of those connections uses server resources and time. One way around this is inlining, where all of those resources are embedded in one page that can be sent over one connection. That works, but a drawback is that those resources cannot then be cached and reused on other pages, which somewhat negates the value of inlining.

With HTTP 2.0, all those resources can be sent on the same connection without the problems caused by inlining, which will reduce latency and resource use.
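A toy model shows why inlining defeats caching while server push does not: two pages share one stylesheet, and with inlining the CSS bytes travel with every page, whereas a pushed resource lands in the client cache and is reused. The names and byte counts here are illustrative only.

```python
css = b"x" * 500    # hypothetical shared stylesheet (500 bytes)
page = b"y" * 1000  # hypothetical HTML payload (1000 bytes)

# Inlining: the CSS is embedded in each of the two pages, so it is sent twice.
inlined_bytes = 2 * (len(page) + len(css))

# Server push: the CSS is pushed alongside the first page, cached by the
# client, and reused when the second page loads.
cache = set()
pushed_bytes = 0
for _ in range(2):
    pushed_bytes += len(page)
    if "style.css" not in cache:
        pushed_bytes += len(css)
        cache.add("style.css")

assert pushed_bytes < inlined_bytes
print(inlined_bytes, pushed_bytes)  # → 3000 2500
```

The gap grows with every additional page that shares the resource, which is exactly the caching benefit inlining gives up.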

Mandatory Encryption

In response to last year’s controversy about security and privacy online, the HTTPbis working group has decided that HTTP 2.0 will only work with HTTPS connections. Unless a connection is using TLS (a.k.a. SSL), it will not be able to use HTTP 2.0. It appears that the goal is to use the other benefits of HTTP 2.0 as an incentive to encourage wider adoption of encryption. It’s a somewhat controversial decision, so we’ll have to keep an eye on what happens over the next year.

Those are the main points that developers and web hosts need to know, but if you’d like more detailed information, take a look at the complete HTTP 2.0 specification.


  • transpar3nt

    Encouraging encryption is a very positive thing but it should not be forced in order to use the benefits of the spec. There are a ton of websites that have no need for encryption at all, so the cost of buying a certificate and the server resources required to perform the encryption would be a waste.

    • http://www.wiredtree.com/ Rachel Gillevet

      You make a valid point. It will definitely be interesting to see what happens with the mandatory encryption decision.

    • Ian Armstrong

      I mean, StartSSL provides free certification. It’s not authoritative but it’s encrypted. Ultimately, my guess is that the providers will pick up the slack since the efficiency of HTTP2 will considerably outweigh the cost of automating SSL certificates. The larger hosts (EIG, GoDaddy, Rackspace, etc) will almost certainly go into the business of providing SSL, though the smaller hosts (WiredTree, LiquidWeb, etc) will continue to rely on the existing shops.

      That’s the thing about web standards. It’s almost never the end-user that picks up the complexity. Instead, that complexity is passed back to the market as increased price. In this case, I doubt you’ll even see that – since the 2.0 spec dramatically reduces load and that is the primary cost of doing business.