
It's time to get ready for HTTP/2

Erik Sherman, Journalist, Independent

 

HTTP/2 might make the web faster, but so far the shift to HTTP/2 has been slow. According to W3Techs.com, only 1.4 percent of websites use HTTP/2.

That's likely to change in 2016 as the major web servers incorporate HTTP/2 into their regular releases. Although the shift to HTTP/2 won't be horrendous, you do need to update your design strategies.


The HTTP/1.1 problem

"In the 90s, when HTTP was designed, we were on telephony modems that were very slow, bandwidth-wise, compared to what people use today," says Ragnar Lonn, founder of testing company Load Impact. "Web pages have gotten more complicated and heavier. But network delay has not improved a lot," he says, even as bandwidth has radically increased in many parts of the world.

Web pages have gotten heavier, but network delay has not improved a lot.
Ragnar Lonn, founder, Load Impact

The combination of complex pages and network latency has become a huge online speed bump. According to the HTTP Archive, the average web page today is well over 2 megabytes in size and requires about 100 HTTP requests and 20 JavaScript requests.

Although bandwidth has increased, "delay is much harder to improve," Lonn says. A 50 millisecond latency, multiplied across the average page's requests, can translate into a 10-second load time.

The reason is the tight funneling of requests to the server. "We open at most six connections per origin [for HTTP 1.1]," says Ilya Grigorik, a web performance engineer at Google and author of High Performance Browser Networking. "That's a pretty significant source of latency [as requests line up to be processed]," he says.
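The queuing Grigorik describes can be sketched with simple arithmetic. The figures below are the ones quoted in this article (about 100 requests, a six-connection-per-origin limit, 50 ms round trips), and the result counts only request latency; real page loads add DNS lookups, TCP and TLS handshakes per connection, server time, and transfer time on top.

```python
# Back-of-envelope sketch using the article's figures. Without pipelining,
# each HTTP/1.1 connection serves one request per round trip, so requests
# queue up roughly REQUESTS / CONNECTIONS deep.
REQUESTS = 100     # average requests per page, per the HTTP Archive
CONNECTIONS = 6    # HTTP/1.1 browsers open at most ~6 connections per origin
RTT_MS = 50        # assumed network round-trip time

round_trips = -(-REQUESTS // CONNECTIONS)   # ceiling division
latency_only_ms = round_trips * RTT_MS

print(round_trips, latency_only_ms)   # 17 850
```

Even this lower bound shows how latency, not bandwidth, dominates: the 850 ms here is pure waiting for round trips, before a single handshake or byte of content is accounted for.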


Workarounds don't always work

Developers used many tricks to work around the imposed limits as websites grew in size. The techniques split into two categories: expanding the number of potential requests and reducing the count of individual downloaded files. Common examples include:

  • Domain sharding places page resources onto different servers to expand the number of origins and the number of effective connections that can be made for a single page.
  • Inline linking pulls resources from other sites, which, like sharding, increases the number of origins and therefore the number of connections the browser can open.
  • Image sprites combine multiple images into a single one that the web page then shows, one section at a time, to create the impression of different images.
  • Concatenation combines JavaScript files into a single one that, like sprites, is loaded only once.

However, these techniques "can have pretty bad repercussions," Grigorik says. Combined files meant everything was downloaded, no matter how little the browser actually needed, increasing network traffic. Any change to one part of a sprite or concatenated JavaScript file meant reloading the entire compilation.
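The cache-invalidation repercussion can be made concrete. A minimal sketch, assuming a bundle cached by its content hash (or an equivalent fingerprinted filename): editing a single file changes the fingerprint, so every client re-downloads the entire bundle. The file names and contents are illustrative only.

```python
import hashlib

# Hypothetical "bundle" of concatenated JavaScript files, keyed by name.
files = {"a.js": b"function a() {}", "b.js": b"function b() {}"}

def bundle_hash(files):
    # Concatenate in a stable order and fingerprint the result, as a
    # build tool would when generating a cache-busting filename.
    return hashlib.sha256(b"".join(files[k] for k in sorted(files))).hexdigest()

before = bundle_hash(files)
files["a.js"] = b"function a() { /* one-line fix */ }"   # touch one file
after = bundle_hash(files)

print(before != after)   # True: the whole bundle is now stale for every client
```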

Sharding and inline links caused different problems. "Every single connection requires memory resources, buffers, and stuff on both sides," Lonn says. Desktop machines might have the extra capacity, but mobile devices such as phones and tablets generally weren't so tricked out. Servers, as hubs in the process, needed ever-increasing computing resources to handle the growing connections.

Simultaneous connections also required that someone track the total flow of data to avoid problems with resource timing and dependencies. Plus, HTTP/1.1 request headers are always sent uncompressed and typically run about 800 bytes. "If you multiply that out by hundreds of requests, it adds up pretty quickly," says Grigorik.
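Grigorik's multiplication is easy to check. Using the article's rough figures of 800 bytes of headers per request and 100 requests per page:

```python
# Approximate per-page header overhead under HTTP/1.1, where request
# headers are sent uncompressed on every request.
HEADER_BYTES = 800   # typical uncompressed request-header size
REQUESTS = 100       # average requests per page

overhead = HEADER_BYTES * REQUESTS
print(overhead)      # 80000 bytes, roughly 78 KiB of headers per page load
```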

SPDY and HTTP/2 provide an answer

The twin growth of Internet use and web page complexity was slowing everything down. A few years back, during the development of the Chrome browser, Google realized that there was a problem. "That was the ah-ha moment," Grigorik says. "We needed to rethink the HTTP protocol."

Google developed SPDY, the company's attempt to improve HTTP, and submitted it as a candidate for the next revision of HTTP. "Ultimately, one of the earlier SPDY drafts was adopted," he says.

Many features should help speed web performance. One connection can handle multiple requests in a multiplexed fashion, as the server effectively adds a packet wrapper to track the request and order of data within each request. The browser can then ensure all the downloaded content is received correctly. "With HTTP/2, you can multiplex as many files as you want," says Grigorik. "It means you have to do less as a web developer."
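The "packet wrapper" described above is HTTP/2's binary framing layer. A sketch of its shape, following the frame header defined in RFC 7540: every frame carries a 9-byte header whose stream identifier lets frames from many requests be interleaved on one connection and reassembled at the other end. The helper function below is illustrative, not a real client.

```python
import struct

def frame_header(length, frame_type, flags, stream_id):
    """Build an HTTP/2 frame header (RFC 7540): 24-bit payload length,
    8-bit type, 8-bit flags, then a reserved bit plus 31-bit stream id."""
    return (struct.pack(">I", length)[1:]          # low 3 bytes of the length
            + bytes([frame_type, flags])
            + struct.pack(">I", stream_id & 0x7FFFFFFF))

# Two DATA frames (type 0x0) from different streams; because each header
# names its stream, the frames can be interleaved freely on one connection.
h1 = frame_header(16, 0x0, 0x0, stream_id=1)
h3 = frame_header(16, 0x0, 0x0, stream_id=3)

print(len(h1), h1[-1], h3[-1])   # 9 1 3
```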

With HTTP/2, you can multiplex as many files as you want. It means you have to do less as a web developer.
Ilya Grigorik, web performance engineer, Google

"By using one connection per origin, all of a sudden you have a much more network-friendly protocol," says Mark Nottingham, principal architect at Akamai and chair of the IETF HTTP Working Group.

Under HTTP/2, a site can actively push out content that it expects will be requested by a user. That can speed loading by eliminating added back-and-forth latency. Browsers, in turn, can decline to load content they don't need. Servers can also prioritize content. For example, a server might send images above the fold first, so they appear on the screen as others below are loaded to be available when someone scrolls down. Headers are compressed as well to reduce bandwidth consumption.
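One common way a server decides what to push is a `Link: rel=preload` response header, which some HTTP/2 servers and CDNs interpret as a push hint for the named resources. A hypothetical response for the above-the-fold scenario (the paths are placeholders):

```http
HTTP/1.1 200 OK
Content-Type: text/html
Link: </styles/above-the-fold.css>; rel=preload; as=style
Link: </images/hero.jpg>; rel=preload; as=image
```

A client that already has these resources cached can decline the pushed streams, which is the browser-side "decline to load" behavior mentioned above.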

By using one connection per origin, you have a much more network-friendly protocol.
Mark Nottingham, chair, IETF HTTP Working Group

As for mobile, "the benefits are more pronounced," says Nottingham. "Mobile apps are very interested in the protocol."

But there is no single set number that will indicate the improvement a site might see. Google's sites have seen between 10 percent and 50 percent improvement, according to Grigorik. "There are a lot of variables," he says.

Adoption of HTTP/2

At the moment, SPDY has some adoption advantages over HTTP/2, but that will change. According to Owen Garrett, head of products at NGINX, one of the major web server vendors, about 80 percent of browsers in use currently support SPDY, compared to 60 percent for HTTP/2.

"HTTP/2 includes many of the capabilities of SPDY," Garrett says. "We're seeing a slow uptake of HTTP/2 for a couple of reasons. First, it's still a very early emerging standard. It's not supported by as many browsers. And there are some challenges supporting it in servers." For example, negotiating HTTP/2 over TLS requires OpenSSL 1.0.2, which is not yet widely distributed.

There are some challenges supporting HTTP/2 in servers.
Owen Garrett, head of products at NGINX
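As an illustration of the server-side setup Garrett describes, a minimal nginx sketch (hostname and certificate paths are placeholders): nginx has shipped HTTP/2 support since version 1.9.5, and because browsers only speak HTTP/2 over encrypted connections, it is enabled alongside TLS. ALPN negotiation additionally needs the server built against OpenSSL 1.0.2 or later.

```nginx
server {
    # "http2" on the TLS listener enables HTTP/2 for clients that support
    # it; older clients fall back to HTTP/1.1 over the same port.
    listen 443 ssl http2;
    server_name example.com;                              # placeholder

    ssl_certificate     /etc/ssl/certs/example.com.crt;   # placeholder paths
    ssl_certificate_key /etc/ssl/private/example.com.key;
}
```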

As vendors ship products with full HTTP/2 support early in 2016, the push to switch will become strong. "We're already seeing numbers from people like Mozilla showing that their telemetry shows 14 percent or 15 percent of the requests being made now are HTTP/2," Nottingham says. But change brings challenges.

"It's a complicated situation with lots of moving parts and issues," says Garrett. "Organizations should be planning now how they're going to support HTTP/2."

2016 puts a deadline on deployment

Given Google's plans to drop support for SPDY in early 2016, HTTP/2 will effectively become the only game in town. But that doesn't mean all the web will move to the new version.

"I don't think anyone is going to tell you that we're going to shift all traffic to HTTP/2," Nottingham says. Embedded web servers and other types of implementations that receive infrequent updates mean that not everyone will have access to the new standard. Backward compatibility in HTTP means that browsers and servers will continue to support HTTP/1.1. But there will be trade-offs in trying to keep everyone happy.

"There are other decisions to be made, like how you do prioritizations of responses," Nottingham says. "You won't get the full performance benefits in HTTP/2 if you use [HTTP/1.1 workaround techniques]. You can remove all of those optimizations for HTTP/2, knowing legacy clients won't get the great experience."

Max Sullivan, lead web developer at ARTCO by J, says that although his company has experimented with HTTP/2, it's waiting for Microsoft to release Windows Server 2016 before placing the new standard in production. "I think it's very important for anybody in this business," he says. "Overall we're looking at reducing a lot of network traffic. We have a lot of bandwidth available but [there's] always latency."

Overall we're looking at reducing a lot of network traffic.
Max Sullivan, lead web developer, ARTCO by J

Testing is key to the HTTP/2 transition

Rollout will be a matter of testing. "We'll have to do a balance between both to see what works best," Sullivan says. The real question will be for existing websites that make use of HTTP/1.1 workarounds.

Changing sharding is relatively easy, but other modifications are harder. "Personally, for sites that use spriting, I'd leave them as it is now and not rework because there is more involved with that," says Sullivan. Changes to JavaScript concatenation will be a case-by-case decision. "As we move into it, it's more looking at the cost/benefit for taking the time to redo it."

One consideration is that while the HTTP/2 specification doesn't require encryption, all the browser implementations do. "It will be a little bit more difficult to debug at the protocol level," Sullivan says. However, many sites are already encrypted. "It would be the same approach any developer would use now with an encrypted site. There's no additional equipment. The software most people use is open source. You just can't see it all in plain text, and a lot of people aren't dealing with that anyway."

A slow but inexorable tide

In all, the shift to HTTP/2, while coming, will take time to digest. "It will take a while to build up the best practices and tools," Nottingham predicts. That will happen at conferences, communities, and through practical experience. Developers will need to test performance and make decisions about return on investment in reworking sites. However, now is the time to experiment, because the HTTP/2 tide is coming, and having to deal with it is a certainty.

 
