Reducing HTTP latency with Quick UDP Internet Connections (QUIC)

The QUIC Protocol

HTTP over UDP: an experimental investigation of QUIC
  • Gaetano Carlucci, Luca De Cicco, Saverio Mascolo
    Proc. of 30th ACM/SIGAPP Symposium On Applied Computing (SAC 2015), Salamanca, Spain, April 2015 (PDF) (Slides: PDF)

This paper investigates "Quick UDP Internet Connections" (QUIC), proposed by Google in 2012 as a reliable protocol on top of UDP aimed at reducing Web page retrieval time. We first check, through experiments, whether QUIC can be safely deployed in the Internet, and then we evaluate the Web page load time in comparison with SPDY and HTTP. We have found that QUIC reduces the overall page retrieval time with respect to HTTP in the case of a channel without induced random losses, and outperforms SPDY in the case of a lossy channel.


The evolution of the Web in the last decade has given end users a remarkable improvement in the Web navigation experience. To transport a Web page, the stateless HTTP protocol is employed. The first documented version, HTTP V0.9, appeared in 1991; it provided a single document request, which was sufficient to deliver a single hypertext document over a TCP connection. The advent of HTML changed the structure of a Web page, which is no longer a single document but the composition of several multimedia resources. HTTP/1.0 introduced the notion of HTTP headers, and HTTP/1.1 then added performance-oriented mechanisms such as keep-alive, pipelining, caching, and more. Recently, the idea of evolving HTTP/1.1 has attracted researchers and industry, who are now proposing HTTP/2.0 within the IETF working group HTTPbis. The main motivation for evolving HTTP/1.1 can be seen in the figure below, which clearly shows that the Page Load Time decreases linearly as the latency is reduced, whereas increasing the link capacity beyond a few Mbps brings no further reduction of the Page Load Time.

Page Load Time
High Performance Browser Networking by Ilya Grigorik

The HTTP evolution

The figure below gives a basic overview of the evolution of the HTTP protocol when downloading a modern Web page made of 5 resources, as shown in the figure. We show the benefit of HTTP/1.1 persistent connections and pipelining with respect to HTTP/1.0, then the HTTP/2.0 (SPDY) multiplexing and server PUSH, and finally the QUIC 0-RTT connection set-up.

'HTTP Evolution'
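The gains sketched in the figure can be illustrated with a back-of-the-envelope round-trip model (our own simplification, not taken from the paper): count the round trips each delivery scheme needs to fetch the page's 5 resources, ignoring transfer time and server processing.

```python
# Illustrative round-trip model (not from the paper): one RTT per
# request/response exchange, transfer time and slow start ignored.

RTT = 0.05  # seconds; the base RTT used later in the testbed

def ceil_div(a, b):
    return -(-a // b)

def http10(resources, connections=1):
    # HTTP/1.0: a fresh TCP handshake (1 RTT) before every request,
    # requests serialized over the available connections.
    rounds = ceil_div(resources, connections)
    return rounds * (1 + 1) * RTT

def http11_persistent(resources, connections=1):
    # HTTP/1.1 persistent connections: one TCP handshake per connection,
    # then one RTT per serialized request on each connection.
    return (1 + ceil_div(resources, connections)) * RTT

def http2_multiplexed(resources):
    # HTTP/2.0 (SPDY): one TCP handshake, then all requests multiplexed
    # in parallel over the single connection.
    return (1 + 1) * RTT

print(http10(5), http11_persistent(5), http2_multiplexed(5))
```

The model deliberately omits TLS handshakes and TCP slow start; those costs widen the gaps actually measured in the experiments below.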


The figure below shows the testbed employed to compare the performance of HTTP/1.1, HTTP/2.0 (SPDY), and QUIC when they are used to download a simple Web page containing only JPEG images, without any JavaScript code or CSS files.


HTTP server: Apache/2.4.10 with TLS 1.2.
SPDY server: nghttp2 with TLS 1.2. We have employed nghttp2, which supports SPDY 4, the Google implementation of HTTP/2.0.
QUIC server: the QUIC (version 24) server in the Chromium code base, with QUIC-Crypto.
Netshaper: sets the channel capacity and propagation delays.

Real experiments using the open source browser Chromium.
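The netshaper element can be reproduced on Linux with tc; the following is only a sketch under assumed parameters (the interface name eth0 and the queue settings are ours, not necessarily the authors' exact configuration):

```shell
IFACE=eth0   # assumed interface name on the shaping machine

# 25 ms one-way delay (50 ms base RTT when applied in both directions)
sudo tc qdisc add dev "$IFACE" root handle 1: netem delay 25ms

# 3 Mbps bottleneck capacity as a child token-bucket filter
sudo tc qdisc add dev "$IFACE" parent 1:1 handle 10: \
    tbf rate 3mbit burst 32kbit latency 400ms
```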

Testbed settings in the scenarios below:
- Buffer size: Q = BDP
- Base RTT: 50 ms
- Bottleneck capacity: 3 Mbps
- Loss-free channel
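For reference, the buffer size Q = BDP implied by these settings works out as follows (plain arithmetic on the numbers above):

```python
# Bandwidth-delay product (BDP) for the testbed settings above,
# kept in integer arithmetic for exactness.
capacity_bps = 3_000_000          # 3 Mbps bottleneck capacity
base_rtt_ms  = 50                 # 50 ms base RTT

bdp_bits  = capacity_bps * base_rtt_ms // 1000   # bits in flight
bdp_bytes = bdp_bits // 8
print(bdp_bytes)  # 18750, i.e. about 12 full-size 1500 B packets
```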

Experiment with HTTP/1.1

Page Load time: 2.03s

- Six TCP connections are opened (each one requires a TCP and a TLS handshake).
- One resource at a time is transferred over each connection.

Experiment with HTTP/2.0

Page Load time: 1.45s

- One TCP connection is opened. All the requests are sent when the DOM is loaded.
- Resources are multiplexed and sent simultaneously over the single TCP connection.

Experiment with QUIC

Page Load time: 1.14s

- One UDP connection.
- No handshake is required for a repeat connection (0-RTT connection set-up).
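The gap between the three page-load times is consistent with a simple connection set-up tally (our own accounting, assuming a full TLS 1.2 handshake of 2 RTTs and a repeat QUIC connection with the server configuration already cached):

```python
# Round trips spent on connection set-up before the first request can be sent.
# Assumptions: TCP handshake = 1 RTT, full TLS 1.2 handshake = 2 RTTs,
# repeat QUIC connection with cached server config = 0 RTTs.
RTT_MS = 50  # base RTT from the testbed

setup_rtts = {
    "HTTP/1.1 (each of the 6 TCP+TLS connections)": 1 + 2,
    "HTTP/2.0 / SPDY (single TCP+TLS connection)":  1 + 2,
    "QUIC (0-RTT repeat connection)":               0,
}

for stack, rtts in setup_rtts.items():
    print(f"{stack}: {rtts} RTTs = {rtts * RTT_MS} ms before the first request")
```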

Waterfall diagram legend