The State of JavaScript

How are we actually using JavaScript today? Which features are widely adopted, and which less so? What performance impacts is JavaScript really having on page load times?

In this presentation Houssein Djirdeh details how we’re really using JavaScript today.

The State of JavaScript

Houssein Djirdeh, Developer Advocate, Google

JavaScript constantly evolves, and the full JavaScript ecosystem would probably take a book to cover. So Houssein is going to focus on JS on the web – although even that will only scratch the surface.

So what’s the data source for this talk? HTTP Archive tracks over five million URLs on a monthly basis. Note that only each site’s home page is tracked, as the dataset is already massive.

httparchive.org

So how can you read the data?

1. Google’s BigQuery data warehouse tool – allowing any query without having to download and manage the (extremely large) raw data yourself (a brief example follows this list)
2. Preset monthly reports covering high level trends
3. Almanac – yearly analysis starting from 2019, with community input into the analysis
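To give a feel for the BigQuery route, here is a minimal sketch of querying the public HTTP Archive dataset from Node.js with Google’s @google-cloud/bigquery client. This is not a query from the talk; the table and column names (summary_pages, bytesJS) are assumptions about the public httparchive dataset and may differ between monthly crawls.

const { BigQuery } = require('@google-cloud/bigquery');

async function medianJsBytes() {
  const bigquery = new BigQuery();

  // Median JS bytes per page for the July 2020 mobile crawl (assumed table name).
  // APPROX_QUANTILES computes percentiles server-side rather than pulling
  // millions of rows to the client.
  const query = `
    SELECT APPROX_QUANTILES(bytesJS, 100)[OFFSET(50)] AS median_js_bytes
    FROM \`httparchive.summary_pages.2020_07_01_mobile\``;

  const [rows] = await bigquery.query({ query });
  console.log(rows[0]);
}

medianJsBytes().catch(console.error);

The point is that BigQuery does the heavy lifting: you only pay (in time and query cost) for the columns you scan, not for downloading the whole archive.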

Houssein created the first JavaScript chapter of the Almanac, with help from a lot of contributors. This talk revisits the same topics with re-queried data from July 2020.

Almanac data comes from several tools:

1. WebPageTest, a performance testing tool
2. Lighthouse, a performance and profiling tool
3. Wappalyzer, which detects the technologies used on a page

Some areas are also informed by the Chrome UX Report.

This talk focuses on the data from WebPageTest.

So let’s dive into some JavaScript stats. JS is the most expensive resource we send to browsers – it has to be downloaded, parsed, compiled and executed. While browsers have decreased the time it takes to execute, download remains expensive.
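As a rough illustration (not something shown in the talk), you can get a quick sense of what a given page ships by summing script transfer sizes with the Resource Timing API in the browser console:

// transferSize is the compressed size on the wire; it can be 0 for cached
// responses or for cross-origin scripts served without Timing-Allow-Origin.
const scriptBytes = performance
  .getEntriesByType('resource')
  .filter((entry) => entry.initiatorType === 'script')
  .reduce((total, entry) => total + entry.transferSize, 0);

console.log(`Scripts transferred: ${(scriptBytes / 1024).toFixed(1)} KB`);

This misses inline scripts and the HTML document itself, but it is a handy first look at a page’s JS payload.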

So how much do we actually use? The biggest sites (90th percentile) send over a megabyte even when compressed for final transfer; and at the 50th percentile it’s still over 450 KB.

So it feels like we might send too much JS… but what is “too much”? That depends on the capabilities of the browsing device. We are sending much the same amount of JS to mobile as we do to desktop.

The impact of that is seen in V8 main thread processing times, which are significantly longer on mobile devices; and the gap gets bigger the more JS is sent.

Other interesting metrics:

  • number of requests shipped – with HTTP/2, many requests can be multiplexed in parallel over a single connection, so shipping code in smaller chunks can transfer faster
  • first vs third party requests – most sites make more requests for third-party code than for their own first-party code! This trend is backed up by data showing we also send more bytes of third-party code.

Compression is a key method of improving download time. Roughly 65% of sites use Gzip, 19.5% use Brotli, but over 15% don’t use compression at all.
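For context (a sketch, not something from the talk), both encodings are available out of the box in Node: the built-in zlib module has supported Brotli since Node 11.7, so a server can negotiate the best encoding the browser accepts. The bundle path below is a placeholder.

const http = require('http');
const fs = require('fs');
const zlib = require('zlib');

http.createServer((req, res) => {
  const accepts = req.headers['accept-encoding'] || '';
  const bundle = fs.createReadStream('./dist/bundle.js'); // placeholder path

  res.setHeader('Content-Type', 'application/javascript');

  if (accepts.includes('br')) {
    res.setHeader('Content-Encoding', 'br');
    bundle.pipe(zlib.createBrotliCompress()).pipe(res);
  } else if (accepts.includes('gzip')) {
    res.setHeader('Content-Encoding', 'gzip');
    bundle.pipe(zlib.createGzip()).pipe(res);
  } else {
    bundle.pipe(res); // no compression negotiated
  }
}).listen(8080);

In practice a CDN or web server usually handles this, often with Brotli assets pre-compressed at build time, but the sketch shows how little is needed to avoid shipping uncompressed JS.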

We can also look at feature usage.

  • Less than 1% of sites use script type="module" (see the sketch after this list)
  • 14% of sites ship sourcemaps in production (for at least one script on the page)
  • Market share of libraries and frameworks – React, Angular and Vue make up just 6.53% of sites in the archive, while a whopping 83.16% use jQuery.
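On the module point, here is a small sketch (not from the talk) of the common modern/legacy split expressed in JavaScript: browsers that understand type="module" also expose a noModule property on script elements, which doubles as a coarse ES2015+ check. The file names are placeholders.

const script = document.createElement('script');

if ('noModule' in script) {
  // Modern browsers: load untranspiled ES modules.
  script.type = 'module';
  script.src = '/js/app.modern.mjs';
} else {
  // Older browsers: load a transpiled fallback bundle.
  script.src = '/js/app.legacy.js';
}

document.head.appendChild(script);

The declarative equivalent is a type="module" script paired with a nomodule fallback script tag in the HTML.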

If you want to dive specifically into the performance of web frameworks, check out Houssein’s project, Perf Track.

The first edition of the Almanac was a huge commitment, with 93 contributors. They are always keen to hear from people who are willing to help. Get in touch if that might be you!

All this just scratches the surface. HTTP Archive and the Web Almanac are gold mines of information and Houssein hopes more people will take advantage of them.

@hdjirdeh