
Sustainable Reporting - Emissions

Daniel Hartley
Reading time: 4 to 5 minutes

Tracking carbon emissions in end-to-end tests.

Measuring the carbon emissions associated with websites and apps is in its infancy, but there are benefits to doing so.

The first is an appreciation of how emissions relate to bytes transferred.

The second is seeing how emissions fluctuate in response to changes in design and code. This can be done by recording emissions during end-to-end (E2E) tests.

Emissions per byte

In order to measure carbon emissions for this website, I'm using CO2.js from The Green Web Foundation (GWF). The simplest call to their API needs only a byte value.

If the site is hosted on servers running on renewable energy, the emissions will be lower. The GWF provides a helper function for checking whether a site is green hosted.

Here is an example for a website of median page weight.

import { hosting, co2 } from "@tgwf/co2"

// hosting.check resolves to a boolean
const green = await hosting.check("example.com") // placeholder domain
// median page weight is roughly 2,299 kB
const bytes = 2299 * 1000
const emissions = new co2().perByte(bytes, green)
Emissions Comparison: Green Hosting

  Green web host    Carbon dioxide emissions
  Yes               ~783 mg CO2
  No                ~905 mg CO2
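For intuition, the arithmetic behind figures like these can be sketched in plain JavaScript. This is only an approximation of the Sustainable Web Design model that CO2.js implements; every constant below is an assumption for illustration, and CO2.js maintains the authoritative coefficients.

```javascript
// Rough per-byte emissions estimate in the spirit of the Sustainable Web
// Design model. All constants are illustrative assumptions.
const KWH_PER_GB = 0.81;        // energy per gigabyte transferred
const GLOBAL_INTENSITY = 442;   // g CO2 per kWh, world-average grid
const RENEWABLE_INTENSITY = 50; // g CO2 per kWh for green hosting
const DATA_CENTRE_SHARE = 0.15; // fraction of the energy used by the host

// Returns grams of CO2 for the given number of bytes.
function perByteEstimate(bytes, green = false) {
  const kWh = (bytes / 1e9) * KWH_PER_GB;
  if (!green) return kWh * GLOBAL_INTENSITY;
  // Green hosting lowers only the data-centre share of the energy.
  return (
    kWh * DATA_CENTRE_SHARE * RENEWABLE_INTENSITY +
    kWh * (1 - DATA_CENTRE_SHARE) * GLOBAL_INTENSITY
  );
}

const bytes = 2299 * 1000; // median page weight, ~2,299 kB
console.log(Math.round(perByteEstimate(bytes, false) * 1000)); // 823 mg
console.log(Math.round(perByteEstimate(bytes, true) * 1000));  // 714 mg
```

The results differ slightly from the table because CO2.js applies its own, periodically updated coefficients.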

End-to-end testing

In order to record requests and calculate emissions, I created a simple helper class EmissionsTracker.

A call to the instance's single public method returns a summary of page metrics relevant to its emissions, including:

  • The number of requests
  • Whether the site uses green hosting
  • The grid intensity (either calculated from the request or set manually)
  • The total transfer size of requests in kilobytes (kB)
  • How long the page took to load in milliseconds (ms)
  • Emissions in milligrams of carbon dioxide

These values can be persisted and used to monitor the effect of code or design changes.
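A minimal sketch of that persistence, assuming a JSON history file and the metric names above (record, delta, and the file path are all hypothetical, not part of EmissionsTracker):

```javascript
import fs from "node:fs";

const FILE = "emissions-history.json";

// Append one test run's summary to a JSON history file.
function record(summary, file = FILE) {
  const history = fs.existsSync(file)
    ? JSON.parse(fs.readFileSync(file, "utf8"))
    : [];
  history.push({ date: new Date().toISOString(), ...summary });
  fs.writeFileSync(file, JSON.stringify(history, null, 2));
  return history;
}

// Compare the two most recent runs, metric by metric.
function delta(history) {
  if (history.length < 2) return null;
  const [prev, curr] = history.slice(-2);
  const diff = {};
  for (const key of ["requests", "kBs", "loadMs", "mgCO2"]) {
    diff[key] = curr[key] - prev[key];
  }
  return diff;
}

const history = record({ requests: 24, kBs: 533, loadMs: 938, mgCO2: 137 });
console.log(delta(history)); // null until at least two runs are recorded
```

A rising delta after a design change is the signal to investigate before shipping.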

I also created a test that can report on any website (it simply loads the given page). By default, I scroll to the bottom of the page, which may give a more honest account of what is happening.

node emissions-tracker/emissions-by-url.js -u <url> -v -lh

// -u: website URL or domain
// -v: verbosity
// -lh: also run a Lighthouse report
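A sketch of how such a script might read those flags (the real emissions-by-url.js parser isn't shown here and may differ):

```javascript
// Parse the flags described above from process.argv.
// -u <url>  website URL or domain
// -v        verbose output
// -lh       also run a Lighthouse report
function parseFlags(argv) {
  const flags = { url: null, verbose: false, lighthouse: false };
  for (let i = 0; i < argv.length; i++) {
    switch (argv[i]) {
      case "-u":
        flags.url = argv[++i] ?? null;
        break;
      case "-v":
        flags.verbose = true;
        break;
      case "-lh":
        flags.lighthouse = true;
        break;
    }
  }
  return flags;
}

console.log(parseFlags(process.argv.slice(2)));
```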

The test environment is Chrome under the control of Puppeteer.
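At its core, such a tracker listens for Puppeteer's response events and aggregates them. A sketch under that assumption - the summarise reducer and its field names are illustrative, and the Puppeteer wiring is shown only in comments:

```javascript
// Aggregate network responses into summary fields like those listed above.
function summarise(responses, loadTimeMs) {
  const bytes = responses.reduce((sum, r) => sum + (r.bytes ?? 0), 0);
  return {
    requests: responses.length,
    kBs: Math.round(bytes / 1000),
    loadMs: loadTimeMs,
  };
}

// Hypothetical Puppeteer wiring (not run here):
//
// const browser = await puppeteer.launch();
// const page = await browser.newPage();
// const responses = [];
// page.on("response", async (res) => {
//   const buffer = await res.buffer().catch(() => Buffer.alloc(0));
//   responses.push({ url: res.url(), bytes: buffer.length });
// });
// const start = Date.now();
// await page.goto(url, { waitUntil: "networkidle2" });
// console.log(summarise(responses, Date.now() - start));

console.log(summarise([{ bytes: 1500 }, { bytes: 500 }], 938));
// → { requests: 2, kBs: 2, loadMs: 938 }
```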


Here are the results from a few sites including this one.

Emissions Tracker Summary

  Website        Requests   kB     Load (ms)   Emissions (mg CO2)
  The GWF        24         533    938         137 (45)
  The PG         86         343    983         99 (35)
  The Guardian   110        2291   1533        897 (828)
  iNaturalist    59         2798   3906        771 (664)

  The GWF: The Green Web Foundation, The PG: The Public Good

Here is a comparison with the results of some popular online carbon calculators. All Emissions Tracker values are based on an automated scroll to the bottom of the page; the values without scrolling are given in brackets. This distinction is important, and one that is often ignored or left unexplored.

Emissions by calculator (mg CO2)

  Website        ET          EC     CN     WC
  The GWF        137 (45)    50     14     40
  The PG         99 (35)     90     9      60
  The Guardian   897 (828)   1069   296    1060
  iNaturalist    771 (664)   570    259    630

  ET: Emissions Tracker, EC: Ecograder, CN: Carbon Neutral Website, WC: Website Carbon

We can compare the bytes transferred value of the Emissions Tracker (ET) with Chrome DevTools (DT) and Lighthouse (LH).

N.B. Figures for Lighthouse are low for requests because only the page above the fold is analysed. When I run the Emissions Tracker, I scroll smoothly to the end of the page.
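The offsets for such a smooth scroll can be computed ahead of time. A sketch (names and step size are illustrative; in practice each offset would be applied with window.scrollBy inside Puppeteer's page.evaluate, pausing briefly between steps so lazily loaded resources are requested):

```javascript
// Compute successive scroll offsets from the top of the page to the
// bottom, moving a fixed step at a time.
function scrollOffsets(pageHeight, viewportHeight, step = 100) {
  const offsets = [];
  let y = 0;
  while (y + viewportHeight < pageHeight) {
    y = Math.min(y + step, pageHeight - viewportHeight);
    offsets.push(y);
  }
  return offsets;
}

console.log(scrollOffsets(1000, 800, 100)); // → [ 100, 200 ]
```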

Bytes transferred (kB)

  Website        ET     DT     LH
  The GWF        533    508    141
  The PG         343    224    169
  The Guardian   2291   2200   2259

  ET: Emissions Tracker, DT: Chrome DevTools, LH: Lighthouse API

The number of requests.

Request count

  Website        ET    DT    LH
  The GWF        24    21    23
  The PG         86    56    49
  The Guardian   110   114   123

  ET: Emissions Tracker, DT: Chrome DevTools, LH: Lighthouse API

And load time.

Load time

  Website        ET (ms)   DT (ms)   LH (ms)
  The GWF        938       1002      983
  The PG         983       877       1345
  The Guardian   1533      1240      1480

  ET: Emissions Tracker, DT: Chrome DevTools, LH: Lighthouse API

There are discrepancies in the results, and discrepancies between runs using the same measure. One of the largest - an order of magnitude - is between the results given by the Carbon Neutral Website calculator and the other calculators. The only calculator that gives similar results is GreenFrame, but that requires local installation or a subscription. Another site, ecoIndex, gives values an order of magnitude higher than the average: for The Public Good, for example, it calculates emissions of 1620 mg CO2.

Until there is consensus and certainty around numbers, I think we should be careful about publishing them. It is too often the case that a single figure gets picked up and repeated endlessly, as happened with the carbon emissions attributed to watching Netflix.

Living information architecture

The greatest benefit to me from measuring bytes and emissions was that I paid more attention to code running where others see and interact with it - in the browser.

The Information Architecture (IA) of a website, and how individuals navigate it, affects bytes transferred, processing time, and resources.

For example, this blog preloads linked pages. The home page has a lot of internal links which significantly increase its page weight (a mix of the number of bytes transferred and the number of requests).

But if you click on a visible link (above the fold) to internal content, you will see that the page loads almost instantly, with very few bytes being transferred.
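One common way to implement this kind of preloading (this blog's actual mechanism isn't shown here) is to inject link elements with rel="prefetch" for internal URLs:

```javascript
// Ask the browser to prefetch the given internal URLs by appending
// <link rel="prefetch"> elements to the document head.
function prefetchLinks(urls, doc = document) {
  return urls.map((href) => {
    const link = doc.createElement("link");
    link.rel = "prefetch";
    link.href = href;
    doc.head.appendChild(link);
    return link;
  });
}

// Usage in the browser (hypothetical paths):
// prefetchLinks(["/posts/emissions", "/posts/performance"]);
```

Prefetched pages are fetched at low priority and served from the cache on navigation, which is why the follow-up click transfers so little.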

I also cache pages. If you return to a page, it is served from a local cache, and the only network traffic will be to third parties (such as Cabin analytics).

The effectiveness of this strategy depends on how people use the site; whether they move between pages, or indeed whether they read more than one article in a single session. It's quite possible they won't read more than one but they may jump from page to page.

To observe this behaviour, open the Network tab in the developer tools of any browser.


Recording performance and sustainability metrics during end-to-end tests in (or similar to) the production environment is a way for developers to get closer to the experience of people using their site. It is more meaningful to engage with the living information architecture in the browser than dead artefacts.

Emissions tracking and digital sustainability in general cannot be viewed, however, in isolation, but should be considered alongside accessibility, performance, ethical factors and security.

And whilst some comparison with similar sites is useful, most is gained by observing change within a site. The best feature is sometimes the one that doesn't get built.

A snapshot of a page, or even of an entire site, is insufficient to judge its merit. iNaturalist, the platform for recording observations of nature, for example, scores poorly in some respects, but the site has changed little in fifteen years - a testament to good information architecture and clear aims.


Knowing how and when bytes are transferred and emissions accrued helps developers make good decisions about site architecture, especially as a site is modified and extended and code is refactored.

I was less convinced that displaying emissions would be useful for site visitors until I checked what happened when I scrolled through the posts on my Facebook home page. Since posts are added more quickly than I can scroll, the feed is effectively infinite. The initial page load was about 9 MB. After one minute of scrolling, I had downloaded:


I would like to see native emissions counters in browsers that aggregate emissions across sites. Whether this would be over a session or time interval would be up to us, as would the option to set a budget or cap on emissions. It would certainly help highlight the deleterious effect of devious and deceptive patterns like infinite scroll and video autoplay.

Finally, emissions reflect only a fraction of a website's impact on its environment. A full Digital Life Cycle Assessment (DLCA) would be needed to take into account water and land usage and adverse effects on people and nature, to name only a few considerations.

Published: Fri Jun 14 2024


Content length

When the content length is unavailable, I use the response byte length. However, this is the uncompressed value, and compression ratios are variable.

In order to compensate, I set default ratios for CSS (6), JS (2) and Other (5). These values can be overridden using command line variables.
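That compensation can be a simple lookup by resource type. A sketch using the defaults above (the function name and option shape are illustrative, and the command-line override mechanism isn't shown):

```javascript
// Default compression ratios per resource type, as described above.
// When the Content-Length header is missing, divide the uncompressed
// response byte length by the ratio to estimate the transferred size.
const DEFAULT_RATIOS = { css: 6, js: 2, other: 5 };

function estimateTransferSize(uncompressedBytes, type, ratios = DEFAULT_RATIOS) {
  const ratio = ratios[type] ?? ratios.other;
  return Math.round(uncompressedBytes / ratio);
}

console.log(estimateTransferSize(60000, "css")); // → 10000
console.log(estimateTransferSize(60000, "js"));  // → 30000
```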

Performance API

I initially used values from the Performance API (a more recent alternative is the PerformanceObserver API), but this returns a value of 0 bytes for requests to third parties.

When CORS is in effect, many of the timing properties' values are returned as zero unless the server's access policy permits these values to be shared. This requires the server providing the resource to send the Timing-Allow-Origin HTTP response header with a value specifying the origin or origins which are allowed to get the restricted timestamp values.
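On servers you control, the remedy is to send that header. A sketch with Node's built-in http module (the origin value and port are illustrative):

```javascript
import http from "node:http";

// Allow cross-origin pages to read full resource timing values,
// including transferSize, for responses from this server.
function withTimingAllowOrigin(res, origin = "https://example.com") {
  res.setHeader("Timing-Allow-Origin", origin);
  return res;
}

const server = http.createServer((req, res) => {
  withTimingAllowOrigin(res);
  res.end("ok");
});

// server.listen(8080); // start when deployed
```

A value of * shares the timing values with all origins; a specific origin restricts them to that site.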
