We explain why your GTmetrix Performance Score may be different from other Lighthouse-based tools.
Overview
The GTmetrix Performance Score is generated using Google Lighthouse and should be broadly similar to the Performance scores reported by other web performance tools.
However, there can sometimes be considerable differences between the GTmetrix Performance Score and what you may see in other tools. These differences stem from factors such as Lighthouse implementation, testing methodology, and test location.
While this article focuses on three major Lighthouse tools, the overall rationale extends to any tool that uses Lighthouse to produce performance data.
Here are some of the key differences between GTmetrix, PageSpeed Insights, web.dev, and WebPageTest.
Why does my Performance score vary from tool to tool?
In a nutshell, Performance scores vary because different tools use different methodologies and configurations.
Even though Lighthouse is the common component among the tools, there are considerable differences in how Lighthouse itself is implemented.
For example, a notable difference is the Google tools’ use of something called Lantern, which simulates your page’s loading behaviour under certain network conditions and can cause significant differences in Performance scores. More on this below.
Based on our observations, here’s how each tool generates a Lighthouse report:
| Tool | Test Methodology |
| --- | --- |
| GTmetrix | Loads page with real browser (Chrome) |
| WebPageTest | Loads page with real browser (Chrome) |
| web.dev | Uses headless/emulated browsers |
| Google PageSpeed Insights | Uses headless/emulated browsers |
Key Takeaway
In general, all tools have differences in hardware, connection speed, locations, screen resolutions, and test methodology.
GTmetrix uses a real browser to load your page with an Unthrottled Connection using our specific hardware and test options. Some of these options (location, connection speed, screen resolution, etc.) can be changed based on your requirements.
WebPageTest also uses a real browser to load your page, albeit with different default test options and configurations. Some of these options can also be changed depending on your requirements.
PageSpeed Insights and web.dev launch performance tests using headless/emulated browsers and use network throttling to simulate your page load under different network conditions (i.e., desktop and mobile). You cannot change the test options like location, connection speed, screen resolution, etc.
Moreover, the Google tools (PageSpeed Insights and web.dev) use Lantern, which results in significantly different page loading behaviours, yielding different Performance scores and test results.
¹ Note that WebPageTest offers two types of tests. Their default/standard test launches a real browser using the specified configuration, but uses its own calculations and measurements to capture the different Lighthouse metrics (i.e., Web Vitals) from the Chrome trace data. You can also run a dedicated Lighthouse test from within WebPageTest, whose results will differ from their standard test.
² Note that we’re assuming Google uses default Lighthouse in their tools. However, it is possible that modifications were made to suit their needs.
³ Note that we’re assuming the Google tools launch tests from these locations based on this communication.
Implementation Differences
There could be several implementation differences among the tools, including differences in browser version, Lighthouse version, when the test is stopped, etc.
As some of the performance metrics are very new, the scoring algorithms are continuously being tweaked, and these changes often roll out with newer browser versions. This can yield different results in each tool.
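One concrete source of variation is the scoring formula itself: the Performance score is a weighted average of individual metric scores, and the weights change between Lighthouse versions. The sketch below uses the Lighthouse v8 default weights for illustration; other versions weigh the metrics differently, so the same measurements can produce a different score.

```python
# Sketch: how Lighthouse combines metric scores into a single
# Performance score (a weighted average of 0-1 metric scores).
# Weights shown are the Lighthouse v8 defaults; they change
# between versions, which alone can shift scores across tools.

WEIGHTS_V8 = {
    "first-contentful-paint": 0.10,
    "speed-index": 0.10,
    "largest-contentful-paint": 0.25,
    "interactive": 0.10,
    "total-blocking-time": 0.30,
    "cumulative-layout-shift": 0.15,
}

def performance_score(metric_scores: dict) -> int:
    """metric_scores maps audit id -> score in [0, 1]."""
    total = sum(metric_scores[m] * w for m, w in WEIGHTS_V8.items())
    return round(total * 100)

# Slightly different measured values for TBT or LCP move the
# heavily weighted terms, and with them the overall score:
tool_a = {"first-contentful-paint": 0.95, "speed-index": 0.90,
          "largest-contentful-paint": 0.85, "interactive": 0.80,
          "total-blocking-time": 0.75, "cumulative-layout-shift": 1.0}
print(performance_score(tool_a))  # 85
```

Because TBT carries the largest weight, tools that measure different amounts of main-thread blocking (for example, due to different CPU speeds) tend to diverge the most.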
Each tool may also stop the test at different times, which could result in things like missed requests, different metrics, and overall, different results.
For example, if the page doesn’t finish loading after a certain amount of time, Lighthouse stops the test and finishes the analysis. The Google tools time out if First Contentful Paint (FCP) is not obtained within 15 seconds or if the page doesn’t load within 35 seconds; GTmetrix waits 30 seconds for FCP and times out if the entire test takes more than 2 minutes.
While the above factors can cause discrepancies in your Performance scores, there is another chief implementation difference that could impact your Performance scores with each tool.
Some Lighthouse-powered tools use Lantern to speed up test execution and page analysis. This is a different approach to loading pages as it simulates load behaviour as opposed to measuring real-world performance with actual hardware specifications.
What is Lantern?
Lantern is an algorithm that initially loads your page as fast as possible (i.e., Unthrottled CPU/Network) and then applies something known as simulated throttling.
This essentially simulates what the actual page load would have been under given CPU/network conditions.
It’s an effective model for Google and, for the most part, provides representative data while speeding up test execution. However, in some cases, it may not accurately represent real-world conditions.
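To make the idea concrete, here is a deliberately simplified estimate of what simulated throttling does. This is not Lantern’s actual model (Lantern replays the page’s full request dependency graph); it only illustrates the principle of taking a fast, unthrottled load and projecting it onto slower network conditions. All figures are illustrative assumptions:

```python
# Toy illustration of simulated throttling: scale an unthrottled
# load to an estimated load under a slower simulated network.
# NOT Lantern's actual algorithm; a conceptual sketch only.

def simulated_load_time(transfer_bytes: int, unthrottled_seconds: float,
                        round_trips: int,
                        bandwidth_bps: float, rtt_seconds: float) -> float:
    """Estimate load time under a slower simulated network."""
    transfer_time = transfer_bytes * 8 / bandwidth_bps
    latency_time = round_trips * rtt_seconds
    # Treat the unthrottled run (with near-zero network time) as
    # a stand-in for CPU/render time:
    cpu_time = unthrottled_seconds
    return cpu_time + transfer_time + latency_time

# A 1 MB page loaded in 0.4 s unthrottled, needing 10 round trips,
# projected onto a 1.6 Mbps / 150 ms RTT "slow 4G"-like profile:
estimate = simulated_load_time(1_000_000, 0.4, 10, 1.6e6, 0.15)
print(round(estimate, 2))  # 6.9
```

Any inaccuracy in the model’s assumptions (round-trip count, CPU scaling, request ordering) shows up directly in the simulated metrics, which is why simulated and observed results can diverge.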
How does Lantern affect test results?
In the case of the Google tools, your page is initially loaded as fast as possible, likely using powerful server-grade hardware.
The tool then uses Lantern to work backwards and applies simulated throttling to modify the results as if your page was being loaded on an “average” device.
This is known as “Simulated mode” and is significantly different from our implementation; GTmetrix loads pages in real time using specified device/connection options from the start and does not work backwards to modify the results.
In other words, GTmetrix generates results based on the observed performance at the time of the test. This method of operation is known as “Observed mode”.
Lantern ignores the observed performance and runs in Simulated mode, which may significantly affect your test results.
Lantern greatly shortens the time taken to generate a Lighthouse report as it gathers the data while the page loads.
However, this simulation of the browser execution may not be entirely accurate (compared to a real page load) and could result in significant differences in the various metrics on the Google tools compared to GTmetrix.
Learn more about Lantern here.
Do not compare them directly!
Keep in mind that no direct comparison is possible, as the tools differ in both implementation and scoring.
We do recommend using the other tools for a spot check and an overall idea of how your page is performing, but remember that Lighthouse is only the common component; each tool implements it in its own way.
The goal should be to understand what the differences are with each tool and adjust your expectations accordingly to gain an accurate picture of your web performance.
Our recommendation is to always monitor your page frequently so that you can keep track of your performance and take appropriate action when performance issues crop up.
Detailed Differences
While we’ve summarized the key differences above, you may be interested in a more detailed explanation for the Performance score differences. Continue reading for more specific details on each tool.
Hardware Differences
Different tools use different hardware specifications, which results in different loading behaviours of your page.
Here are the key hardware differences among the different tools:
| Tool | Hardware |
| --- | --- |
| GTmetrix | Has particular hardware specifications |
| WebPageTest | Uses mostly Amazon AWS EC2 servers (specifications unknown) |
| web.dev | Unknown hardware specifications |
| Google PageSpeed Insights | Unknown hardware specifications |
Key Takeaway
Hardware specifications will play a major role in your Lighthouse results. More powerful hardware means more resources to efficiently load your page. This helps your browser execute tasks faster (i.e., CSS/ JavaScript parsing and execution, rendering, etc.).
GTmetrix is very selective with respect to server requirements so that we can ensure a consistent performance benchmark across all locations.
WebPageTest mostly uses Amazon AWS EC2 servers; however, the hardware configuration is unknown.
The Google tools (PageSpeed Insights and web.dev) likely use powerful server hardware and then apply appropriate CPU throttling for their page analysis results.
PageSpeed Insights and web.dev also use Lantern, which can have a significant impact on Performance scores as the tools are set up to load your page as fast as possible and then simulate loading behaviour under certain network conditions.
Differences in CPU and Memory resources can particularly affect some Performance Score metrics like Time to Interactive (TTI) and Total Blocking Time (TBT) as they are CPU-focused (i.e., they rely on JavaScript execution).
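TBT in particular is sensitive to CPU speed because of how it is defined: every main-thread task longer than 50 ms contributes its time beyond 50 ms. The sketch below shows how the same JavaScript workload produces a much larger TBT on slower hardware (the 3x slowdown factor is an illustrative assumption):

```python
# Sketch of how Total Blocking Time (TBT) is derived from
# main-thread tasks: each task longer than 50 ms contributes
# its duration beyond 50 ms. Slower CPUs stretch task durations,
# so the same page yields very different TBT on different hardware.

BLOCKING_THRESHOLD_MS = 50

def total_blocking_time(task_durations_ms):
    return sum(d - BLOCKING_THRESHOLD_MS
               for d in task_durations_ms if d > BLOCKING_THRESHOLD_MS)

fast_cpu = [40, 60, 120]              # ms per task on powerful hardware
slow_cpu = [d * 3 for d in fast_cpu]  # same work, assumed 3x slower CPU

print(total_blocking_time(fast_cpu))  # 80
print(total_blocking_time(slow_cpu))  # 510
```

Note that the slowdown is not linear: the 40 ms task contributes nothing on the fast CPU but 70 ms once stretched past the 50 ms threshold, which is one reason hardware differences hit TBT-heavy scores so hard.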
Test Locations
The location used to conduct the tests can differ among the tools, which affects latency and other network connectivity related factors.
Here are the location differences with respect to each tool:
| Tool | Test Location |
| --- | --- |
| GTmetrix | Vancouver, Canada (default) |
| WebPageTest | Dulles, USA (default) |
| web.dev | Tests likely launched from US servers |
| Google PageSpeed Insights | Tests likely launched from one of 4 locations |
Key Takeaway
Test Locations can make a big difference to the Performance scores due to factors like distance from server (i.e., latency), network connection quality, etc.
GTmetrix and WebPageTest allow you to pick your test locations so that you can analyze web performance based on where your visitors are located. Both tools offer a number of different locations for you to choose from.
PageSpeed Insights and web.dev do not allow you to choose the location. PageSpeed Insights likely launches tests from one of 4 global locations, depending on the user’s proximity to them, while web.dev likely launches tests from its US servers.
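The latency cost of distance is easy to estimate: every network round trip (TCP/TLS handshakes, each blocking request) pays the full round-trip time to the server. The RTT figures below are illustrative assumptions, not measurements from any of the tools:

```python
# Rough illustration of why test location matters: each round
# trip pays the full latency to the server, so distance alone
# inflates load time even on a fast connection.

def latency_overhead_seconds(round_trips: int, rtt_ms: float) -> float:
    return round_trips * rtt_ms / 1000

# Same page (30 round trips), tested near vs. far from the server:
print(latency_overhead_seconds(30, 20))   # 0.6  (nearby region)
print(latency_overhead_seconds(30, 180))  # 5.4  (another continent)
```

Several seconds of difference from latency alone can easily move a page across Performance score thresholds, which is why testing from a location near your audience matters.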
Connection Speeds
All tools are likely to have connection speed differences, which impact the time taken to load your page. There may also be further implementation differences that affect each tool’s connection speed and results.
Here’s how each tool differs with respect to Connection Speeds:
| Tool | Connection Speed |
| --- | --- |
| GTmetrix | Uses an Unthrottled Connection (default) |
| WebPageTest | Uses a Cable connection (default) |
| web.dev | Uses Simulated throttling |
| Google PageSpeed Insights | Uses Simulated throttling |
Key Takeaway
Basically, faster connection speeds are likely to load your page faster, producing higher Performance scores.
GTmetrix, by default, uses an Unthrottled Connection but allows you to change the connection speed, based on your requirements.
WebPageTest, by default, uses a Cable connection and also allows you to change the connection speed, based on your requirements.
Web.dev runs emulated mobile-only tests using (what Google calls) a slow 4G connection.
In the case of PageSpeed Insights, the desktop and mobile tests are separated using their respective connection speeds.
Additionally, both Google tools use Lantern to simulate your page’s loading behaviour as if it were loaded on a mid-tier device/network.
Note that you cannot change the connection speed on either Google tool.
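A back-of-envelope calculation shows how much connection speed alone changes transfer time. The speeds below are common test-profile assumptions for illustration, not the tools’ exact configurations:

```python
# Back-of-envelope effect of connection speed on transfer time
# for a 2 MB page (speeds are illustrative profile assumptions).

def transfer_seconds(size_bytes: int, mbps: float) -> float:
    return size_bytes * 8 / (mbps * 1_000_000)

for label, mbps in [("Unthrottled (100 Mbps link)", 100),
                    ("Cable (5 Mbps)", 5),
                    ("Slow 4G (1.6 Mbps)", 1.6)]:
    print(f"{label}: {transfer_seconds(2_000_000, mbps):.1f} s")
```

The same 2 MB page takes a fraction of a second on an unthrottled link but roughly ten seconds on a slow 4G profile, so tools using different connection profiles will naturally report very different metrics.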
Real browser vs Headless/Emulated browser
Different tools can have fundamental differences with respect to the browser used to load your page. Consequently, loading behaviours may vary, resulting in different Performance scores.
Here’s what each tool uses, with respect to the browser option:
| Tool | Browser |
| --- | --- |
| GTmetrix | Loads page with real browser (Chrome) |
| WebPageTest | Loads page with real browser (Chrome) |
| web.dev | Uses headless/emulated browser |
| Google PageSpeed Insights | Likely uses headless/emulated browser² |
Key Takeaway
Browser differences impact your page load in many ways, which can result in different Performance scores.
GTmetrix and WebPageTest use real browsers to load your page. This is equivalent to the Chrome browser installed on your desktop.
PageSpeed Insights² and web.dev use headless/emulated browsers, which are essentially browsers without a user interface.
A real browser is the closest representation of a real-world user visiting your page. A headless/emulated browser, however, uses a script to load your page and capture the resulting loading data.
This could result in different loading behaviour and results for your pages.
¹ Note that you can only change screen resolution on a private instance of WebPageTest.
² Note that we’re assuming PageSpeed Insights uses an emulated browser but it’s not clear if this is the case.
Related Reading
You can read more about the new GTmetrix platform and other associated changes in the following articles:
- Everything you need to know about the new GTmetrix Report (powered by Lighthouse)
- GTmetrix PRO Plans Explained
- Why is my Performance Score always changing?
- I was scoring well with the Legacy GTmetrix before but now my grades have dropped. Why?
- Glossary of Web Performance Terms
Test with different countries, speeds and options
Get access to more Test Locations, Analysis Options and Connection Speeds!
Sign up for a Basic GTmetrix account and see how your site performs in more scenarios – It’s FREE!