You're on a quest to achieve green results as the Core Web Vitals rollout is coming closer. Can PageSpeed Insights be deceived to rank better?
I could have used "Do you need a high Lighthouse score for optimal Core Web Vitals?" as a title as well. But let's be honest: the current one is way more clickbait!
In case you did not know yet: Core Web Vitals wasn't just recently introduced, it is becoming a ranking factor as well. As of May 2021, to be more precise. All the more reason to get green pagespeed results, right?
How to fake PageSpeed insights for better ranking
The best way to cheat and mislead PageSpeed Insights is by looking at their user agent string and serving different code. You could, for example, just ship a screenshot instead of any code. No one will notice when running a PageSpeed Insights test.
If someone were then to use different tools, such as Lighthouse, WebPageTest, Pingdom Tools or GTmetrix, to measure the pagespeed score, outcomes might suddenly differ and you would be exposed. To stay undetected, you would have to recognize the user agents of all testing software out there. But that's not doable.
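To illustrate why this doesn't scale: cloaking boils down to a substring check on the `User-Agent` header. Here is a minimal sketch; the tool signatures are examples of strings these tools are known to send, not an exhaustive list, which is exactly the problem:

```javascript
// Hypothetical user agent sniffing, as a cloaker would do it.
// The substrings below are examples; the list can never be
// complete, so any unknown testing tool will expose the trick.
const TESTING_TOOL_SIGNATURES = [
  "Chrome-Lighthouse", // Lighthouse, which powers PageSpeed Insights
  "PTST",              // WebPageTest
  "Pingdom",           // Pingdom Tools
  "GTmetrix",          // GTmetrix
];

function isTestingTool(userAgent) {
  return TESTING_TOOL_SIGNATURES.some((sig) => userAgent.includes(sig));
}

// A cloaking server would then branch on the result, for example:
// if (isTestingTool(req.headers["user-agent"])) serveStrippedPage();
// else serveRealPage();
```

Any real-user monitoring tool, or simply a colleague spoofing a regular browser, falls straight through this check.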
Change the code instead
However, you could still serve your webpage, just with slightly different code. You would not be the first to do so.
Although just for their cookie widget, CookieBot.com is doing exactly this. As a website or webshop owner using CookieBot, you have to include a JavaScript file. This JavaScript file then loads a second file which shows the actual cookie dialog. But only after doing some user agent sniffing, as can be seen in this screenshot as part of this LinkedIn post. This means CookieBot will never load the second JavaScript file when the website embedding CookieBot is being tested. Clever bastards!
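The pattern looks roughly like this. To be clear, this is my own sketch of the idea, not CookieBot's actual code: the first, lightweight script sniffs the user agent and only injects the second, heavier script for what looks like a real browser. Lighthouse, which powers PageSpeed Insights, identifies itself with `Chrome-Lighthouse` in its user agent string:

```javascript
// Sketch of a "load the heavy part only for real browsers" pattern.
function shouldLoadDialog(userAgent) {
  // Lighthouse announces itself via "Chrome-Lighthouse" in the UA.
  return !userAgent.includes("Chrome-Lighthouse");
}

// In a browser, conditionally inject the second script.
if (typeof document !== "undefined" && shouldLoadDialog(navigator.userAgent)) {
  const script = document.createElement("script");
  script.src = "https://example.com/cookie-dialog.js"; // placeholder URL
  document.head.appendChild(script);
}
```

The testing tool never sees the second request, so the dialog's weight never shows up in the lab report, while every real visitor still pays for it.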
More recently, someone approached me on LinkedIn, as they wanted to share their achievements and wanted my opinion:
Fortunately, we found a way to reach 100 of 100 at the Google Page speed test service with the project we develop right now.
How do you like it?
Someone in a LinkedIn DM
But I guess they thought I was just a consultant with theoretical knowledge, instead of being a developer by origin (and still being a developer today).
Faking the PageSpeed score won't help Core Web Vitals ranking
You'll find the PageSpeed Insights results of the website that they shared with me below.
In case you've got images turned off or are using a screenreader, this is what we see:
- Overall PageSpeed Insights score of 100%;
- First Contentful Paint of 1.0 seconds;
- Speed Index of 1.2 seconds;
- Largest Contentful Paint of 1.7 seconds;
- Time to Interactive of 1.0 seconds;
- Total Blocking Time of 0 milliseconds;
- And a Cumulative Layout Shift of just 0.2%.
All green results. For a mobile result on a Next.js site, quite awesome, right? That's what I thought as well. But based on what's coming next, I think I did well being discreet and anonymizing the URL.
Triggered by such results, I compared the number of resources loaded with and without testing via PageSpeed Insights. This is what I found when the page was being tested via PageSpeed Insights:
- the webpage would not load its fonts.css, so no web fonts either;
- the Google Tag Manager snippet was not embedded;
- moreover, even their own first party JavaScript code was not embedded.
No point in gaming the PageSpeed Insights score
Here is the catch: your pagespeed score doesn't matter, at all! The overall PageSpeed Insights score, and especially the individual metrics, obviously are important to get an idea of the performance hygiene and potential UX impact of your page's frontend architecture.
But if you fake it while trying to make it, your pagespeed score has no value, unless the objective is to trick stakeholders (and why would you do that, risking their disappointment once the real UX metrics get out?).
Real user experience matters
Why, you ask? Because in the end it is about real user experience. I like to think so, but more importantly, Google likes to think so. The ranking update that comes with Core Web Vitals will be based on your real user experience data. So, the following is what I saw as well:
That's right, quite some red bars and orange figures. To be more precise:
- The FCP metric is 2.6 times as bad in real life for the 75th percentile. Only 12% of users are experiencing an FCP of less than 1 second;
- The LCP metric is 3.6 seconds, still more than twice as bad in real life. Only slightly more than half of the visitors (55%) are experiencing an LCP of less than 2.5 seconds;
- The First Input Delay isn't too good either. The lab data suggests there couldn't be much JavaScript going on, which was actually correct, as the JavaScript was omitted. But an FID of 126 milliseconds means that at least 25% of users are still having major issues with the interactivity of this webpage, meaning the website was still busy executing JavaScript when they tried to interact with it.
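To put those field numbers into context, Google publishes fixed thresholds for each Core Web Vital on web.dev: LCP is good up to 2.5 seconds and poor beyond 4 seconds, FID is good up to 100 ms and poor beyond 300 ms, and CLS is good up to 0.1 and poor beyond 0.25. A small helper to classify a measured value against those thresholds:

```javascript
// Core Web Vitals thresholds as published by Google on web.dev.
// value <= good is "good", value > poor is "poor", anything in
// between is "needs improvement".
const THRESHOLDS = {
  LCP: { good: 2500, poor: 4000 }, // milliseconds
  FID: { good: 100, poor: 300 },   // milliseconds
  CLS: { good: 0.1, poor: 0.25 },  // unitless layout shift score
};

function rate(metric, value) {
  const t = THRESHOLDS[metric];
  if (value <= t.good) return "good";
  if (value <= t.poor) return "needs improvement";
  return "poor";
}
```

For example, the 126 ms FID above lands in "needs improvement", and the 3.6 second field LCP does too, no matter how green the lab report looks.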
Conclusion
I think we can keep it very short: you can't fake PageSpeed Insights into a ranking advantage, unless you would do the same for real users while still maintaining website or webshop functionality. But I guess it would then be a best practice instead, as everyone benefits from your little tricks!
Do note that these differences aren't always the result of deliberately gaming the score:
- When using a plugin to measure the LCP, chances are the outcome will differ depending on the device you are using. I pointed this out in a LinkedIn post covering the Web Vitals browser extension;
- Lighthouse uses a different setup than your users do, leading to discrepancies as well.