Instant noodle soup may not be the best example, but would you trust shellfish that had been in boiling water for just 100 milliseconds? Well, that's where I had to change things to give users a trustworthy feeling.
I had built a simple (but, back then, a bit slow) Core Web Vitals check into my homebrewed chat widget (to not impact pagespeed), but felt like adding more information, such as:
- Direct link to your domain's historic results as tracked by Google;
- Clear indication of individual Core Web Vitals metrics;
- Current overall Core Web Vitals assessment status;
- Return on investment calculator.
Speeding up API requests
But I also wanted to speed things up, as a check took around 30 seconds. For this, I started using Google's Chrome UX Report API instead of their PageSpeed Insights API. I didn't need lab data results after all, only the latest real user experience results.
Response times for API requests basically dropped from around 10 to 30 seconds to maybe 100 milliseconds. And as I also cached responses for 48 hours to reduce the number of API requests, overall response times were quite optimal.
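To give an idea of what such a lookup could look like, here's a minimal PHP sketch. The getCruxData() helper and its file-based cache are illustrative assumptions rather than my exact production code; the endpoint is Google's public Chrome UX Report API.

```php
<?php
// Illustrative sketch: fetch an origin's field data from the Chrome UX
// Report API, with a simple 48-hour file cache to limit API requests.
function getCruxData(string $origin, string $apiKey): array
{
    $cacheFile = sys_get_temp_dir() . '/crux_' . md5($origin) . '.json';

    // Reuse the cached response when it's younger than 48 hours.
    if (is_file($cacheFile) && time() - filemtime($cacheFile) < 48 * 3600) {
        return json_decode(file_get_contents($cacheFile), true);
    }

    $ch = curl_init(
        'https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=' . $apiKey
    );
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode(['origin' => $origin]),
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $response = curl_exec($ch);
    curl_close($ch);

    file_put_contents($cacheFile, $response);
    return json_decode($response, true);
}
```

The record.metrics part of the JSON response then holds the field data per metric, such as largest_contentful_paint and cumulative_layout_shift.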
User feedback: fast response times feel suspicious
That's when feedback started to come in. While speeding up websites and webshops, or writing about it, is what I do on a daily basis, the result of this particular optimization kind of backfired.
Here is how the Core Web Vitals and ROI calculator works:
- You fill in your domain name;
- Results pop in quite fast, after an API request is made to fetch your domain's data from Google's API.
This looks as follows:
But what if you only saw the loading indicator from the first image for a period of 100 to maybe 300 milliseconds? When results are returned right away, it might feel suspicious. I wasn't aware of it at first, but that turned out to be exactly what users experienced when submitting their domain names. So I knew I had to change this.
The need for delayed requests to improve user experience
Normally, we should strive for great pagespeed and performance results. Not only when using PageSpeed Insights or GTmetrix, but also in terms of real user experience, which has become part of Google's ranking via Core Web Vitals.
But even in real life, the need for speed depends on the context when it comes to UX. Basically:
- users expect to have a good experience when navigating through a website or webshop;
- they should not get annoyed by jumping elements (CLS) or unresponsiveness (FID; a response within 100ms is considered instant), which might hurt their patience and trust.
New to Core Web Vitals? Read about the Core Web Vitals metrics and their FAQ, with questions gathered from training sessions and consults.
However, in the case of some transactional requests, you might want to build in a delay (unless the request is slow by itself) to prevent suspicion. In the case of my calculator:
- Someone got the feeling that I was actually making up the numbers myself.
- They even thought it was just me who told them whether they were actually passing the Core Web Vitals assessment, despite the fact that I was using public Google data.
Delaying the script request
So I had to address how my script handled fetching API results or returning a cached response. Moreover, I wanted the waiting period to always be roughly the same, in my case 1.5 seconds. This is how I went about the problem:
- In case of a valid, non-outdated cached response, I could just use PHP's usleep function and pause for 1.5 seconds;
- But when an API request has to be made, I can't know in advance how much time it will take;
- So I now track the amount of time the API request took;
- And if it doesn't exceed 1.5 seconds, I subtract the elapsed time from 1.5 seconds and do a customized usleep.
In PHP, this could look roughly as follows (a minimal sketch reusing the illustrative getCruxData() helper from earlier, rather than my exact production code):
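```php
<?php
// Minimal sketch: normalize the perceived waiting time to ~1.5 seconds,
// no matter whether the response comes from cache or a live API request.
$origin = 'https://example.com';          // the submitted domain (illustrative)
$apiKey = getenv('CRUX_API_KEY') ?: '';   // your CrUX API key
$target = 1500000;                        // 1.5 seconds, in microseconds (usleep's unit)

$start   = microtime(true);
$result  = getCruxData($origin, $apiKey); // cached, or a live API request
$elapsed = (int) round((microtime(true) - $start) * 1000000);

// Only sleep for the remaining time; if the request itself already took
// longer than 1.5 seconds, no extra delay is added.
if ($elapsed < $target) {
    usleep($target - $elapsed);
}

echo json_encode($result);
```

In the cached case, the elapsed time is near zero, so this effectively becomes the plain 1.5 second pause; when a live API request is slow by itself, no artificial delay is added on top.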
Happy tweaking! Got questions? Let me know!