Elon Musk's X continues to delay links to websites he dislikes

Yesterday's big news was that X was delaying links to its competitors by up to 5 seconds. The issue was claimed to be resolved, but I can confirm that not only was it occurring before, it is still occurring now, with a delay of 2.6 seconds. Let's dive in.

Sites that are (still) slowed down by X

The Washington Post reported on August 15, 2023, at 1:27 p.m. EDT that Elon Musk's X (formerly Twitter) was throttling traffic to websites he dislikes. The following sites were affected:

  • Facebook;
  • Instagram;
  • Bluesky;
  • Substack;
  • Reuters wire service;
  • The New York Times.

According to The Washington Post, all of them have previously been singled out by Musk for ridicule or attack. After the story was published, the delay was reverted, so there is no way to do a post-mortem analysis.

Which means we have to trust the source. But we can shed a bit more light on this.

How Twitter outbound clicks work

My in-depth analysis of Twitter's t.co redirect behavior can be read in an earlier article. But the summary is as follows:

  • bots will get a straight line to the destination (a 301 redirect) when following t.co links;
  • users, unknowingly clicking on t.co links in the interface, will receive an HTML page containing a JavaScript redirect (a small sketch of both cases follows below).
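
To make the difference concrete, here is a minimal sketch in Python using the requests library. The t.co URL is a placeholder (copy one from any tweet), and the user-agent strings are only examples of what t.co might classify as a bot versus a browser:

```python
# Sketch: observe how t.co answers a bot-like versus a browser-like client.
# https://t.co/XXXXXXXXXX is a placeholder; substitute a real t.co link.
import requests

TCO_URL = "https://t.co/XXXXXXXXXX"

USER_AGENTS = {
    "bot": "curl/8.4.0",
    "browser": ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0 Safari/537.36"),
}

for label, ua in USER_AGENTS.items():
    resp = requests.get(TCO_URL, headers={"User-Agent": ua}, allow_redirects=False)
    if resp.is_redirect:
        # Bot-like clients get a plain HTTP 301 pointing straight at the destination.
        print(f"{label}: HTTP {resp.status_code} -> {resp.headers.get('Location')}")
    else:
        # Browser-like clients get an HTML page whose JavaScript performs the redirect.
        print(f"{label}: HTTP {resp.status_code}, {len(resp.text)} bytes of HTML")
```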

Does X, formerly Twitter, delay outbound link clicks?

Well, yes and no. Some characteristics have changed by now. But for the majority of outbound links, the response times were often below 200ms in my tests. That's quite fast.

Check the t.co response time

I could set up all kinds of tests to measure the response time, but the t.co redirect service isn't very secretive about its own response times. You can see that in the real-user versus bot screenshots in my previous t.co article.

I found it surprising that t.co exposes this information itself as well.

And regardless of whether you're a user or a bot, the response times are about the same. Relatively speaking, 181ms for real users is quite a bit higher than 105ms for bots, but that's nowhere near a one-second difference, let alone the 5-second delay The Washington Post is talking about.
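
If you want to measure this yourself rather than rely on the numbers t.co exposes, a rough approximation is to time the t.co request only, without following the redirect. A sketch, again with a placeholder t.co URL:

```python
# Sketch: measure how long t.co itself takes to answer, excluding the destination.
# requests' `elapsed` covers the time from sending the request until the response
# headers have been parsed, which is close enough to a TTFB measurement here.
import requests

def tco_response_time_ms(url: str, user_agent: str) -> float:
    resp = requests.get(url, headers={"User-Agent": user_agent}, allow_redirects=False)
    return resp.elapsed.total_seconds() * 1000

print(f"{tco_response_time_ms('https://t.co/XXXXXXXXXX', 'curl/8.4.0'):.0f} ms")
```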

Rivals of Elon Musk experience a 2.6-second delay, not 200ms

But this is where things start to get very interesting. Elon won't consider me or my website a competitor. But what about Bluesky (https://bsky.app/), Substack (*.substack.com) and Mark Zuckerberg's Threads? The latter is certainly considered a competitor, given that a cage fight between the two was announced.

Throttling competitors is still happening today

So, I ran the test, and all of them show a response time of more than 2.6 seconds. Even today. And for bots as well as real browsers. As I was able to reproduce the same results over and over, I ended up only taking screenshots of a test with threads.net:

2619ms response time as a real browser

2626ms response time as a bot

Mark Zuckerberg's Meta brands Instagram and Facebook are also still affected.
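
For reference, this is roughly how such a comparison can be scripted: a handful of t.co links pointing at rival destinations plus a neutral control link, fetched with both a bot-like and a browser-like user agent. The t.co URLs below are placeholders; collect real ones by copying links from tweets pointing at threads.net, bsky.app, Substack and an unaffected site:

```python
# Sketch: compare t.co response times for "rival" destinations against a control.
# All t.co URLs below are placeholders and need to be replaced with real short links.
import requests

LINKS = {
    "threads.net (rival)": "https://t.co/XXXXXXXXXX",
    "bsky.app (rival)":    "https://t.co/YYYYYYYYYY",
    "neutral control":     "https://t.co/ZZZZZZZZZZ",
}
AGENTS = {
    "bot":     "curl/8.4.0",
    "browser": ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0 Safari/537.36"),
}

for name, url in LINKS.items():
    for label, ua in AGENTS.items():
        resp = requests.get(url, headers={"User-Agent": ua}, allow_redirects=False)
        ms = resp.elapsed.total_seconds() * 1000
        print(f"{name:22s} as {label:7s}: {ms:7.0f} ms (HTTP {resp.status_code})")
```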

So, what about the 5-second delay?

Well, it's not happening anymore. This can be confirmed by testing a t.co link that points to the Washington Post article. For example, when testing https://t.co/4CM1q2O6OS using httpstatus.io, there is no substantial delay anymore.
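
You can repeat that check without httpstatus.io as well. The t.co link below is the one pointing to the Washington Post story; the test only tells you whether a multi-second, server-side delay exists right now, not what happened on August 15:

```python
# Sketch: re-check the Washington Post t.co link for a server-side delay today.
import requests

resp = requests.get("https://t.co/4CM1q2O6OS",
                    headers={"User-Agent": "curl/8.4.0"},
                    allow_redirects=False)
print(f"HTTP {resp.status_code} in {resp.elapsed.total_seconds() * 1000:.0f} ms")
```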

Can confirm: it did happen

But based on several tests by users on news.ycombinator.com, it looks like it did happen before. And looking at the characteristics of what was happening, it doesn't seem like a hiccup at all, but most likely an intentional delay.

Visual proof

And this is the only visual proof (copies here of nytimes.com and here of gov.uk) I could find in those threads. Notice the 4.68s for nyti.ms (The New York Times' link-shortener service) and the 144ms for gov.uk. Additional proof exists in another format as well (a direct link to the log can be found here).

However, the flow was different from the scenario discussed above. In the case of Threads, Bluesky et cetera, the delay is happening for everyone, so for all user agents.

Based on reports from news.ycombinator.com, the 5-second delay only happened when a real browser was used. So it even seems as if Elon Musk applied a different delaying mechanism to these sites than the mechanism that is still active for Threads, Bluesky et cetera.

Why only delay real users?

But why would one use yet another delaying mechanism to differentiate between bots and users, when there is a throttling mechanism in place already?

Once again, only X might know. However, only doing it for real user agents might help the throttling go unnoticed:

  1. this would not be detected during automated (bot) tests, as those would be redirected with a 301 as usual;
  2. and it also would not be noticeable in the Core Web Vitals data of any of the affected sites.

Regarding number 2, and as explained in my article on the redirect technique behind Twitter's t.co links, the TTFB for the destination URL (for example washingtonpost.com/) only starts to be measured at the moment of the JavaScript redirect. This excludes any server-side delay by t.co itself, as t.co's HTML is loaded first. So no TTFB delay is attributed to the destination page.

However, the t.co page would stay white for a while, so it does impact the perceived performance nonetheless.

Debunking other theories

On news.ycombinator.com we can spot some doubts regarding the delay being intentional, as well as other aspects that could be skewing the results. Let's address them:

  • "Any DNS resolver libraries have a 4.5-second timeout? Maybe their infrastructure is just rotting"
    There are too many people on news.ycombinator.com talking about the 5-second delay, making it unlikely that DNS is acting up that often in all those tests by other users;
  • "Is there some cache going on?"
    Yes, actually. Looking at the response headers, you'll see that browsers are allowed to cache the page for a short duration (spot the max-age in the response headers; see the sketch after this list). So:
    • whenever you re-click the same link, the t.co page will be fetched from your browser cache.
    • But when testing as a bot, there usually is no browser cache, so each new test forces a fresh download of the t.co page from the t.co server. This means such a test would always spot a (server-side) delay if it's there.
  • "No, because it’s not an HTTP redirect. It’s an HTML page that redirects you using a meta tag, something that the browser doesn’t cache."
    The above was the answer to someone asking whether the redirect could be cached. Browsers typically do cache server-side HTTP 301 redirects, but not meta-tag or JavaScript redirects.
    However, browsers are perfectly capable of caching the HTML file itself. Besides the answer to the previous caching question: my own website, which you are visiting right now, also sends headers that allow the browser to cache HTML pages for a short amount of time. And if you open the Network panel in your dev tools, you will see that the second time you visit the same t.co link, the browser serves it from the browser cache.
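
If you want to verify the caching claims yourself, here is a small sketch: fetch a t.co link with a browser-like user agent and look at the Cache-Control header. The URL is a placeholder, and the exact max-age X serves is something you'll have to observe yourself rather than anything documented:

```python
# Sketch: inspect the caching headers on the t.co HTML page served to browsers.
# https://t.co/XXXXXXXXXX is a placeholder; substitute a real t.co link.
import requests

resp = requests.get(
    "https://t.co/XXXXXXXXXX",
    headers={"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/116.0"},
    allow_redirects=False,
)
print("Status:       ", resp.status_code)
print("Content-Type: ", resp.headers.get("Content-Type"))
print("Cache-Control:", resp.headers.get("Cache-Control"))
# A max-age in Cache-Control means the browser may reuse this HTML page (and the
# redirect inside it) for that many seconds without contacting t.co again.
```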