If you run a website such as a blog, this scenario probably sounds familiar: You get a message in your inbox, usually from a Big Tech company, informing you that your site is slow, and that it is pushing customers away.
Over the past decade or so, Google has turned this mindset into an out-and-out philosophy, one that it has forced upon its users, who rely on Google for a significant portion of their time and traffic.
This is reflected in an article the company published this week suggesting that its Core Web Vitals discipline has saved people a collective 10,000 years of waiting.
“The average page load in Chrome is now 166 ms faster,” the company wrote. “That might seem like a minor improvement, but small changes can accumulate to create a substantial impact on the web.”
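Quick back-of-the-envelope math on that claim, assuming the two figures describe the same savings: 10,000 years is roughly 315 billion seconds, and at 166 milliseconds saved per load, that works out to something on the order of two trillion page loads. Plausible for Chrome worldwide, though Google doesn't show its work.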
This need for speed is not misplaced, at least at a high level. Legendary usability expert Jakob Nielsen noticed nearly 30 years ago that we like our sites fast:
Every Web usability study I have conducted since 1994 has shown the same thing: users beg us to speed up page downloads. In the beginning, my reaction was along the lines of "let's just give them better design and they will be happy to wait for it". I have since become a reformed sinner since even my skull is not thick enough to withstand consistent user pleas year after year.
At that time, though, most users were viewing websites via modems barely fast enough to download a mislabeled song supposedly performed by Phish. When Nielsen reassessed his research at the height of the broadband era in 2010, he noticed that speed remained an important factor:
Today, most people (in some countries) have broadband, so you might think that download times are no longer a usability concern. And yes, actual image download is rarely an issue for today's users (though images can still cause delays on mobile devices).
Still, response times are as relevant as ever. That's because responsiveness is a basic user interface design rule that's dictated by human needs, not by individual technologies. In a client usability study we just completed, for example, users complained that "it's being a little slow."
It was only a matter of time before Google decided to turn page speed insights into PageSpeed Insights, a developer tool for measuring website speed. That work further evolved into what Google calls Core Web Vitals, essentially a vitamin list for creating a good website.
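If you would rather script that check than paste URLs into a web form, PageSpeed Insights also exposes a public HTTP API (v5 as of this writing). A quick sketch, with the tested URL as a placeholder:

```typescript
// One-off query against the public PageSpeed Insights API (v5).
// The target URL is a placeholder; heavy use requires an API key.
const target = 'https://example.com';
const endpoint =
  'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
  `?url=${encodeURIComponent(target)}&category=performance`;

fetch(endpoint)
  .then((res) => res.json())
  .then((data) => {
    // Lighthouse reports the performance score as a 0-to-1 fraction.
    const score = data.lighthouseResult.categories.performance.score;
    console.log(`Performance score for ${target}: ${score * 100}/100`);
  });
```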
If you find weird or unusual topics like this super-fascinating, the best way to tell us is to give us a nod on Ko-Fi. It helps ensure that we can keep this machine moving, support outside writers, and bring on the tools to support our writing. (Also it’s heartening when someone chips in.)
We accept advertising, too! Check out this page to learn more.
The three elements that Google cares about the most—First Input Delay, Largest Contentful Paint, and Cumulative Layout Shift—are metrics the company came up with to define strong web performance. These have shifted slightly over the years (First Input Delay, for example, has since given way to Interaction to Next Paint), but each ultimately touches on the same goal—creating a top-notch experience for users.
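If you want to watch these numbers for yourself, Google ships an open-source `web-vitals` JavaScript library that reports them from real visitors' browsers. A minimal sketch, assuming the library's current API, with a console logger standing in for whatever analytics endpoint you would actually use:

```typescript
// Minimal Core Web Vitals reporting via Google's open-source
// `web-vitals` package (npm install web-vitals). The console logger
// is a stand-in for a real analytics endpoint.
import { onCLS, onINP, onLCP, type Metric } from 'web-vitals';

function report(metric: Metric): void {
  // `value` is milliseconds for LCP and INP; CLS is a unitless score.
  // `rating` buckets it as 'good', 'needs-improvement', or 'poor'.
  console.log(`${metric.name}: ${metric.value.toFixed(2)} (${metric.rating})`);
}

onLCP(report); // Largest Contentful Paint
onINP(report); // Interaction to Next Paint, First Input Delay's successor
onCLS(report); // Cumulative Layout Shift
```

Run that on your own site and you are looking at roughly the same numbers Google grades you on.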
The problem, though, is that Google’s influence over sites’ traffic and financial outlooks means these metrics often pile a lot of extra work onto already overworked development teams—particularly small ones. Google tried to simplify things to some degree through its AMP framework, but publishers rightly complained about that attempt to contour the shape of the internet in Google’s image. So Google chose another route—constant light pressure to make better websites, which some might find motivating but which can also feel a bit infuriating.
Personally, I have spent hours over the years trying to speed things up, and I have determined that my problem is not that Google wants me to speed up my website. It’s that Google wants me to spend lots of money on a new content-management system, pay more for server infrastructure, and move away from the LAMP stack to something they can more readily benefit from financially. (Conspiratorially, it seems like they want to murder PHP-based content management systems.) Because, let’s face it, Google benefits from this transaction, whether directly (by me purchasing cloud storage from them) or indirectly (by me making their code run better).
I mean, think about it. WordPress is a technology that you can host on a shoebox. You do not need anything more than it, or anything less, but we have spent the last 15 years trying to top it, breaking our backs to squeeze one additional dollop of performance out of this thing.
But the true secret here, the thing Google won’t mention, is that all this work to get our websites ready to run a digital marathon is essentially about fixing Google’s own busted code. I have largely moved away from Google-driven services such as Google Fonts and Google AdSense because they are the very things Google complains about most. Put a YouTube embed on your website and, per Google’s own metrics, it’s like you served your users a bowl of refrigerated grease. But if you use a tactic to speed up Google’s crappy embed code, the schema markup Google requires for those videos to benefit your SEO just disappears.
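To make that trade-off concrete: the usual tactic is a "facade," a lightweight thumbnail that only pulls in YouTube's heavy iframe once a reader actually clicks play. A rough sketch of the idea follows; the video ID and sizing are placeholders, and real-world facades such as lite-youtube-embed handle far more edge cases:

```typescript
// A bare-bones YouTube facade: show a static thumbnail, and swap in
// the real (heavy) iframe only when the reader clicks. The video ID
// and dimensions here are placeholders for illustration.
function lazyYouTube(container: HTMLElement, videoId: string): void {
  // YouTube serves static thumbnails from img.youtube.com.
  const thumb = document.createElement('img');
  thumb.src = `https://img.youtube.com/vi/${videoId}/hqdefault.jpg`;
  thumb.alt = 'Play video';
  thumb.style.cursor = 'pointer';
  container.appendChild(thumb);

  thumb.addEventListener('click', () => {
    const iframe = document.createElement('iframe');
    // Autoplay on swap so the reader doesn't have to click twice.
    iframe.src = `https://www.youtube.com/embed/${videoId}?autoplay=1`;
    iframe.width = '560';
    iframe.height = '315';
    iframe.allow = 'autoplay; encrypted-media';
    container.replaceChild(iframe, thumb);
  });
}
```

The catch is exactly the one described above: because no iframe exists in the page's initial HTML, the video markup that crawlers expect to find is gone unless you add the schema back by hand.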
Now, I’m not trying to be a stick in the mud here. I realize innovation in web development is a good thing and benefits everyone. But I also think people who are just self-hosting a website shouldn’t be forced to do any of this stuff simply to keep a presence on modern-day Google. Every second of load time saved can represent an entire workday of stress taken on by someone who just wants to focus on the content.
And the thing is, 30 years into this grand experiment we call the mainstream internet, are we really any better off for all this forced digital dieting we do for the sake of a company that at the end of the day just wants our credit card number?
When it comes down to it, we’re just throwing stuff at the wall to see if it sticks, and a company with a vested interest is giving us a vague target.
I’d like to think of building an effective website as an attempt to put Jell-O in a sack. Admittedly, just dumping Jell-O in a sack on its own is cumbersome—but that’s how it worked when we started building stuff on the internet. We made a goddamn mess trying to dump these goopy rendered animal byproducts everywhere.
Soon enough, you realize that something will inevitably leak out, so you’re stuck having to buy a better sack, preferably one made of canvas. Eventually, you figure out that, hey, maybe you should organize this Jell-O into individual containers so it’s easier to manage. But it takes time to put all that Jell-O in all of those containers.
And then you wonder, well, why can’t we automate the process of putting Jell-O in those containers, so it’s faster to do? That costs more money, but you can always optimize the process. But then you need a better sack because the one you’ve been using was designed to only carry so much Jell-O. And ultimately, you realize, hey, maybe I didn’t need the sack at all. Maybe I was better off just throwing piles of raw Jell-O at random people on the street.
This is the strategy we’ve been using to develop websites over the past 30 years, a process of constant iteration with the goal of just trying to get some pixels on a page while possibly squeezing a few nickels out in the process. We are all trying to carry our sacks of Jell-O across the finish line in the cosmic three-legged race that is online content, and this gigantic company keeps changing the rules as if we don’t have enough stupid stuff on our plates trying to keep this Jell-O from spilling out on the ground everywhere.
Why do we put up with it?
Slow Links
Homer stopped strangling Bart, and it took us four years to notice.
Today in journalists doing it for themselves, I point to Flaming Hydra, the latest cool idea from the Brick House Collective’s Maria Bustillos.
Enshittification is even coming for the small-scale sites that everyone likes. Case in point? Discogs.
--
Find this one an interesting read? Share it with a pal! And see you tomorrow!