
The Sky Is Falling, The Web Is Dead

In looking for examples of people calling the Web dead, I learned that, apparently, you can say the Web is dying for 30 years and get away with it.

By Ernie Smith, October 25, 2025
#the web is dead #web history #web predictions #bad predictions #forrester #george colony

Jeffrey Zeldman, one of the OGs of web design, recently decided to weigh in on a debate that’s been picking up lately: With AI on the rise, is the Web dead? After all, that new OpenAI browser seems to be built for another era of internet entirely.

Zeldman’s critique is simple, and one that I can definitely appreciate: People have been declaring the Web dead as long as it’s been alive (and the comments have been hilariously wrong). I’d like to take a moment to consider one specific naysayer: George Colony.

Colony’s name may not ring a bell if you’re not in technology spaces, but he is the founder of Forrester Research, one of the largest tech and business advisory firms in the world. If you’re a journalist with a story and need an analyst, you’ve probably talked to someone from Forrester. I’ve talked to Forrester quite a few times—their analysis is generally quite sound.

But there’s one area where the company—particularly Colony—gets it wrong. And it has to do with the World Wide Web, which Colony declared “dead” or dying on numerous occasions over a 30-year period. In each case, Colony was trying to make a bigger point about where online technology was going, without giving the Web enough credit for actually being able to get there.

Perhaps the first time someone talked about the Web being dead in print, circa 1995. (Berkshire Eagle/Christian Science Monitor/Newspapers.com)

The ’90s: The Web is dead because it’s not interactive enough

Colony’s first anti-Web rumblings came around 1995, when his commentary was referenced in a Christian Science Monitor article:

Another critic, technology analyst George Colony, has focused on the graphical part of the Internet, the World Wide Web. He pooh-poohs the system, saying “The Web is Dead,” because it is not very interactive.

His quotes appear alongside those of two related haters: Clifford Stoll, author of Silicon Snake Oil, and Paul Saffo, then the director of the Institute for the Future. Each had deeper underlying points: Stoll criticized the Web’s social failings; Saffo suggested it was too one-directional. Colony, meanwhile, just felt the Web was too static. Computers could do more.

Colony was one of the few beating this drum. While Saffo and Stoll kind of faded into history a little (though Stoll’s commentary has modern-day defenders), Colony kept it up.

“In the future, we will all be Microsoft,” he says. Yikes. (NetworkWorld/Google Books)

In 1997, the Forrester founder was quoted in Network World, suggesting that the Web was just not the horse to get us to the next stage of digital nirvana. Instead, he implied that Java would take over online. (See, if he had said JavaScript, he would have been right. But plain-jane Java? Not so much. As far as consumer-facing experiences go, Flash ate its lunch.)

In another piece during the same period, he made clear that the problem he saw was that the Web just didn’t go far enough. “We will not reach an Internet economy of this size with today’s technology,” he said at a company-sponsored event, according to The National Post.

Can you see the underlying fault with his commentary? He basically assumed that Web technology would never improve and would be replaced with something else—when what actually happened is that the Web eventually integrated everything he wanted, plus more.

Which is funny, because Forrester’s main rival, International Data Corp., essentially said this right in the piece. “The web is the dirt road, the basic structure,” IDC analyst Michael Sullivan-Trainor said. “The concept that you can kill the Web and start from square one is ridiculous. We are talking about using the Web, evolving it.”


Not the Web, but Web services. Two different things.

2000s: Coming soon, the XInternet

In the 2000s, George Colony kept up the “Web is dead” talk to emphasize that the Web would be backburnered, but the internet would not.

Colony’s commentary proved endlessly mockable within the IT field, as highlighted by a 2001 roundup of reader criticism that The Register received after writing about it. As one reader put it:

I think your article failed to take into account the larger equation, that nothing is static and that everything is evolving, including technology. Everything described here from both you and your sources has been known since before the birth of the web, before the birth of easily accessible email, and from right around the time of the creation of the Arpanet.

Despite getting his ass handed to him by The Register’s readers, Colony kept it up. He was still on his “websites are static” bit, even though he now called the concept he was trying to sell “Web services,” as a 2003 InfoWorld piece put it:

To cut confusion about Web services, Colony provided a definition: “Web services are not the Web and not services, but Internet middleware enabling you to link to customers, partners and operating groups.”

The XInternet is an Internet that does not send back “dead pages” when a user makes a request, but sends an executable that allows a user to interact with a Web site. For example, when a user looks for information on how to implement a new human resources procedure, what comes back are implementation and training tools.

Futurists tend to invent their own terms, because sometimes they can prove to be prescient. Gartner’s “Hype Cycle” is an excellent example of this. But Colony’s attempt to rebrand the Web was just not sticking.

But during this period, there was a company that was getting this naming-things strategy right, over and over: the publishing firm O’Reilly Media. Among other things, it came up with the LAMP stack, the Maker movement, and Web 2.0. The XInternet, meanwhile (presumably a nod to XML, which was being infused into HTML during this period), went nowhere.

(Such is the challenge of a futurist. At one point during the 2000s, in a piece where he also shared his “Web is Dead” prediction, he claimed that Oracle would become a commodity player, which turned out to very much not be the case.)

Colony eventually embraced Web 2.0 himself, writing in 2007:

Web 2.0 has forever changed the relationship between your company and your customer. Who best to understand and forge the new relationship? Marketing. Who best to create the technology to get it done? Your business technology/IT group. So, there is only one path: Marketing and technology in your company must work together to design and implement your Web 2.0 strategy. And you, and only you, can get the dogs and cats to interbreed. It’s an unnatural act, but one that must occur before your company can become an opportunist, rather than a victim, in the world of Web 2.0.

It’s weird how the “Web is Dead” guy would be expressing such strong perspectives on how great the Web was. But there would be another shift, and he’d be atop the wave.

Dude is literally speaking at a conference called Le Web suggesting the Web is gonna die.

2010s: It’s gonna be an app ecosystem!

George Colony’s point of view on the Web being dead found a bit of support in the pages of Wired in the 2010s, with its piece “The Web is Dead. Long Live the Internet,” co-written by periodic Tedium subject Michael Wolff.

(The story, frustratingly, did not mention Colony or Forrester’s earlier predictions.)

That story drew a bunch of conversation, and Colony got his own share of chatter after bringing the subject up in tweaked form in 2011, during the LeWeb conference. It’s a bold choice to declare the Web dead at a conference literally called “The Web” in French, but he rolled with it, stating:

The periphery of this network is becoming ever more intelligent. So what this tells us is that several of the old architectures are now dead.

The first model is, of course, the PC model which said put all of your executables on the desktop. But the problem with that model is it doesn’t leverage the cloud so that is a dead model. The second model says, “Oh yeah, put everything in the Web, put everything in the cloud.” And the problem with that direction is that you have to run it through that network, which is improving, but not at the same rate as processor and storage. But it’s not taking advantage of this extraordinary growth and power at the periphery. So we think the web, which as you know, 95% of web executable is at the server, not at your powerful PC, and cloud of course is in the central data center. So we think that that is also an outmoded model.

So what emerges? We see a model emerging that we call app internet. It says that we’ll have very powerful services in the cloud, data services, etc., connected to and interpolating with very powerful applications on these local devices, by the way. When I say local device, I don’t just mean an iPad or mobile. It also means PCs. It also ultimately will mean servers, with a connection between the two, a transparent interpolation between the application and those services.

And what we have today on Android is a very simple version of this.

This turned out to be a bad prediction. Apps and the Web both continued to grow, and the cloud was at the center of it. It turns out that the network kept getting faster, and it’s easier to put more powerful servers into the cloud than it is to make your phone or laptop faster. Chips, particularly on the Intel side, also started to stagnate during the 2010s.

I’m sure he talked about the Death of the Web after this, but it really calmed down for a bit, until …

George Colony, speaking in 2023 about generative AI and the death of the Web.

2020s: AI’s gunning for the Web

So the recent chatter, in the wake of OpenAI’s new browser Atlas (terrible name: the browser does not work like an atlas at all, even metaphorically), is that AI will eat the Web.

And way back in 2023, Colony was at the ready with his go-to commentary on the matter, in a presentation covered by InformationWeek. Colony’s point is essentially that the Web is disorganized, and ChatGPT is going to organize it.

“It’s all we had. But think about it. (The web) is very, very poorly organized. The web is really a big mess,” he said.

This is very similar to the commentary we’re seeing now, just slightly ahead of most critics. But given that he’s been saying this since literally 1995, barring a short period when Web 2.0 and later the cloud proved him wrong, it loses its bite.

You can think the Web is dying or dead all you want. Many folks have said it over the years. But maybe don’t keep saying it periodically over a 30-year period, in constantly changing contexts, or you’re going to sound like the Chicken Little of the Web.

I, for one, think the Web will do what it always does: Democratize knowledge.

Lively Web Links

Having a 30% profit margin in the game console industry is unheard of, but apparently this is not stopping Microsoft.

John Oliver’s sheer demented passion about the Air Bud series is a sight to behold. This is actually his second video on the topic. It’s suitably bizarre. If HBO gets bought by the Ellison family, he should become an Air Bud YouTuber.

If you have an extra hour and want to hear some amazing emulation history, check out my pal Zophar’s interview with the creator of ZSNES, an iconic Super NES emulator. (Apparently it’s “zed,” not “zee.”)

--

Find this one an interesting read? Share it with a pal! Back at it with some Halloween stuff next week.

(image via DepositPhotos.com)

Your time was wasted by Ernie Smith. Ernie Smith is the editor of Tedium, and an active internet snarker. Between his many internet side projects, he finds time to hang out with his wife Cat, who's funnier than he is.