Snip Snip
A new lawsuit by a major publishing conglomerate takes aim at Google’s AI summaries—and hints at the many ways that Google undermines its own mission by forcing unwanted features on its users.
I don’t know if you’ve heard, but Google has not had a particularly amazing past month or so. While it avoided the worst consequences of the potential breakup it faced, the company is not out of the woods. There will always be other companies and suitors pushing against it.
(On the other hand, it is also now a $3 trillion company, so let’s just say it’s good and bad.)
And one of those suitors emerged over the weekend. Penske Media, the publisher of The Hollywood Reporter, Variety, Deadline, and Rolling Stone, took a big swing at the company’s AI-generated summaries, which are designed to dominate search results. I went on PACER and downloaded the complaint. A sample passage kind of nails down Penske’s main complaint with Google’s snippet endeavor:
Google sources the content it uses to populate its publishing elements from the data that it crawls for its search index. In other words, Google repurposes the Search Index Data digital publishers provide as republishing content. As discussed below, Google has now doubled down on this unauthorized conduct not only for Featured Snippets, Top Stories, and People Also Ask but also for its GAI products, including AI Overviews and AI Mode. Google’s unauthorized republication of digital publishers’ content for its GAI products is uniquely harmful for publishers at least because the GAI products more comprehensively republish content—rather than providing merely a preview or, in its own words, a “snippet”—thus obviating the need for users to click on the publishers’ website links at all.
To translate: Publishers agreed to let Google do one thing, but Google then went and did something else that ultimately harmed them. I know, huge shock, right? But Penske, an increasingly powerful company in the media sphere, is a relatively formidable foe, with enough exposure to Google’s whims that it may have a real impact.
From a publisher’s standpoint, Google is an essential part of how many large outlets operate. So when Google adds features that discourage people from clicking into articles, the reader loses broader context and the website loses revenue.
The thing is, the publisher was never given a choice in this equation—other than a “nosnippet” tag they can add to their website, which comes with a loss of traffic—and neither was the user. When I worked on &udm=14 a year ago, this loss of choice for the user was a big reason why I did it.
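For the curious, that opt-out is about as blunt as it sounds. Here’s a rough sketch of the relevant markup, based on Google’s documented robots directives (the 50-character cap below is just an illustrative number, not a recommendation):

```html
<!-- The sitewide hammer: tell Google not to show any text snippet or video
     preview for this page. Fewer snippets generally means fewer clicks. -->
<meta name="robots" content="nosnippet">

<!-- A softer option: cap how long the snippet can be instead of removing it. -->
<meta name="robots" content="max-snippet:50">

<!-- The surgical option: mark off specific passages that shouldn't be excerpted. -->
<span data-nosnippet>This sentence won't show up in search results.</span>
```

On the user side, the closest equivalent is appending &udm=14 to a Google search URL, which switches to the links-only “Web” view that the project above leans on.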
But to hear it from Google, as its vice president of government affairs and public policy Markham Erickson put it yesterday, is to hear a company that thinks it knows what you want better than you do:
So, I don’t want to speak about the specifics of the lawsuit, but I can speak to our philosophy here, which is, look, we want a healthy ecosystem. The 10 blue links serve the ecosystem very well, and it was a simple value proposition. We provided links that directed users free of charge to billions of publications around the world. We’re not going to abandon that model. We think that there’s use for that model. It’s still an important part of the ecosystem.
But user preferences, and what users want, is also changing. So, instead of factual answers and 10 blue links, they’re increasingly wanting contextual answers and summaries. We want to be able to provide that, too, while at the same time, driving people back to content, valuable content, on the Internet. Where that valuable content is for users, is shifting. And so it’s a dynamic space. Ultimately, our goal is to ensure that we have an overall healthy ecosystem.
![Google search window](/uploads/Google_SearchWindow.jpg)
Google: We Know What You Want
This response speaks to a broader track record by Google and its parent company, Alphabet, of introducing changes that gradually tighten the screws on the user experience, then dismissing the people who disagree. (A recent example that comes to mind: Google’s decision to add a Gemini button to its Gmail client, putting it in a spot previously reserved for switching accounts, all but encouraging users to press it by accident.)
This is not the first time Google has done something like this. One might argue that bundling is kind of its play. In 2011, Google launched Google+, its most aggressive attempt at a social network. At the time, social media was seen as a major competitor to the company’s search dominance. So the solution was to shove Google+ everywhere and into everything. But the company also put odd restrictions on how people could use these tools, attempting to force users into a specific shape: everyone had to use their real name and real identity.
It took four freaking years for Google to admit that its aggressive seeding of Google+ was destroying the core Google and YouTube experiences.
The company knows that if it wants to shape the internet in its image, it just needs to offer the right forced incentives. People and companies eventually acquiesce. Just think of all the sites that set up Google AMP versions not because they wanted to, but because they felt they had to.
Usually, publishers feel like they get something from this arrangement. But AI overviews offer them little in return. And for users, they flatten the process of research, an important skill for people to have in a healthy society.
Sponsored By … You?
If you find weird or unusual topics like this super-fascinating, the best way to tell us is to give us a nod on Ko-Fi. It helps ensure that we can keep this machine moving, support outside writers, and bring on the tools to support our writing. (Also it’s heartening when someone chips in.)
We accept advertising, too! Check out this page to learn more.
The alt-country skit that a Google AI summary could obliterate in one search.
Why AI Summaries Betray Google’s Purpose
Ryan Adams is understandably a pariah these days (see our NYT linking policy). But in terms of framing what AI overviews are supposed to do, I always go back to the opening track on his best record, Heartbreaker, “Argument with David Rawlings Concerning Morrissey.” It is the purest example of the problem Google is trying to solve with AI search, and I mentioned this example back in my 2018 piece on Google. In the 30-second skit, the singer and his backing guitarist debate which album the song “Suedehead” is on.
It is a fun discussion, and when I covered it in 2018, the question would have been answered in two or three minutes with a little Googling and a quick detour to Wikipedia or a news article. But now, Google just answers the whole thing with an AI overview:
![Google’s AI Overview answering the “Suedehead” question](/uploads/Screenshot-From-2025-09-16-19-26-05.png)
Google thinks this is useful because it saves you time, but it honestly feels like it flattens our sense of discovery for the sake of hitting a number. Where’s the fun in that?
Let’s put this in Penske Media terms: You could have landed on this profile of Morrissey’s Viva Hate that ran in Billboard a few years back, or perhaps this Rolling Stone listicle that mentions some of Morrissey’s best moments. And you might have had a laugh at this Variety piece that discusses how mad Morrissey is at Johnny Marr for torpedoing a Smiths reunion.
But no. You stopped as soon as you hit Google, and saw that “Suedehead” is on Bona Drag and Viva Hate. Google answered your question, but it killed your curiosity.
Does Google understand why people look up information? Given that this is the discussion we’re having about the company 27 years after its founding, it sure doesn’t feel like it!
The lawsuit Penske filed against Google is essential because it points at this disparity. Google sold us on quality searches. It’s now offering something else, and that something else is not what its users want.
Reward curiosity, Google. Don’t snip it away.
Google-Free Links
Famed journalist Steven Levy, definitely anything but a tech luddite, embraces the idea of getting a settlement from Anthropic. But in his piece for Wired, he points at a deeper problem: “Of course, even if authors do get four- or even five-figure sums for the use of their books to train AI, that would not address the most serious problem of all—the fact that people aren’t reading.”
I wondered about Local H today. I hadn’t thought about that band in quite a while, and I’m realizing now that this was a mistake, due to a failure of the monoculture. Anyway, here’s “All The Kids Are Right,” the best song of theirs that isn’t “Bound For The Floor.”
Look what you did, matcha fans: You made it too popular and spiked demand!
--
Find this one an interesting read? Share it with a pal! (And if you have another example similar to the Ryan Adams thing that doesn’t involve Ryan Adams, let me know. Would love to have it.)
Top image via DepositPhotos.com