Is AI Search the end of click-bait?

I was listening to a podcast the other day about AI search. With the rise of tools like ChatGPT, more and more people are turning away from traditional search engines and toward LLMs that deliver answers directly, without needing to scroll through pages of ads just to get to the real content.

I did exactly this over the weekend while cooking a range of Indian dishes. Normally, I’d Google something like “coconut hoppers” and click the top result - probably BBC Food or Good Food - but that also means wading through ads and filler before getting to the actual ingredients and method. Not this time. I asked ChatGPT to build a menu, generate a single ingredients list to cover everything, and give me the recipes. It wasn’t perfect - I had to separately ask for a couple of missing ones - but overall, it gave me exactly what I needed. A clean [albeit emoji-heavy] screen and just the essentials. I’ll never Google a recipe again.

That’s great for cooking, but what does it mean for search more broadly?

Search Engine Optimisation (SEO) has dominated online publishing for over two decades. It began life as a way to get pages and sites in front of the users looking for that information, but it has unfortunately resulted in the rise of click-bait. You can no longer trust the results page of Google for any particular query, because you know the results listed have titles and descriptions built around enticing clicks rather than providing the actual content you seek.

I’ve lost count of how many times I’ve clicked on a so-called “tech” article hoping for one useful setting or spec, only to find a thousand words of fluff and no actual information. It’s no wonder people are turning away from search engines entirely.

And what of AI? Well, the LLMs we interact with are tasked with providing the right information in the quickest time, so when they are given a search to execute, they do it in a slightly different way. They may initially hit a search engine for a set of ranked results, but rather than just handing back a URL, they interrogate the URL[s], look at the actual content on each site, and then use that to form their response. In my case, the Yellow Moong Daal recipe didn’t come with a link to the site; the actual content was served to me as an ingredients list and a step-by-step method.
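
To make that flow concrete, here’s a minimal sketch of a search-then-read loop in Python. The `search_web` and `call_llm` helpers are invented placeholders, and the text extraction is deliberately crude; this is an illustration of the pattern, not how any particular product is actually built.

```python
import re
from urllib.request import Request, urlopen


def search_web(query: str) -> list[str]:
    """Placeholder for a real search API; would return ranked result URLs."""
    raise NotImplementedError("wire up a real search API here")


def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call."""
    raise NotImplementedError("wire up a real model call here")


def fetch_page_text(url: str) -> str:
    """Fetch a URL and crudely reduce it to visible text (no JavaScript is executed)."""
    req = Request(url, headers={"User-Agent": "example-answer-bot/0.1"})
    html = urlopen(req, timeout=10).read().decode("utf-8", errors="ignore")
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)  # drop scripts and styles
    text = re.sub(r"(?s)<[^>]+>", " ", html)                   # drop remaining tags
    return re.sub(r"\s+", " ", text).strip()


def answer_query(query: str) -> str:
    # 1. Hit a search engine for a set of ranked results (placeholder call).
    urls = search_web(query)[:3]
    # 2. Interrogate each URL: read the actual page content, not just its title and snippet.
    pages = [fetch_page_text(u) for u in urls]
    # 3. Let the model form its answer from that content, rather than handing back links.
    prompt = f"Using only the content below, answer: {query}\n\n" + "\n---\n".join(pages)
    return call_llm(prompt)
```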

So, what happens to SEO when the system ignores titles and descriptions and evaluates content instead? Well, the SEO tactics we’ve relied on - stuffing keywords, optimising headlines, crafting meta copy - matter a lot less. If a page claims to have an answer but doesn’t actually deliver, the AI will skip it. And as these models become more personalised, essentially acting as agents or concierges, the quality of content becomes more important than how it’s dressed up.

This is a super-positive step, right?

Well. Maybe. Maybe not.

For sure it is positive that sites purporting to have the answer to XYZ either need to actually have the answer on their page, or need to stop saying they have it. However, there are limitations. LLMs often cannot process JavaScript-heavy sites or interrogate image-based information, which means that content presenting the answers you are looking for as plain text on the page will tend to win out.
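
Here’s a small, contrived illustration of why that matters: if a page only renders its answer via JavaScript, a fetcher that reads the raw HTML never sees it, while a static page hands it over immediately. Both pages below are invented for the example.

```python
# Two invented pages: one renders its answer with JavaScript, one is plain HTML.
js_heavy_page = """
<html><body>
  <div id="recipe">Loading...</div>
  <script>
    // A browser would fetch and render the ingredients here;
    // a plain HTML fetch only ever sees "Loading...".
    fetch("/api/recipe").then(r => r.json()).then(render);
  </script>
</body></html>
"""

static_page = """
<html><body>
  <h1>Coconut hoppers</h1>
  <p>Ingredients: 400ml coconut milk, 300g rice flour, 2 tsp dried yeast...</p>
</body></html>
"""

# What a simple text-reading fetcher would conclude about each page.
for name, page in [("JS-heavy page", js_heavy_page), ("static page", static_page)]:
    found = "coconut milk" in page.lower()
    print(f"{name}: answer visible without running JavaScript? {found}")
# JS-heavy page: answer visible without running JavaScript? False
# static page: answer visible without running JavaScript? True
```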

There’s also of course the age-old issue of gaming the system. If it exists, someone will try to manipulate it.

And that’s where the next frontier of manipulation may emerge. One example floated is dynamic content where a site responds differently depending on who’s requesting it: browser, search engine, or AI model. Content might be adjusted on the fly to appear more useful to a bot than it actually is to a human. It’s a step into murky ethical waters, but with enough incentive, especially from advertisers, it’s not hard to imagine.
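
As a toy sketch of what that cloaking could look like: one URL, two very different responses, chosen by the requester’s User-Agent. The bot markers echo publicly documented crawler names, but the exact strings and the page content are invented for illustration.

```python
# Toy sketch of User-Agent cloaking: the same URL serves different content
# depending on who asks. Markers and content are illustrative only.

BOT_MARKERS = ("gptbot", "claudebot", "perplexitybot", "googlebot")


def render_page(user_agent: str) -> str:
    """Return answer-dense text to bots and the usual ad-wrapped page to humans."""
    ua = user_agent.lower()
    if any(marker in ua for marker in BOT_MARKERS):
        # Crawlers and AI fetchers get the clean, answer-shaped version.
        return "<p>Coconut hoppers: 300g rice flour, 400ml coconut milk, 2 tsp yeast...</p>"
    # Humans get the scroll-heavy version with the ads that pay for it all.
    return "<div class='ad'></div><p>My grandmother's kitchen always smelled of...</p>"


print(render_page("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # human version
print(render_page("Mozilla/5.0 (compatible; GPTBot/1.0)"))       # bot version
```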

So, while AI might spell the end of click-bait as we know it, it could just be the start of something more covert.
