PureSEM Blog

Is it Time to Abandon the Google Playbook?

Written by Keith Holloway | Sep 3, 2025 2:46:55 AM

"Should we stop focusing on SEO and paid search because of the rise of AI engines?"

The CEO of a mid-size tech company recently asked me whether it's time to stop running their Google inbound playbook.

This is a question on the minds of almost every leadership team that relies on inbound, and one we’re getting asked increasingly often.

The short answer is no. Search isn’t going anywhere anytime soon.

A strong search presence is crucial for AI/LLM visibility.

The data shows it remains the dominant way buyers discover and evaluate solutions, even as AI tools surge in popularity.

Here's what we're seeing:

  • AI tool usage is growing faster than any technology in history, doubling year over year

  • Visits from AI tools, such as ChatGPT, have increased by over 440% across all our clients in the last 12 months

  • Conversions are rising even faster. According to our own data, conversions have increased 792% year over year

  • Google's AI Overviews have more than doubled in 2025, and zero-click searches are rising along with them

  • Older attribution methods are breaking down, and search influence will be increasingly hidden in direct and referral traffic

A sampled aggregation of PureSEM's B2B client data shows AI-tool traffic up 440% and conversions up 792% year over year. Still, organic search remains 86 times larger in traffic and 55 times larger in conversions, and overall conversions haven't dropped.

While it may look as though AI tools have completely upended inbound lead generation, here are some grounding facts:

  • Search activity is actually increasing and dominates non-Google AI tool usage by at least 20:1

  • According to our own data across purely B2B clients, organic search still drives 86 times the traffic and 55 times the conversions that LLMs do

  • More AI Overviews are increasing page one visibility, and conversions are not decreasing

  • The hype could make you believe that Google is all but done for, but even if the current growth rate doesn't slow, it will take at least five years for AI tools to overtake search usage

As the picture becomes clearer, we see that the question isn't whether AI is replacing Google but how Google is changing because of AI.

AI Overviews and AI Agents are using the Google and Bing search indexes to find and display information, so continuing to improve your SEO game is more relevant than ever.

Of course, we must keep our eyes wide open and change quickly, but for now, the data shows that we need to continue relentlessly improving our current practices.

It makes sense to question whether Google is still the dominant place the world looks for answers. 

There will always be new tools, channels, websites, and apps for this; clearly, we're in the midst of a massive technological and cultural shift. Where people go to identify and research problems is evolving, and has been for years. 

YouTube became the second-largest search engine, Reddit became the second-largest website, and TikTok became a product discovery platform. The shift to AI tools is the latest evolution. 

Is it disruptive? Hell, yes.

Has all the traffic moved to ChatGPT or Perplexity? Not even close.

The key is understanding where your specific audience actually looks for answers. 

As leaders, we need to verify this with our own data, not assumptions or anecdotal evidence.

The questions we want to ask are:

  1. Where are our customers actually looking?

  2. What are we seeing in our referral data—and is it telling the whole story?

  3. What does Google's transformation mean for our business?

  4. How do we remain visible and measure results in this new landscape?

The answers will be different for every industry and company, depending on who your audience actually is.

Where customers are actually looking

Despite all these changes, this fundamental truth remains: 

The best way to attract new customers is to show up when your potential customer identifies a problem you can solve.

The headlines would have us believe that everyone's abandoning Google for ChatGPT. Our own usage, or that of our colleagues, may even make us feel that's true. But if, like me, you work in marketing, technology, or software engineering, you are much more likely than the average buyer to use ChatGPT regularly.

When you look at actual user behaviour data rather than hype cycles, a different picture emerges—one that's more complex and more actionable for your business.

The data shows a 20:1 gap in favour of search

Monthly AI tool page visits have doubled in the last year, but they're still about 1/20th of search page visits
Source: Datos State of Search Q2 2025 report 1

According to Datos' Q2 2025 State of Search report:

  • Traditional search still represents 10-12% of all webpage visits, while AI tools account for just 0.6% of total webpage visits

  • This translates to roughly a 20:1 ratio in favour of traditional search for overall usage

  • AI tool usage has more than doubled in the US over the past year (growing from 0.24% of panel activity in April 2024 to 0.64% by April 2025)

  • Even at ChatGPT's impressive growth rate, it will take 5+ years to approach Google's usage levels, assuming nothing radical changes overnight (though it could, which we'll address).
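
As a rough back-of-envelope check on that last point, here's a sketch in Python using the Datos figures above. The growth rates are simplifying assumptions, not forecasts:

    import math

    ai_share = 0.64      # % of panel page visits via AI tools (Datos, April 2025)
    search_share = 11.0  # % via traditional search (midpoint of the 10-12% range)

    # Crossover happens when ai_share * 2^t = search_share * growth^t
    for search_growth in (1.0, 1.1):
        years = math.log(search_share / ai_share) / math.log(2.0 / search_growth)
        print(f"search growing {search_growth - 1:.0%}/yr -> crossover in ~{years:.1f} years")

Even with AI visits doubling every year and search barely growing, the crossover sits four to five years out, and any slowdown in the doubling pushes it further.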

Most AI Usage Isn't Search-Related

Over 70% of generative AI usage is for creation, not information seeking. People use ChatGPT primarily for:

  • Writing and editing content

  • Generating code and debugging

  • Creating images and designs

  • Brainstorming and ideation

ChatGPT is now processing 2.5 billion daily prompts, but only a fraction have search-like intent.

When it comes to actual search behaviour, like finding and validating information, researching products, and discovering businesses, Google's dominance is even more pronounced than the 20:1 ratio suggests.

The data reveals a paradox: AI tools are growing rapidly in adoption, but traditional search still drives the overwhelming majority of website traffic.

This is partly because AI tools tend not to send traffic out, but in larger part because the intent of usage is different.

"The old social contract where Google sends us traffic in exchange for our content is evolving, and not in our favour."

The real disruption is how search is changing

The real concern for companies focused on the Google Playbook of SEO and Google Ads isn't that customers are moving to ChatGPT; it's that search within Google itself has been disrupted.

Google wouldn't be doing this if it weren't forced to.

Google could have released AI overviews or a chat experience years ago, but it didn’t because it would have hurt its business model.

When someone searches for "the top three alternatives for X," an AI answer gives them three clear answers immediately without clicking through multiple sites, sorting through SEO-optimized fluff, or navigating ads.

Regardless of whether you like, dislike, or even trust AI-generated answers, it's hard to argue that they don’t provide a fundamentally better user experience for most informational queries. Google knows it. 

Google is in the business of serving ads, yet it’s putting AI Overviews at the top of the page, pushing ads further down, and directly reducing its profit per search. 

So why cannibalize its own model?

Google has to protect its market share against competitors with fundamentally different incentives.

The new LLMs don’t have multi-billion-dollar ad businesses to protect, and they don’t need to make a profit, at least not yet. They’ve come up with a better UX than search, and if Google doesn’t respond, it will lose the privilege of showing ads.

What Google needs to do more than anything is to remain the trustworthy, go-to portal for information, even if that means fundamentally changing how it operates.

What Google doesn't have to do is send traffic to websites.

And this is what we as marketers have to respond to. The social contract where Google sends us traffic in exchange for content has been broken. They don’t have to send us traffic. 

Beyond market share, the only thing they have to protect is the clicking of ads.

The continual shrinking of page one

Here's what's actually happening to our search visibility:

  • Traditional organic results are getting pushed down by AI Overviews

  • AI-generated answers have replaced featured snippets

  • The real estate that generates clicks is shrinking even as impressions are growing

Interestingly, all of the new trends in search have been in place for years; they are just accelerating quickly:

Increasing zero-click searches 

The zero-click search phenomenon isn't new, but AI Overviews are accelerating it dramatically. Users get their answers without ever visiting your website.

You could say it started with featured snippets, but it’s been a much longer trend of disintermediation. 

Buyers have long sought out and served up their own information before they talked to our salespeople. Some of that was on our websites, some on third-party websites. Now, it’s transitioning to AI summaries of information on whatever new surface they may appear on.

With the decline of organic traffic on many websites, we’re hearing more and more reports that the traffic is “better quality”. One theory goes that people do more of their research off your site, and when they arrive, they have more purchase intent.

Anecdotally, I know of one B2B SaaS website that has seen its organic traffic drop 75%, but the number of qualified conversions (sales-qualified leads) has remained the same month over month. 

So it’s not necessarily true that the traffic is better; more likely, the 25% that remains was always driving 100% of the results. Most of the lost top-of-funnel traffic isn’t visiting anymore because it got its answers in the AI Overviews, but those people may still have been exposed to your brand in those answers.

Increased surfacing of content

Years ago, Google launched “people also ask,” which suggested more related queries. 

This has now gone to the extreme with the automatic generation of dozens of related searches and the compilation of all of the gathered information into an AI-generated summary.

With AI Summaries and the latest AI Mode, Google is doing what is called “Query Fan Out”: many related queries are performed automatically to surface the best related content, answering the user's query from every angle.
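
As a conceptual illustration only (not Google's actual implementation), a fan-out can be sketched as expanding one query into facet sub-queries and tallying which sources surface across them. The facets and mock results below are hypothetical:

    FACETS = ["pricing", "alternatives", "reviews"]

    def fan_out(seed: str) -> list[str]:
        """Expand one seed query into facet-specific sub-queries."""
        return [seed] + [f"{seed} {facet}" for facet in FACETS]

    # Stand-in for real search calls: top result domains per query (made-up data).
    MOCK_RESULTS = {
        "project tool": ["vendor-a.com", "review-site.com"],
        "project tool pricing": ["vendor-a.com", "vendor-b.com"],
        "project tool alternatives": ["review-site.com", "vendor-a.com"],
        "project tool reviews": ["review-site.com", "vendor-c.com"],
    }

    counts: dict[str, int] = {}
    for query in fan_out("project tool"):
        for domain in MOCK_RESULTS.get(query, []):
            counts[domain] = counts.get(domain, 0) + 1

    # Domains that appear across many facets are the likely citation candidates.
    print(sorted(counts.items(), key=lambda kv: -kv[1]))

The marketing implication: content that covers many facets of a topic gets surfaced by many of the fanned-out queries, not just the one keyword you optimized for.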

And this can also be seen as the continuation of a trend. 

In the earliest days of search, people searched for one or two words before they realized how specific they could be. The average length of search queries has been inching up year after year to seven, eight, and nine words.

Now, with Chat interfaces, people are searching more in full sentences and paragraphs. Google is even doing that for users, not only suggesting but also performing all the related searches.

So how do we increase visibility and measure results now?

I’ve always liked simplifying search complexity by looking through a framework of keywords, content, links, and code. 

We can use this framework to consider how these forces affect each area of search:

Think concepts over keywords

It’s long been true that 90% of Google searches have never been seen before (or at least not in the last 90 days), and that's because of the consistently increasing length of search queries. That was true before everyone started getting comfortable looking for information through a chat interface.

It’s time to stop chasing and tracking the highest-search-volume keywords individually and start measuring the total search volume and intent of groups of keywords. 

With today’s technology, we can easily cluster all of our related keywords by semantic intent. We could call these keyword clusters “concepts.”

Concepts are a much better way to think about keywords, because you can use many different sets of keywords to describe a specific concept.

So if you weren’t already targeting concepts, and then grouping those concepts into topics, subtopics, and entities, now is the time to start.
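
For illustration, here's a minimal sketch of that clustering. The keyword list is hypothetical, and TF-IDF is a lightweight stand-in; an embedding model would capture semantic intent better:

    from sklearn.cluster import KMeans
    from sklearn.feature_extraction.text import TfidfVectorizer

    keywords = [  # in practice, thousands of queries from your search data
        "crm for small business", "best small business crm",
        "crm pricing comparison", "how much does a crm cost",
        "what is a crm", "crm definition",
    ]

    vectors = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(keywords)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

    for cluster_id in range(3):  # in practice, tune the number of clusters
        concept = [kw for kw, label in zip(keywords, labels) if label == cluster_id]
        print(f"Concept {cluster_id}: {concept}")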

Tools like our keyword universe and topic cluster technology are designed just for this. 

With the advent of Query Fan Out, this is far more important, which brings us to content:

Think content quality over quantity

It’s easier than ever to create massive amounts of content, but more of the same content is not the solution. We need more specific, higher-quality content. And probably more of it. 

And I'm sorry to inform you that we need to keep track of it, review it, and refresh it if we want to keep it relevant and useful.

For years, every company focused on SEO has been creating the same top-of-funnel content to generate page one visibility. This generic and duplicate content is providing less value than it ever did, to the point it’s now providing close to zero, if not negative, value.

As I wrote last year in How to Improve SEO Content in a Post-AI World, Google has a huge competitive advantage with its ability to sort through duplicate and spammy content to determine what content rises to the top.

As the amount of generic, general, or near-duplicate AI content increases, it will become increasingly difficult to get your content indexed, let alone ranked on page one.

It’s important to understand that LLMs use Google and Bing's indexes to identify and prioritize important content, surface all related content, and ground their responses.

AI Agents are now performing dozens or hundreds of searches on users' behalf to find the important content and concepts needed to address the very specific intent of the queries.  

Now, with Query Fan Out, this is happening automatically on your users' behalf, whether they know what an agent is or not. 

“As the cost of content gets closer to zero, the amount of content gets closer to infinite. We must use the new tools to help create higher-quality and much more specific content.”

More content is easy. 

More specific, higher-quality, original, engaging content is the new cost of traffic and conversions. 

As search marketers, we’ve long known that covering the “long tail” was the superior approach, but the amount of content needed was never-ending, and the value of a single piece was never worth the cost.

Over the last few years, we’ve found excellent success by creating Content Hubs that thoroughly cover a topic and all related subtopics, thereby covering many intents and purposes with less content. 

Now we can – and must – do this even better, with more thoroughly researched and original content than ever before, to stand out and become the citations and sources for the LLM answers.

With AI’s help, we can easily record conversations and interviews, turn them into transcriptions, and extract human-generated information quickly and even at scale. 

We can also quickly research personas and their fears and frustrations, and their internal, external, and philosophical problems.

We can map content to the buying cycle for every persona in every industry segment and generate first drafts very quickly for human processing and editing. 

The new problem is how to manage all this content and ensure it’s continually refreshed, accurate, up to date, and not duplicated!

Even more focus on distribution

It’s long been true in Google that if an identical piece of content is placed on multiple websites, the domain with the highest domain authority and the page with the most (and most relevant) internal and external links will rank the highest for all the searches related to that piece of content.

“Distribution and brand mentions are as crucial to AI tool visibility as they’ve always been to traditional SEO.”

With the increasing amount of content generated by AI, we’ll need to pay even more attention to links, brand mentions, and domain authority to make our content visible. And not just for Google indexing and ranking visibility but also for the LLMs and AI Agents accessing the Google index on our users' behalf. 

Just like users, AI Agents aren’t going to look past the top 20 results. With an increasing amount of content competing for those coveted spots, our content and the links that support it have to be that much better.

And as these agents consume the content that’s served up through the indexes, they’re also reviewing the brands that are mentioned, and surfacing those more often in their answers.

Distribution matters more than ever...

and...

Code matters but it’s not a silver bullet

The technical side of search mainly involves making sure you don’t get in the way of your content being indexed and understood.

There will always be developers, agencies, or tools that claim that this one technical solution will solve all your visibility problems.

It won’t. 

SEO has always been so much more than technical SEO. 

Don't believe the low-value SEO audits. Schema will not fix your website (although it will help), image alt tags won't increase your traffic (much), and adding unique meta descriptions to low-value pages will not do anything.

Just don't shoot yourself in the foot.

Don’t exclude LLMs from accessing your content! It’s as foolish as excluding Google.

And avoid putting crawl delays in your robots.txt. 
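
A quick way to check both, using only Python's standard library, is a sketch like this. Swap in your own domain; the user-agent strings are the publicly documented crawler names, but verify them against each vendor's current docs:

    from urllib.robotparser import RobotFileParser

    AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

    rp = RobotFileParser("https://www.example.com/robots.txt")  # your domain here
    rp.read()

    for agent in AI_CRAWLERS:
        allowed = rp.can_fetch(agent, "https://www.example.com/")
        delay = rp.crawl_delay(agent)  # None means no Crawl-delay directive
        print(f"{agent}: allowed={allowed}, crawl_delay={delay}")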

Mainly, ensure you have professional guidance that understands and considers the big picture of information retrieval, not just the technical side. And don’t assume the next shiny thing (like llms.txt) will solve all your problems (it won’t, either).

Pay more attention to paid

Google needs to protect and grow its ad revenue. Search is becoming increasingly pay-to-play, and we can expect it to stay that way.

Google's search ad revenue grew 12% year-over-year while organic clicks declined.

At the same time, the average Google CPC increased 13% year-over-year to $5.26.

There is less space for advertising, so the price per click is rising; Google is making sure there is more budget chasing the same or smaller inventory.

Google is relentlessly pushing its AI products to advertisers with the ‘promise’ of targeting the right person at the right time in a way only AI can deliver.

It sounds great, but this will work best for broad-market consumer products where behavior and demographics provide enough signal for their statistical models to make a huge difference.

The reality for lower-volume, very specific requirements, where intent and firmographics matter much more (like in B2B products)—and where Google has much less information—is that most of these new AI products, like Performance Max, simply don’t deliver. Instead, they waste our budgets and drive an enormous volume of useless traffic.

The same concepts apply to organic search here as well (Unsurprisingly, since there is only one “search”).

We must ensure we’re targeting only the most specific intent and concepts (keywords).

We must identify what’s working through precise and accurate measurement. When we find it, we must maximize it by managing our impression share. We must also guard it by closely managing our search terms to keep Google’s continual intent creep at bay. And we must meet that specific intent with high-quality, engaging, relevant content.
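
As one way to operationalize that guarding, here's a minimal sketch that flags search terms containing none of your concept tokens as negative-keyword candidates. The file and column names are hypothetical; adapt them to your own search-terms export:

    import csv

    CONCEPT_TOKENS = {"crm", "pipeline", "b2b"}  # tokens that signal real intent

    with open("search_terms_report.csv", newline="") as f:
        for row in csv.DictReader(f):
            term = row["search_term"].lower()
            if not CONCEPT_TOKENS & set(term.split()):
                print(f"Negative-keyword candidate: {term!r} "
                      f"({row['clicks']} clicks, ${row['cost']})")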

Finally, shift your measurement in line with shifting realities

As search is changing, it makes sense to re-evaluate our measurement.

With zero-click results, search is shifting more and more from a direct-response channel toward an influence channel.

Start measuring visibility and influence

It makes sense to measure:

  • Visibility on page one of Google for all the topics, subtopics, and concepts that matter to us. Stop focusing on a select few high-volume keywords and start looking at the big picture

  • Visibility on Bing, since ChatGPT uses Bing (as well as Google) for searches because of OpenAI's partnership with Microsoft

  • Changes to branded search, and consider what is influencing these changes

  • Direct traffic to pages other than your home page. With so much referral tracking breaking down, visits landing directly on deeper pages reveal the intent of the visit and give you more intelligence on which content is working, especially in LLMs.

One helpful way to measure all of your traffic is to group it by topic and subtopic, so that you can see how your content is influencing traffic and conversions regardless of the channels.
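
Here's a minimal sketch of that roll-up, assuming a hypothetical analytics export and URL structure:

    import pandas as pd

    TOPIC_MAP = {  # URL path prefix -> topic (hypothetical site structure)
        "/crm-basics/": "CRM fundamentals",
        "/pricing/": "Pricing",
        "/integrations/": "Integrations",
    }

    def topic_for(path: str) -> str:
        return next((topic for prefix, topic in TOPIC_MAP.items()
                     if path.startswith(prefix)), "Other")

    # Expected columns: landing_page, channel, conversions (adapt to your export).
    df = pd.read_csv("sessions_export.csv")
    df["topic"] = df["landing_page"].map(topic_for)

    print(df.groupby("topic").agg(sessions=("landing_page", "size"),
                                  conversions=("conversions", "sum")))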

PureSEM organizes your visibility, traffic, web conversions, leads, and sales pipeline by topic and subtopic—right through your entire sales and marketing funnel.

Continue measuring pipeline contribution

As B2B marketers, it’s still crucial to evaluate all channels against their efficiency in creating qualified leads, sales pipeline, and revenue.

While first click/first touch tracking has become more difficult every year, it’s still an essential tool for budget allocation. 

The rising use of desktop apps for AI, along with more and more new surfaces, has introduced yet another place where referral-source tracking breaks down. (A visit from a link clicked in a standalone app, i.e., outside a web browser, will show up as direct.)
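
Some of these visits can still be rescued. Several AI tools tag outbound links with UTM parameters (ChatGPT's web links, for example, often carry utm_source=chatgpt.com), and in-browser visits still send a referrer. Here's a minimal sketch; the domain list is illustrative, not exhaustive, and whatever it catches should be treated as a floor, not a complete count:

    from urllib.parse import parse_qs, urlparse

    AI_SOURCES = ("chatgpt.com", "chat.openai.com", "perplexity.ai",
                  "gemini.google.com", "copilot.microsoft.com")

    def classify(referrer: str, landing_url: str) -> str:
        """Bucket a visit as ai_tool, referral, or direct."""
        utm = parse_qs(urlparse(landing_url).query).get("utm_source", [""])[0]
        if utm in AI_SOURCES or any(d in referrer for d in AI_SOURCES):
            return "ai_tool"
        return "referral" if referrer else "direct"

    print(classify("", "https://example.com/blog/?utm_source=chatgpt.com"))  # ai_tool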

So we just have to remember that first-click tracking should be looked at directionally rather than absolutely, so that we don’t underinvest in channels that are difficult to measure. The full impact of our trackable channels is almost certainly much larger than we can measure.

We need to measure visibility and correlate increases in visibility to changes in branded and direct traffic. 

Stay current with advances in LLM tracking

There is a lot of hype around new tools to track brand mentions and visibility within AI tools, and this is worth paying attention to, but we have to remember these two important things:

  1. These tools are in their infancy.

AI tools like ChatGPT, Perplexity, etc., don’t share any usage data, so these measurements aren’t necessarily accurate. Just because you can make up a bunch of prompts and check the responses every day doesn’t mean that’s what actual users are seeing.

  2. What are we going to do with this information?

To improve our visibility in LLMs, we need to improve our visibility in Google and all the places online that matter to our users. This will be achieved by improving upon the hard-won strategies and tactics we already know.

The new game is the same as the old game

The new game is a lot like the old game. It’s just a bit harder and with better and more interesting technology. 

We have to keep our eyes open and follow our audience and traffic, but in two or three years, the fundamentals of search visibility will still matter. 

Domain authority, quality content, and user trust aren't going away. What's changing is the interface layer. 

We expect to see more and more AI-generated content curating the Internet for us. But the information presented to us and our future customers about our industries and our products will still largely come from the same sources: what the search engines crawl, index, and rank.

The game is still the game. It's just being played on a gradually shrinking field with evolving rules.

Are you looking to change how you navigate this transition?

Let's talk! PureSEM provides the tools and services to:

  • Plan, develop, and optimize your content strategy 

  • Measure influence and attribution connected to your sales pipeline

Schedule a Demo to see how we're helping B2B companies thrive in this transition.

1 Datos State of Search Q2 2025 report: https://datos.live/report/state-of-search-q2-2025/