Google AI Overviews are absorbing the clicks that used to go to organic results. One study of 300,000 keywords measured a 58% drop in click-through rate for the #1 position when an AI Overview appears. A separate analysis of 3,119 queries found a 61% drop across 25.1 million impressions. The numbers aren't subtle. If your keywords trigger AI Overviews, your organic traffic model has already changed.
The industry calls the practice of adapting to this shift GEO (Generative Engine Optimization), AEO (Answer Engine Optimization), LLM SEO, or AI SEO. Different names for the same work: structuring content so AI engines cite it.
The CTR collapse is real and getting worse
A study of 300,000 keywords, comparing Google Search Console data from December 2023 (pre-rollout) with December 2025, found that position-one CTR for keywords with AI Overviews fell from 7.3% to 1.6%. For keywords without AI Overviews, the same position dropped from 7.6% to 3.9%. Both declined, but the AI Overview group ended the period at less than half the CTR of the non-overview group.
A separate dataset across 42 organizations tells the same story from a different angle. Organic CTR on informational queries with AI Overviews dropped from 1.76% to 0.61% between June 2024 and September 2025. Paid CTR fell even harder, from 19.7% to 6.34%.
Here is the part that doesn't get enough attention: even queries without AI Overviews saw organic CTR fall 41% in that same dataset. Users are clicking less everywhere. AI Overviews accelerate an existing behavior shift; they don't create it from scratch.
Being cited inside the overview changes everything
The overall CTR drop is severe, but it's not uniform. Research across 3,119 search terms and 42 organizations found that when your site is cited inside an AI Overview, you get 35% more organic clicks and 91% more paid clicks compared to queries where you're not cited.
This creates a binary outcome that traditional SEO never had. In the old model, position #3 got fewer clicks than position #1, but it still got clicks. With AI Overviews, the gap between "cited in the overview" and "not cited" is closer to on/off than it is to a gradient.
Think about what this means for a query like "best project management software for remote teams." If the AI Overview cites three tools by name and links to their pages, those three capture the lion's share of remaining clicks. Everyone else, even if they rank on page one organically, fights over what's left after a 58-61% haircut.
AI Overviews are expanding beyond informational queries
The early assumption was that AI Overviews would stay confined to simple informational queries. "What does tax loss harvesting mean" or "how does photosynthesis work." That assumption is already wrong.
An analysis of 10 million keywords tracked the intent distribution of queries triggering AI Overviews throughout 2025. In January, 91.3% of AI Overview queries were informational. By October, that share had fallen to 57.1%. The shift went to commercial, transactional, and navigational queries:
- Commercial intent queries grew from 8% to 19% of AI Overview appearances
- Transactional queries grew from 2% to 14%
- Navigational queries grew from under 1% to over 10%
BrightEdge's data makes the commercial expansion concrete. "Best [product]" queries, the kind that drive affiliate revenue and buyer's guide traffic, now trigger AI Overviews 83% of the time. That's up from 5% a year earlier. If your business depends on ranking for "best sunscreen for oily skin" or "best preschool in Bangalore," the SERP you optimized for no longer exists.
The coverage numbers keep climbing
As of March 2026, BrightEdge reports AI Overviews appearing on roughly 48% of all tracked queries, up 58% from December 2025. Conductor measured 25.1% across 21.9 million queries in Q1 2026. The variance depends on methodology and query mix, but the direction is consistent: up.
And this is before factoring in Google AI Mode, which launched in May 2025 and now has over 75 million daily active users. AI Mode is a full conversational search interface, not just a snippet at the top. Ninety-three percent of AI Mode searches end with zero clicks to external sites.
The trajectory is clear. More queries will trigger AI-generated answers. More of those answers will cover commercial topics. And the share of traffic going to traditional organic results will keep shrinking. This is also why analyzing AI Overviews in isolation is insufficient. Google AI Mode, ChatGPT, and Claude each have different citation behavior for the same query. ChatGPT retrieves from Bing's index, not Google's, so a brand cited in AI Overviews may be invisible in ChatGPT. Yext's analysis of 6.8 million citations found only 11% domain overlap between ChatGPT and Perplexity, confirming that each engine is effectively a separate channel.
What this means if you rely on organic traffic
If your customer acquisition depends on Google organic traffic, you have two problems, not one.
Problem one: your existing traffic is declining. Every keyword where an AI Overview appears represents a 58-61% CTR haircut to your organic position. This is already happening in your Google Search Console data, though it may be masked by seasonal trends or other channels.
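You can check this in your own data. The sketch below is a minimal example, assuming a hypothetical Search Console export (`gsc_export.csv` with `query`, `clicks`, and `impressions` columns) and a hypothetical file listing the queries you have observed triggering AI Overviews (`aio_queries.txt`, one per line). Both file names and column headers are assumptions, not a standard format:

```python
import csv

def ctr_by_segment(gsc_csv, aio_list):
    """Compare aggregate CTR for keywords that trigger AI Overviews
    versus keywords that don't, from a GSC performance export."""
    # Queries known to trigger AI Overviews (hypothetical input file,
    # one query per line), normalized for matching.
    with open(aio_list) as f:
        aio = {line.strip().lower() for line in f if line.strip()}

    segments = {"aio": [0, 0], "no_aio": [0, 0]}  # [clicks, impressions]
    with open(gsc_csv, newline="") as f:
        for row in csv.DictReader(f):
            key = "aio" if row["query"].strip().lower() in aio else "no_aio"
            segments[key][0] += int(row["clicks"])
            segments[key][1] += int(row["impressions"])

    # Aggregate CTR per segment (0.0 when a segment has no impressions).
    return {
        k: (clicks / imps if imps else 0.0)
        for k, (clicks, imps) in segments.items()
    }
```

Run it against two date-range exports, for example pre-rollout and current, to see whether the AI Overview segment is declining faster than the rest of your keywords.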
Problem two: AI Overviews are not your only AI search problem. Most coverage of AI Overviews treats them in isolation. That is a mistake. A brand's ChatGPT visibility is usually completely different from its AI Overview visibility, and both differ from Claude. We call this the engine divergence problem, and it is one of the first things CiteGap audits expose.
We audited a mid-size D2C brand that ranked on page one for most of their target keywords. Their organic traffic had dropped 23% over six months, and they assumed it was an algorithm update. When we checked which of their keywords triggered AI Overviews, the answer was over 60%. But the more revealing finding came from running the same queries across ChatGPT and Claude. The brand was cited in Google AI Overviews for 4 out of 10 target queries, in ChatGPT for 1 out of 10, and in Claude for 0. Three engines, three completely different visibility profiles, three different sets of content failures.
Google AI Overviews favored pages that already ranked well but had structured data and direct answers. ChatGPT ignored pages that lacked comprehensive, encyclopedic coverage with statistics. Claude, which retrieves via Brave Search rather than Google, pulled from an entirely different source pool. The brand's SEO playbook addressed one engine's requirements and accidentally ignored the other two. The divergence between ChatGPT and Google AI is the most common gap we find in audits. Fixing your AI Overview presence and assuming the rest follows is the same mistake as fixing Google rankings and assuming AI search is covered.
What this means for your brand
The brands most exposed are the ones treating AI Overviews as the entire AI search problem. They fix their Google AI presence and assume ChatGPT and Claude follow. They don't.
In CiteGap audits, we measure visibility independently across each engine. The typical finding is that brands have completely different citation profiles on each one. A brand might be mentioned and linked on Google AI (strong SEO gives them a head start) but mentioned without links on ChatGPT (which weights different content signals) and completely absent on Claude (which retrieves from Brave, not Google).
This engine divergence is why single-engine optimization is a trap. Every post about "how to get cited in AI Overviews" treats the problem as one-dimensional. The real problem is three-dimensional at minimum. Identifying which of your pages fail on which signals, for which queries, across which engines, is not something a manual spot check can reveal. A brand might check five queries on Google AI and see two citations, which feels like 40% coverage. CiteGap tests across 100+ queries per engagement, across ChatGPT, Google AI, and Claude, and consistently finds the real number is much lower once you account for query variation, engine differences, and non-deterministic AI responses.
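Because AI answers are non-deterministic, a single run per query is a noisy signal; citation coverage has to be estimated from repeated runs per engine. A minimal sketch of that aggregation, using entirely hypothetical run data (how you collect the runs is up to your tooling):

```python
from collections import defaultdict

def citation_rates(runs):
    """Estimate per-engine citation coverage from repeated test runs.

    runs: iterable of (engine, query, cited) tuples, where each tuple
    is one run of one query on one engine and `cited` is whether the
    brand was cited in that run's answer.
    Returns {engine: fraction of runs with a citation}."""
    tally = defaultdict(lambda: [0, 0])  # engine -> [cited runs, total runs]
    for engine, _query, cited in runs:
        tally[engine][0] += int(bool(cited))
        tally[engine][1] += 1
    return {engine: cited / total for engine, (cited, total) in tally.items()}
```

With one run per query, the same brand can appear to have 40% coverage one day and 20% the next; averaging over repeated runs per engine-query pair gives a far more stable estimate, and keeping the per-engine split makes the divergence between engines visible instead of blending it away.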
FAQ
How much does an AI Overview reduce organic clicks? A study of 300,000 keywords measured a 58% CTR reduction for position one when an AI Overview appears, comparing December 2023 to December 2025. A separate analysis found a 61% organic CTR drop across 3,119 queries and 42 organizations. Both studies used Google Search Console data.
Does being cited in an AI Overview help your traffic? Yes. Research across 3,119 search terms and 42 organizations found that sites cited inside an AI Overview receive 35% more organic clicks and 91% more paid clicks compared to queries where they are not cited. The difference between being in the overview and being below it is the single largest CTR factor.
Are AI Overviews only for informational queries? No. An analysis of 10 million keywords found that informational queries dropped from 91% to 57% of AI Overview triggers during 2025. Commercial queries grew from 8% to 19%, and transactional queries grew from 2% to 14%. BrightEdge found "best [product]" queries now trigger AI Overviews 83% of the time.
What percentage of Google searches show AI Overviews? It depends on the dataset. BrightEdge reports 48% of tracked queries as of March 2026. Conductor measured 25.1% across 21.9 million queries in Q1 2026. Coverage varies by industry and query type but is growing consistently.
Can I still get organic traffic if AI Overviews appear for my keywords? Yes, but less of it, and only if your content is structured for citation. Pages structured to answer the query directly and provide verifiable data are more likely to be cited in the overview and capture the remaining clicks. The specific changes vary by page, query, and engine.
CiteGap measures your visibility independently across Google AI, ChatGPT, and Claude, showing where each engine cites you, where it mentions you without linking, and where you are absent entirely. Request a consultation.