
Yesterday — 23 March 2026 · Search Engine Land

EU signals imminent decision on Google DMA probe

23 March 2026 at 20:36
Google vs. publishers: What the EU probe means for SEO, AI answers, and content rights

The EU’s top antitrust enforcer signaled that a decision on whether Google is violating the Digital Markets Act is imminent, without committing to a timeline.

What she said. “It will come,” Competition Commissioner Teresa Ribera told Dow Jones Newswires, adding that the cases are complex and that the commission is committed to decisions based on evidence and fair procedure.

The backdrop. The European Commission launched its probe into Google’s search business in March 2024 under the Digital Markets Act. The commission gave itself a soft 12-month deadline to wrap up — it has already fined Meta and Apple, but Google’s case remains unresolved nearly two years in.

The pressure is mounting. Eighteen lobby and civil society groups wrote to Ribera this month demanding clear remedies and a fine large enough to make non-compliance unprofitable.

  • The groups warned the commission’s credibility is on the line, noting Google controls over 90% of the EU search market.
  • “Every day without a decision is a day that European businesses are systematically disadvantaged,” the letter said.

Why we care. A ruling against Google under the Digital Markets Act could force major changes to how it operates search in Europe — potentially reshaping how ads are served, ranked, and priced in one of the world’s largest markets. If remedies include structural changes to search or ad tech, it could affect campaign performance, targeting, and competition dynamics across the board. If you have European audiences, watch this closely — the outcome could ripple through Google’s global ad ecosystem.

Meanwhile, this week. Ribera is in California meeting Sundar Pichai, Mark Zuckerberg, Sam Altman, and Amazon’s Andy Jassy before heading to Washington, D.C., for talks with the acting head of the Justice Department’s antitrust division.

The big picture. Google isn’t the only one in the crosshairs. The commission has additional open probes into how Google powers AI Overviews and ranks news publishers, and is separately investigating Meta over restrictions on rival chatbots using WhatsApp’s business software.

Bottom line. The EU has been slow to act on Google, but pressure is clearly building. When the decision lands, it could set a significant precedent for how the Digital Markets Act is enforced.

How AI-generated content performs in Google Search: A 16-month experiment

23 March 2026 at 20:00

With AI, you can generate dozens (if not hundreds) of articles in hours and publish at scale. But publishing is the easy part. What happens after they go live is what matters.

Together with the research team at SE Ranking, we ran a 16-month experiment to track how well AI-generated content performed on brand-new domains with zero authority.

As you will see, the results are hard to call a success.

Here’s the full story behind our experiment.

Methodology

The goal was simple: test how far AI content — with no human editing, rewriting, or enhancement — could go in search.

How quickly would it get indexed? Could it rank for relevant queries? Most importantly, could it drive traffic?

We started by purchasing 20 new domains with no backlinks, domain authority, brand recognition, or search history.

Each domain focused on a different niche, covering topics such as:

  • Arts & Entertainment
  • Business & Services
  • Community & Society
  • Computers & Technology
  • Ecommerce & Shopping
  • Finance & Accounting
  • Food & Drink
  • Games & Accessories
  • Health & Medicine
  • Industry & Engineering
  • Hobbies & Interests
  • Home & Garden
  • Jobs & Career
  • Law & Government
  • Lifestyle & Well-being
  • Pets & Animals
  • Science & Education
  • Sports & Fitness
  • Travel & Tourism
  • Vehicles & Boats

For each niche, we gathered 100 informational “how-to” keywords—long-tail terms with lower competition.

Each site received 100 AI-generated articles, totaling 2,000 pieces across the experiment.

After publishing, we added the sites to Google Search Console and submitted sitemaps.

From that point on, we left the sites untouched to observe performance over time.
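For readers unfamiliar with the sitemap step, the submitted files would follow the standard sitemaps.org protocol. A minimal sketch of generating one in Python — the domain and article slugs are placeholders, since the writeup doesn’t disclose the actual test domains:

```python
# Sketch: build a minimal sitemaps.org-style sitemap for one test site.
# The domain and slugs below are invented placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string for the given page URLs."""
    ET.register_namespace("", NS)  # emit the default sitemap namespace
    urlset = ET.Element(f"{{{NS}}}urlset")
    for u in urls:
        url_el = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url_el, f"{{{NS}}}loc").text = u
    return ET.tostring(urlset, encoding="unicode")

# 100 articles per site, as in the experiment
articles = [f"https://example-niche-site.com/how-to-{i}/" for i in range(1, 101)]
sitemap_xml = build_sitemap(articles)
```

In practice the resulting file is uploaded to the site root and submitted through Google Search Console’s Sitemaps report.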

Timeline & key results 

Month 1: indexing and early visibility

About 71% of new AI-generated pages were indexed within the first 36 days. They generated over 122,000 impressions and 244 clicks. Even at this early stage, 80% of sites ranked for at least 100 keywords each.

Months 2–3: growth continues

Cumulative impressions grew to over 526,000, with 782 clicks. Content continued to perform well without backlinks, promotion, internal linking, or additional SEO tactics.

Months 3–6: ranking collapse

By about three months, only 3% of pages remained in the top 100. Early relevance helped pages get indexed and briefly appear in search, but without authority, uniqueness, or E-E-A-T signals, rankings dropped sharply. Google still indexed the pages, but users rarely saw them.

Month 16: long-term stagnation

After over a year, visibility remained low across most sites. Impressions and clicks were minimal, and no site showed meaningful recovery. After the August 2025 Google spam update, pages ranking in the top 100 rose to 20% — up from 3% at six months.

Month 1: indexing and early visibility

Just over a month after publication (36 days), the first results came in — and they were stronger than expected for brand-new sites.

Of 2,000 articles, 70.95% were indexed (1,419 pages). For zero-authority domains, that’s notable, as getting new sites fully indexed is often a challenge. This shows Google is still willing to crawl and index AI-generated content in most cases.

Some sites performed particularly well. Eleven of the 20 domains had all 100 pages indexed.

  • Most were in broad, evergreen niches like Food & Drink, Home & Garden, Jobs & Career, and Lifestyle & Well-being.
  • More competitive or specialized areas, like Ecommerce & Shopping, saw slower indexation, likely due to stricter evaluation.

Along with indexation came early visibility. During this first month, the sites collectively generated:

  • 122,102 impressions
  • 244 clicks

Several niches stood out, generating more than 10,000 impressions in the first month alone.

  • Hobbies & Interests: 17,425 impressions
  • Business & Services: 17,311 impressions
  • Travel & Tourism: 13,598 impressions
  • Lifestyle & Well-being: 13,072 impressions
  • Law & Government: 11,794 impressions
  • Games & Accessories: 11,083 impressions
  • Vehicles & Boats: 10,677 impressions

In terms of keyword coverage, many sites performed surprisingly well within the first month. Eight sites ranked for more than 1,000 keywords, while another eight ranked for 100 to 1,000.

Even at this early stage, 80% of sites with fully AI-generated content appeared in search for hundreds or thousands of queries.

Notably, over 28% of ranking URLs were already in the top 100. Within the first month, many pages reached positions where searchers could see them.

Overall, these results show AI-generated content can gain traction quickly—even without backlinks, editorial input, or additional SEO work. In the short term, content alone was enough to get indexed and appear in search.

Months 2–3: growth continues

This early visibility wasn’t short-lived. Over the following weeks, impressions and clicks kept growing as Google Search discovered and tested pages.

By about two and a half months after publication, cumulative results across all sites had grown:

  • Impressions: 122,102 to 526,624
  • Clicks: 244 to 782

Keyword coverage also expanded:

  • 12 sites ranked for 1,000+ keywords (up from 8 in the first month).
  • The remaining 8 sites ranked for 100–1,000 keywords.

This pattern is typical for new sites. When Google finds fresh content that matches real queries, it tests that content across results. Pages appear for related queries as Google evaluates their helpfulness.

That’s what happened here. Even without backlinks, internal linking, or SEO improvements, the content gained exposure because it targeted low-competition queries and followed basic SEO structure.

At this stage, it could look like a strong case for large-scale AI content. The sites were new, the content fully AI-generated, and impressions kept rising.

But the growth didn’t last.

Months 3–6: the ranking collapse

Around Feb. 3, 2025, roughly three months after publication, the experiment hit a turning point.

  • Only 3% of pages remained in the top 100, down from 28% in the first month. 

In practical terms, the content remained indexed but rarely appeared where users could see it.

Early relevance can help pages get indexed and appear in search results for a time. Without stronger signals — authority, E-E-A-T, unique insights — those rankings are hard to sustain.

By the six-month mark, Google Search Console showed the following cumulative totals across all sites:

  • Impressions: 526,624 to 706,328 
  • Clicks: 782 to 1,062

At first glance, these numbers suggest continued growth. But that’s not what happened.

Most activity occurred early. In the first 2.5 months, the sites generated roughly 70% to 75% of total impressions and clicks. Over the next 3.5 months, growth slowed sharply, adding only 25% to 30%.
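That early-versus-late split follows directly from the cumulative totals above; a quick back-of-envelope check:

```python
# Cumulative totals reported by Google Search Console in the experiment
impressions_at_2_5_months = 526_624
impressions_at_6_months = 706_328
clicks_at_2_5_months = 782
clicks_at_6_months = 1_062

# Share of all 6-month activity that happened in the first ~2.5 months
early_impression_share = impressions_at_2_5_months / impressions_at_6_months
early_click_share = clicks_at_2_5_months / clicks_at_6_months

print(f"{early_impression_share:.1%}")  # 74.6%
print(f"{early_click_share:.1%}")       # 73.6%
```

Both shares land in the 70%–75% range cited above.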

Month 16: the long-term picture

The experiment ran for over a year to see if rankings would recover.

For the most part, they didn’t.

After the drop around the three-month mark, visibility remained extremely low for the rest of the experiment.

There were a few brief fluctuations. The most notable came in late August 2025.

Starting in August, 50% of sites (10 out of 20) saw a two-week spike in impressions. This closely aligned with the rollout of the Google August 2025 spam update, which began Aug. 26.

However, the boost didn’t lead to a sustained recovery.

Among the sites that saw a short-term lift:

  • Six quickly lost visibility and returned to prior lows
  • Four maintained slightly improved performance, similar to early post-publication levels

Following the update, pages ranking in the top 100 rose to 20% — up from 3% at six months. This remained below the 28% seen in the first month, but the August 2025 spam update appeared to have improved some rankings.

In total, 66.9% of pages were still indexed, up slightly from 61.45% at six months.

The following sites had some of the lowest numbers of indexed pages:

  • Finance domain (9 of 100)
  • Health domain (14 of 100)

This is likely due to their YMYL nature, where Google applies stricter quality and trust standards.

By month 16, cumulative results across all sites were:

  • Impressions: 706,328 to 1,092,079
  • Clicks: 1,062 to 1,381

Most impressions still came from the early growth phase, before rankings dropped.

Why SEO visibility didn’t last

The most obvious explanation is that the content didn’t meet Google’s quality standards — and understandably so.

The 2,000 articles lacked many signals Google uses to assess quality and trust:

  • Authority. No backlinks or external validation. Without these, new domains struggle to compete with established sites.
  • Expertise and credibility. No authors, credentials, or real-world expertise — especially critical in finance, health, and law.
  • Content differentiation. Much of the content resembled what already exists. Without unique insights, pages struggle to stand out.
  • Site structure. No internal linking, topical organization, or clear hierarchy to help Google understand page relationships.

Google can identify AI-generated patterns. Without authority, uniqueness, or supporting signals, early visibility declines.

Bonus insight: how new AI content supports existing pages

In early March 2026, we ran a follow-up experiment, adding new AI-generated content to eight tracked sites.

As of March 13, not all new content has been indexed. However, sites with new content already show a noticeable increase in search impressions.

Interestingly, this lift comes primarily from older posts, not the newly published ones.

For example:

  • Business-focused website: from 458 impressions in February 2026 to 7,750 in March 2026, a 17x increase.
  • Law-focused website: from 19 impressions in February 2026 to 356 in March 2026, a 19x increase.
  • Science-focused website: from 34 impressions in February 2026 to 633 in March 2026, a 19x increase.

This experiment shows that publishing new content—even fully AI-generated—can lift traffic to older pages that had been stagnant for months. Fresh content may signal to Google that the site is active and up to date, giving the site a temporary boost.

However, these are early results and don’t guarantee lasting gains in rankings or traffic.
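The multipliers quoted for the follow-up experiment can be reproduced from the raw impression counts:

```python
# February 2026 -> March 2026 impression jumps from the follow-up experiment
sites = {
    "Business": (458, 7_750),
    "Law": (19, 356),
    "Science": (34, 633),
}

for name, (feb, mar) in sites.items():
    lift = mar / feb
    print(f"{name}: {lift:.0f}x")  # Business: 17x, Law: 19x, Science: 19x
```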

Key takeaway: AI can speed up content creation, but not replace SEO

The results of this 16-month experiment don’t mean AI content is useless. They show AI alone isn’t enough to drive lasting impact.

Early traffic and impressions may look promising, but without a clear SEO strategy and human guidance, those gains will likely fade within a few months.

Google Ads API to block duplicate Lookalike user lists

23 March 2026 at 19:39

A quiet but important change is coming to the Google Ads API that will affect how advertisers and developers create Lookalike user lists, especially for Demand Gen campaigns.

What’s changing. Google will enforce a uniqueness check on Lookalike user lists, blocking duplicate lists with the same seed lists, expansion level, and country targeting. Attempts to create a duplicate will return an API error after April 30.

Why we care. If you use automated scripts or third-party tools to generate audience lists, an unhandled error could quietly break your campaign workflows if you don’t update integrations in time.

What you need to do.

  • Audit existing Lookalike lists and reuse ones that already match your intended configuration rather than creating new ones
  • Update your API error handling to catch the new DUPLICATE_LOOKALIKE error code in v24 and above, or RESOURCE_ALREADY_EXISTS in earlier versions
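A sketch of what that second step can look like. Only the error-code strings come from Google’s announcement; the exception class and function names here are illustrative stand-ins, not the actual google-ads client API:

```python
# Sketch: tolerate the new duplicate-list errors when creating a
# Lookalike user list. DUPLICATE_LOOKALIKE (API v24+) and
# RESOURCE_ALREADY_EXISTS (earlier versions) are the announced codes;
# everything else is a hypothetical stand-in for your client wrapper.

DUPLICATE_CODES = {"DUPLICATE_LOOKALIKE", "RESOURCE_ALREADY_EXISTS"}

class AdsApiError(Exception):
    """Stand-in for the error type your Google Ads client raises."""
    def __init__(self, code: str):
        super().__init__(code)
        self.code = code

def create_or_reuse_lookalike(create_fn, find_existing_fn):
    """Try to create a list; on a duplicate error, reuse the match."""
    try:
        return create_fn()
    except AdsApiError as err:
        if err.code in DUPLICATE_CODES:
            # A list with the same seed lists, expansion level, and
            # country targeting already exists -- look it up and reuse it.
            return find_existing_fn()
        raise  # any other error is a real failure
```

In a real integration you would inspect the error codes attached to the exception raised by the client library rather than a custom exception, but the control flow — catch the duplicate, fall back to the existing list — is the same.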

Bottom line. This is a housekeeping change to keep Google’s systems stable, but the April 30 deadline is firm. If you manage campaigns programmatically, treat this as a technical to-do before the end of April.

Google’s announcement. Upcoming changes to Lookalike user lists in the Google Ads API, starting April 30, 2026

ChatGPT ads pilot leaves advertisers without proof of ROI

23 March 2026 at 19:27

OpenAI is moving forward with ads in ChatGPT, but early adopters say it isn’t ready for serious performance marketing.

The big picture. ChatGPT’s ad product shares almost no data, lacks automated buying tools, and offers minimal targeting—leaving advertisers with little ability to measure whether their spend is doing anything, The Information reported.

What advertisers are dealing with. SEO consultant Glenn Gabe outlined the issues:

  • No automated way to buy ad space — deals happen over calls, emails, and spreadsheets.
  • No meaningful performance data to evaluate outcomes.
  • Two agency executives told The Information they couldn’t prove the ads drove measurable business results for clients.

Why we care. If you’re considering ChatGPT as an ad channel, the lack of performance data means you’re spending blind — with no reliable way to prove ROI to clients or stakeholders. As OpenAI prepares to scale ads to all U.S. free users, the audience will grow, but measurement tools haven’t caught up. If you jump in now, keep expectations tight and treat it as experimental budget, not a performance channel.

What’s coming. OpenAI told advertisers it plans to show ads to all U.S. users on free and low-cost ChatGPT tiers in the coming weeks — a major expansion. It also advised that performance may improve if you supply more variations of text and visual creative.

The irony. OpenAI builds some of the world’s most sophisticated AI, but its ad reporting tools are stuck in the spreadsheet era.

Bottom line. ChatGPT ads are about to reach a much larger audience, but there’s no way to prove they have value yet. If you enter now, you’re largely flying blind — and paying for it.

Credit. Gabe shared highlights from The Information‘s article (subscription required) on X.

Why zero-click search doesn’t mean zero influence

23 March 2026 at 19:00

In a recent keynote at the Industrial Marketing Summit, Rand Fishkin argued that we’re marketing in a “zero-click world.” His observation captures an important surface-level trend: fewer users are clicking through to websites.

The deeper shift, however, is structural. What has changed is the way information is evaluated, repeated, and trusted across the web — and that’s where many are drawing the wrong conclusion.

As clicks decline, it can look like websites matter less. In reality, their role in shaping what gets seen and trusted may be increasing.

Why ‘zero-click’ discussions often lead to the wrong conclusion

From a traffic perspective, the trend is unmistakable. Clicks are declining in many contexts.

  • Search engines now answer many questions directly on the results page.
  • Social platforms function as discovery engines where people research ideas, products, and services without leaving the platform.
  • AI assistants synthesize answers from across the web before a user ever sees a list of links.

Part of the reason the zero-click discussion resonates so strongly is that it disrupts the way we’ve historically measured visibility. For more than two decades, traffic and click-through rates have served as the primary signals for forecasting performance and evaluating the impact of search.

When answers appear directly in search results, AI summaries, or platform conversations, those interactions often occur outside the analytics frameworks we’re accustomed to using.

The conclusion many draw from this trend — that websites matter less — is an incomplete assessment. The role of websites is changing, but their importance in the information ecosystem hasn’t disappeared. In some ways, it may be increasing.

The reason has to do with how modern information systems determine what to trust. Large language models and AI-driven search interfaces don’t evaluate truth the way humans do. They rely on probabilistic signals drawn from the information available across the web.

When the same message appears consistently across multiple independent sources, the statistical likelihood that the information is correct increases. Visibility in this environment is determined by where information appears.

Dig deeper: Why surface-level SEO tactics won’t build lasting AI search visibility

Fishkin is right about the trend

The fragmentation of discovery is real. Information consumption now happens across many environments: search results, social feeds, community forums, video platforms, and AI interfaces.

Users frequently encounter answers without needing to click a link. 

  • A search result might contain an AI summary. 
  • A product recommendation might appear in a Reddit thread. 
  • A professional insight might circulate on LinkedIn.

From a traditional web analytics perspective, these interactions can appear as lost traffic. However, focusing exclusively on clicks misses the more important question: where does the information itself originate?

The environments where people consume information are expanding, but the underlying knowledge those systems rely on still has to come from somewhere.

Zero-click doesn’t mean zero influence

The critical distinction you need to understand is the difference between traffic and information influence.

  • Traffic measures whether a user visited your website. 
  • Influence measures whether the information you produced shaped the answer someone received.

AI systems don’t generate answers out of thin air. They construct them from patterns learned across the open web.

When an LLM answers a question about a legal issue, a technical concept, or a marketing strategy, it draws on the analysis, explanations, and original thinking that publishers have already placed online.

Even in a zero-click environment, those sources continue to exist. They continue to shape the answers. The difference is that influence increasingly occurs earlier in the information pipeline, before the user even reaches a website.

Fewer clicks don’t mean fewer sources. In practice, this shift often increases the value of authoritative sources because AI systems depend on them to construct coherent responses. Without expert explanations, detailed analysis, and original insight, there’s nothing for the system to synthesize.

Dig deeper: Is SEO a brand channel or a performance channel? Now it’s both


The role of ‘rented land’

In discussions that follow the “zero-click world” framing, the recommendation is that brands should focus more heavily on platforms they don’t control — social networks, communities, and other forms of “rented land.”

Brands can think of their visibility footprint as two categories of territory: 

  • Owned land, where they control the infrastructure and content.
  • Rented land, where their message appears on platforms they do not control.

Owned land includes assets such as a company website, product documentation, knowledge bases, and other first-party content environments. These are places where a brand controls the structure, the message, and the permanence of the information.

Rented land includes platforms such as LinkedIn, Substack, industry publications, forums, podcasts, and social media environments where the brand participates but does not control the underlying platform.

In an AI-mediated discovery environment, both types of territory matter. Owned land provides the canonical source of information. Rented land distributes that information across the broader ecosystem where AI systems encounter it.

These platforms are powerful environments for discovery, amplification, and conversation. They are often where audiences encounter brands for the first time and where ideas circulate widely. However, they rarely serve as the place where authority itself is established.

Authority tends to emerge from deeper forms of publishing: 

  • Long-form explanations.
  • Original analysis.
  • Research.
  • Consistent demonstrations of expertise over time. 

These forms of content typically live on first-party websites, where ideas can be developed fully and preserved as reference points. Rented platforms still influence how AI systems interpret information, but their role differs from that of first-party publishing. 

When a brand, concept, or explanation appears consistently across multiple environments — first-party sites, industry publications, social platforms, and other third-party mentions — the association between that entity and the idea becomes stronger.

Repeated exposure stabilizes the relationship between the brand and the concepts connected to it. As a result, the likelihood that the brand will be included in an AI-generated answer increases.

Platforms amplify the signal. First-party publishing is where the signal originates.

Dig deeper: How paid, earned, shared, and owned media shape generative search visibility

Why AI often favors primary sources

Another misconception in the zero-click discussion is the assumption that AI systems primarily rely on aggregated or repackaged information. In practice, the opposite often occurs. 

When AI systems generate answers, they frequently rely on sources that provide clear explanations, detailed reasoning, and subject-matter expertise. These characteristics are more common in original publishing than in aggregated content.

Legal blogs, technical documentation, research publications, and expert commentary often perform well in AI citations because they provide usable knowledge. The material contains context, reasoning, and structured explanations that models can extract and synthesize.

Aggregated summaries frequently lack that depth. Without detailed explanation or original analysis, the content provides limited value for AI systems attempting to construct coherent answers.

The result is a quiet shift in visibility. Domains that consistently publish authoritative explanations may become more influential in AI-generated answers, even if traditional click-based metrics decline.

The real shift you should understand

Websites still matter, but their role is changing. They’re no longer just traffic generators.

In an AI-mediated information ecosystem, websites function as knowledge sources, training signals, and citation anchors — where expertise is documented, and ideas originate.

Platforms distribute those ideas, conversations amplify them, and AI systems synthesize them into answers. The source of the underlying knowledge, however, still matters.

The marketing implication is straightforward. Success can’t be measured solely by clicks. The objective is to ensure that credible expertise exists in durable forms that can be discovered, referenced, and synthesized wherever information surfaces — whether in search results, AI-generated responses, or discussions on other platforms.

Content that is clear, authoritative, and genuinely useful will continue to shape the answers people receive. In a zero-click world, influence simply happens earlier in the information pipeline.

Dig deeper: Content marketing in an AI era: From SEO volume to brand fame

Why ‘search everywhere’ is the new reality for SEO

23 March 2026 at 18:00

Most SEO discussions today center on AI — from AI Overviews to ChatGPT and other LLMs — and the concern that they’re taking traffic from business websites, forcing a shift toward GEO or AEO.

For the most part, that concern is valid. AI is reducing traffic for many sites, especially those that rely on top-of-funnel, informational content. But the data suggests AI may not be the biggest shift.

User behavior has been fragmenting across platforms for years, and I see this play out in agency work every day.

Here’s what the data shows about how search behavior is changing across platforms, and why a “search everywhere” strategy matters more than focusing on LLMs alone.

Third-party platforms are encroaching on traditional search

People search TikTok for restaurants, YouTube for tutorials, Reddit for authentic reviews, and Amazon to buy products. In many cases, these platforms are replacing traditional search engines like Google and Bing as the starting point.

This shift isn’t just about behavior — it shows up in traffic, too. Amazon and YouTube still drive far more desktop traffic than ChatGPT, a trend Rand Fishkin recently highlighted.

Recently, I helped run a comprehensive share of voice analysis for a client. The goal was threefold:

  • See which competitors are winning in traditional search across multiple service lines.
  • Find keyword and content gaps.
  • Create a content roadmap based on priority to fill these gaps.

The analysis revealed a lot of helpful data, but one of the most interesting takeaways was that our core competitors weren’t actually our biggest competitors in traditional search. YouTube and Reddit were.

[Image: Share of voice analysis - client example]

These platforms rank well in traditional search, take up valuable SERP real estate, and move users away from Google and Bing to funnel them back to their own platforms.

The analysis highlighted a key point: if you don’t focus any effort on these places, you’re not only missing out on visibility in traditional search, but you’re also missing valuable attention when users navigate off Google and start watching videos or reading threads.

And this website isn’t the only one seeing this type of trend. Do this type of analysis yourself, and see who your actual competitors are within traditional search. The answers may surprise you.
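The core of that analysis can be sketched simply: given which domains hold top-10 positions for each tracked keyword, count appearances per domain. The data below is invented for illustration — in practice it comes from a rank tracker — but it shows how a platform like YouTube can surface as your real SERP competitor:

```python
from collections import Counter

# Toy input: domains ranking in the top 10 for each tracked keyword
# (invented data -- in practice this comes from rank-tracking exports).
serp_data = {
    "fix leaky faucet": ["youtube.com", "reddit.com", "homedepot.com"],
    "best pipe wrench": ["reddit.com", "youtube.com", "competitor.com"],
    "water heater cost": ["youtube.com", "competitor.com", "angi.com"],
}

def share_of_voice(serps: dict) -> dict:
    """Percent of tracked top-10 slots each domain occupies."""
    counts = Counter(d for domains in serps.values() for d in domains)
    total = sum(counts.values())
    return {d: round(100 * n / total, 1) for d, n in counts.most_common()}

print(share_of_voice(serp_data))
# youtube.com tops the list despite not being a "direct" competitor
```

A production version would weight positions (a No. 1 ranking is worth more than a No. 10) and multiply by keyword search volume, but even this raw count exposes which domains dominate your SERPs.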

Dig deeper: Why social search visibility is the next evolution of discoverability

Third-party platforms can have higher search volumes

As seen above, platforms like YouTube and Reddit are increasingly occupying traditional SERP real estate. But what about searches within the platforms themselves? Depending on the query, there may be far more search volume on these platforms than on Google or Bing.

For example, YouTube dominates in tutorials and “how-to” content. A term like “how to fix a leaky sink faucet” gets 15x more search volume on YouTube than it does in traditional search globally.

[Image: search volume for “how to fix a leaky sink faucet” in traditional search (source: Semrush)]
[Image: search volume for “how to fix a leaky sink faucet” on YouTube (source: vidIQ)]

Search volumes are estimates. But if you want to get in front of the right people where they’re searching, any content strategy around a term like this, or a similar topic, must include creating a YouTube video.

Better yet, to be search-everywhere-friendly, create a blog post and embed that video in it.

Dig deeper: YouTube is no longer optional for SEO in the age of AI Overviews


Third-party platforms are cited more in LLMs

Aside from traditional search and in-platform search, we also know that “search everywhere” influences AI-generated results.

To provide answers, LLMs need content to synthesize. More often than not, that content isn’t coming from business websites, but from third-party sources and social platforms.

AI visibility tools can quickly show businesses the power of search everywhere in relation to citations. Take a look at these examples:

[Image: LLM citation source breakdown - Brand A]
[Image: LLM citation source breakdown - Brand B]

These are two completely different brands, yet the trends are the same: a very small percentage of citations come from your own website or even direct competitors.

In both examples, almost 90% of citations come from third-party news and online publications, or social and forum platforms like Reddit or Quora.

The takeaway here is that focusing on your own website, in the context of LLM citations, can only go so far. If you want to improve brand sentiment or ensure that information is accurately reflected by AI, it needs to happen in places outside of your direct control.

Dig deeper: SEO’s new battleground: Winning the consensus layer

Start investing in search everywhere today

The competitive landscape is shifting, and many marketers have tunnel vision when it comes to AI. Discovery now happens across a wide range of platforms.

YouTube, Reddit, Quora, and others dominate significant portions of traditional search results and may have far more search activity within their own platforms. When AI systems generate answers, they often pull information from these platforms rather than brand websites.

To win in modern search, you need to understand where your audience is actually searching. That doesn’t stop at Google. It means showing up everywhere that shapes decisions.

AI is squeezing marketing agencies from both sides

23 March 2026 at 17:00
AI promised efficiency, but it’s squeezing agency margins instead

The numbers tell a story that most agency owners already know in their gut: AI anxiety is rising fast.

In 2024, 44% of digital marketing agencies viewed AI as a significant threat to their business model. Just one year later, that number jumped to 53%, according to SparkToro’s annual State of Digital Agencies survey of hundreds of agency owners worldwide.

But here’s what makes this particularly painful: agencies aren’t just watching AI disrupt their industry from the sidelines. They’re actively using it themselves, automating tasks, reducing costs, and hoping to improve margins. All while their clients are doing the exact same thing, using AI to justify slashing budgets or bringing work in-house entirely.

It’s a squeeze play from both directions, and agencies are caught right in the middle.

The promise that became a problem

When AI tools like ChatGPT and Claude first exploded onto the scene, many agency leaders saw opportunity. 

Finally, a way to automate the repetitive, time-consuming work that ate into profitability. Content briefs, initial drafts, performance reports, basic ad copy: all could be accelerated or partially automated. The math seemed simple: use AI to do more work with fewer people, pocket the difference, and stay competitive on pricing.

Except clients did the same math — and they reached a different conclusion. When brands can spin up decent content, analyze campaign performance, or generate ad variations with a few prompts, the question becomes unavoidable: why are we paying an agency for this?

“Several services that agencies once charged a premium for are now performed in-house or by automation software,” notes Al Sefati, CEO of Clarity Digital Agency, who’s been vocal about the pressures facing boutique agencies. 

Earlier this year, Sefati had clients “put marketing on pause” despite strong performance metrics. A manufacturing client backed out of a contract entirely due to tariff uncertainty. When budgets get tight and AI makes certain marketing tasks feel commoditized, agencies become an easy line item to cut.

The margin trap nobody talks about

Agencies adopt AI hoping to increase profits by doing more with less staff. But clients expect the cost savings to flow to them, not the agency’s bottom line.

The result? Shrinking retainers across the board.

SparkToro’s research shows that sales cycles are lengthening: more agencies now report deals taking 7-8 weeks or even 12+ weeks to close, up significantly from 2024.

Prospects are taking longer to commit because they’re doing their own internal math: “If AI makes this cheaper and faster, shouldn’t we pay less?”

Meanwhile, client expectations haven’t decreased at all. In fact, they’ve intensified.

Progress is no longer good enough. Brands now demand tangible business outcomes, pipeline impact, revenue attribution, and demonstrable ROI on every dollar spent.

So agencies are stuck: use AI to stay efficient and risk commoditizing their own services, or refuse to adopt it and get outpaced by competitors and in-house teams who will.

Dig deeper: Why AI will break the traditional SEO agency model

The junior talent crisis nobody’s preparing for

Perhaps the most concerning finding from the research: 66% of agency owners worry that junior team members will have fewer career opportunities in the future. This goes beyond entry-level headcount to the entire talent pipeline.

Historically, agencies have relied on junior staff to handle the repetitive, foundational work: keyword research, content optimization, reporting, and campaign setup. These weren’t glamorous tasks, but they were essential training grounds. Junior marketers learned the craft by doing the work, eventually graduating to strategy and client leadership.

AI is rapidly automating precisely those tasks. And while that might seem like a net positive for efficiency, it creates a devastating long-term problem: where do future senior strategists come from if there’s no ladder to climb?

The war for senior talent is brutal. Top strategists, creatives, and media planners know their worth and demand premium compensation. Meanwhile, clients push back on fees.

The math doesn’t work unless agencies can maintain lean teams, which AI theoretically enables.

But five years from now, when those senior people retire or move on, who replaces them? If an entire generation of marketers never got hands-on experience because AI was doing the work, the industry risks hollowing itself out.

What AI can’t replace yet

Despite the disruption, there’s a clear pattern in what’s working for agencies weathering this transition.

The research shows that larger agencies (51+ employees) are reporting healthier sales pipelines than their smaller counterparts. Part of this is resources: larger shops have dedicated sales teams and can absorb economic volatility better.

But there’s something else at play.

Agencies that are surviving, and in some cases thriving, are the ones who’ve stopped trying to compete on execution alone. They’re selling something AI can’t easily replicate: strategic thought, real-world market experience, nuanced storytelling, and intelligent execution tied directly to business outcomes.

“Clients desire teams that really understand their industry,” Sefati observes.

The trend is clear: specialization is no longer optional. Generalist “we do everything” agencies are struggling most. Those with deep vertical expertise in B2B SaaS, financial services, healthcare, and ecommerce are proving that context and strategic insight still command premium fees.

This matters because AI is phenomenal at pattern recognition and execution within known parameters. But it struggles with the messy, ambiguous work of understanding a client’s competitive position, reading market dynamics, or crafting positioning that actually resonates with a specific audience.

The problem? Many agencies haven’t made this transition yet. They’re still selling and delivering services that feel interchangeable with what AI, or a capable in-house team with AI, can produce.

Dig deeper: What successful brand-agency partnerships look like in 2026

The uncomfortable truth about commoditization

A few years ago, simply having the technical skill to launch a Google Ads campaign or set up marketing automation gave agencies an edge. That’s no longer true.

As martech platforms have matured and AI tools have grown more capable, more brands have built competent internal teams. The bar for what counts as “differentiated agency value” has risen dramatically.

This is why the sales pipeline data is so revealing. 

  • Only 14% of agencies describe their current pipeline as “very healthy.” 
  • Over half say it’s just “average.” 
  • 32% admit it’s “not good.” 

These numbers have improved marginally from 2024 (when 36% said “not good”), but we’re talking about incremental gains in a fundamentally challenged environment.

Smaller agencies, those with 1-10 people, are hit hardest. They typically lack dedicated sales staff, so business development competes with client delivery for founders’ time. And when budgets tighten, brands consolidate with larger, more specialized agencies that feel less risky.

How your agency can escape the squeeze

Focus on these priorities as client demands rise and margins tighten.

Be honest about what AI has commoditized

Don’t fight AI or pretend it doesn’t exist. Be brutally honest about what AI has already commoditized, and ruthlessly focus on what it can’t replicate.

This means making some uncomfortable decisions now. Stop competing on services that AI handles well enough. If you’re still selling basic content creation, social media management, or standard reporting as core offerings, you’re volunteering to be price-shopped. 

Instead, double down on the work that requires genuine expertise: deep market understanding, strategic positioning, creative concepts that actually move the needle, and the kind of nuanced judgment that comes from having seen what works (and what fails spectacularly) across dozens of client situations.

Lead with AI, don’t hide from it

Change how you talk about AI with clients. Rather than downplaying it or treating it as a threat to hide, lead with it. 

  • “Yes, AI can generate content, and we use it to do that faster and cheaper than ever. But what AI can’t do is know that your competitors just shifted strategy, or understand why your last three campaigns underperformed despite good metrics, or recognize that your messaging is technically correct but completely misses what your audience actually cares about. That’s what you’re paying us for.”

Rethink pricing models

Hourly billing and retainers based on team size are relics of a world where labor hours correlated to value. They don’t anymore. 

Outcome-based pricing, value-based fees, and performance partnerships align agency incentives with client success, and make the AI efficiency gains work in your favor rather than against you.

Rebuild the talent pipeline

Address the junior talent crisis head-on. The agencies that figure out how to train the next generation of strategists in an AI-enabled world, by pairing them with senior experts on high-level work rather than relegating them to tasks AI now handles, will have a massive competitive advantage in five years when everyone else is scrambling for talent.

Dig deeper: How to work with your SEO agency to drive better results, faster

The old agency model isn’t coming back

The data shows 64% of agencies expect revenue growth over the next 12 months. Whether that optimism is justified depends entirely on whether agencies adapt to the new reality or keep hoping the old model comes back. It won’t.

The squeeze is permanent. But there’s a path through it for agencies willing to fundamentally rethink what they sell and how they deliver it.

Will your agency become indispensable because of how you use AI, or get bypassed entirely because clients realize they can do what you do themselves?

Duplicate website stats appear in Google paid search ads

23 March 2026 at 16:35

A strange pattern has emerged in Google’s paid search results: multiple competing ads display the exact same web statistics, raising questions about a bug or an intentional design shift.

What’s happening. Several paid search ads are showing the same website statistics simultaneously, even though these signals are typically unique to each site. The uniformity makes the data look unreliable, and it’s unclear whether this is a display glitch, a test, or something more deliberate.

Why we care. Trust signals in search ads help users make informed decisions and boost click-through rates by building confidence. If those stats appear identical across competing ads, users may dismiss them as unreliable — undermining the credibility boost you rely on.

What we don’t know.

  • Whether Google is actively testing this or it’s an unintended bug.
  • How widespread the issue is across different search queries or markets.
  • Whether it’s affecting user click behavior or advertiser performance.

No official word. Google hasn’t confirmed or commented on the behavior. Paid media expert and founder Anthony Higman first spotted and flagged the anomaly on LinkedIn.

Bottom line. If trust signals can’t be trusted, they stop serving their purpose. You should watch whether this pattern spreads — or quietly disappears.

Google Ads account suspensions: What advertisers need to know

23 March 2026 at 16:00

Account suspensions are essential to “maintain a healthy and sustainable digital advertising ecosystem, with user protection at its core,” according to Google Ads.

For advertisers, though, navigating the suspension process can be a minefield. Suspensions can happen suddenly, limit what you can do in your account, and, in some cases, affect related accounts as well.

Here’s what triggers account suspensions, the different types you might encounter, and what to do if your account is flagged or suspended.

Why do accounts get suspended?

Accounts get suspended when Google Ads finds a violation of one of its policies. The platform uses a combination of automated systems and manual reviews when detecting violations.

The process involves reviewing the account and other aspects, including your customer reviews, business practices, and website content.

In November 2025, Google addressed concerns that a large volume of accounts were being unfairly suspended by announcing that it had improved the accuracy of its detection system.

Google says that, by using new processes and AI, it’s reduced incorrect suspensions by over 80% and improved resolution times by 70%, with 99% of suspensions now resolved within a 24-hour window.


How Google Ads suspends accounts and what happens next

Depending on the violation, accounts may be suspended immediately upon detection. In other cases, advertisers will be given a prior warning of at least seven days before the suspension takes place.

Advertisers will be notified via email, along with a red banner at the top of their Google Ads account. When an account is suspended:

  • Ads will not run.
  • You won’t be able to create any new content, such as ads, ad groups, or campaigns.
  • You can, however, still access the account to review historical data and reports.

In some instances, accounts related or linked to the suspended account may also be suspended, such as linked Merchant Center accounts or those linked to the same manager account. These will be lifted if or when the original suspension is resolved.

Dig deeper: Google Ads’ three-strikes system: Managing warnings, strikes, and suspension

What are the different types of account suspensions?

Not all suspensions are the same. Google Ads groups them into a few main categories, each with different causes and outcomes.

Policy violations

These suspensions are due to violations of Google Ads policy or its terms and conditions. Common examples include: 

  • Inappropriate or restricted content.
  • Issues related to editorial requirements.
  • Misuse of data. 

Egregious violations

These are suspensions for activity that Google Ads deems unlawful or harmful. They typically reflect the overall practices of a business, not necessarily its campaigns or accounts. As such, the suspension is unlikely to be overturned and will probably be permanent.

Common egregious violations include:

  • Circumventing systems.
  • Unacceptable business practices.
  • Malicious software.
  • Counterfeiting.
  • Illegal activities.

Other suspensions

Other reasons why an account may be suspended include:

  • Suspicious payment activity.
  • Unpaid balance.
  • Promotional code abuse.
  • Unauthorized account activity.
  • Failure to meet age requirements.

What to do if your account is suspended?

What you should do next depends on the type of suspension and what caused it.

Policy violations

If your account has been suspended for policy or terms and conditions violations, you must resolve the issue causing the suspension before submitting an appeal.

The Google Ads help guides contain detailed information on these policies, so make sure you read them thoroughly. Don’t submit an appeal until you’re certain that you’ve made the relevant changes.

For example, if you’ve been suspended for violating editorial requirements, review your ad copy to check for potential issues regarding capitalization, spacing, spelling, and symbols.

If you’re uncertain about the violations that caused the suspension and how to fix them, you can use the account troubleshooter beta to determine what steps need to be taken.

Head over to the Google Ads account suspensions overview page and follow the instructions.

Egregious violations

Egregious violations are treated very seriously. In most cases, the suspension is permanent. However, if you genuinely believe that the suspension is baseless, then you can submit an appeal.

Make relevant changes to your account or business practices before you submit your appeal. This is important because you only get one chance to appeal an egregious violation. Take the time to review your business practices honestly and make sure you’ve done all you can to comply.

Unauthorized account activity

In the case of an “Unauthorized account activity” suspension, Google Ads has detected suspicious activity, and your account has been suspended to protect it.

This may be triggered by recent changes to account access, an unusual increase in your ad spend, or ads sending traffic to unfamiliar destinations.

You will need to:

  • Change your Google account password immediately.
  • Check for any unfamiliar devices signed in to your account.
  • Submit a compromised account form.

Other suspensions

In many of these cases, billing issues cause suspension, so check the billing section of your account. Ensure that billing information is accurate, your payment method is up to date, and recent payments haven’t been declined.

If your account has been suspended for a billing or payment issue, you must fix this within 30 days. You may also be required to complete the advertiser verification program to confirm your identity or business operations.

Verified advertisers show in the Ads Transparency Center, which plays a part in Google’s efforts to build a safe and positive experience.

Best practices for submitting an appeal

While the specific steps you need to take will depend on the type of suspension your account is under and what caused it, there are some best practices for submitting your appeal:

  • Ensure that you’ve submitted your advertiser verification, as this will help the system verify your identity and business authenticity.
  • If you recognize that you’ve made an error, for example, opening a new account for a business when there was already a dormant account created before you joined, be upfront and honest about this information.
  • If you believe that the suspension has been made in error, then provide as much information, evidence, and context as possible.
  • While you’ll have a minimum of six months to submit an appeal, try to resolve the issue and submit your appeal as soon as possible. It can be very tricky to return to an account that was suspended years ago and accurately recall the steps that led to the suspension in order to address them.

Dig deeper: Dealing with Google Ads frustrations: Poor support, suspensions, rising costs

What happens after you submit an appeal

Unfortunately, many advertisers are reporting long wait times to hear back about their appeal. This means that you’ll need to be patient and wait for a response via email.

In the meantime, don’t submit additional appeals. Doing so will not increase the speed at which your appeal is addressed and may result in the suspension of your appeal process for seven days.

If your appeal is accepted and your account is reinstated

You can resume running your campaigns via Google Ads as usual.

Be aware of violating the same policy again in the future. Depending on the type of policy infringement, you may face permanent suspension for repeat violations.

If your appeal is denied

You may be eligible to submit another appeal, but you must make the relevant changes before you do so.

While there is no set limit on the number of appeals you can make, repeated appeals may eventually stop being processed.

For egregious violations

If your appeal is denied and you’re permanently suspended, you’ve been banned from using Google Ads. Creating any new accounts will also result in suspensions.

If you still have funds in your account, you’ll need to cancel your account to receive a refund.

Making sense of Google Ads account suspensions

Account suspensions are designed to help keep advertisers and users safe. They help keep dangerous and malicious activities off the platform, improving the Google Ads experience.

While finding out your account is suspended is frustrating, in most cases, there are steps you can take to resolve the issues behind the violation and have your account reinstated.


The latest jobs in search marketing

20 March 2026 at 22:46
Search marketing jobs

Looking to take the next step in your search marketing career?

Below, you will find the latest SEO, PPC, and digital marketing jobs at brands and agencies. We also include positions from previous weeks that are still open.

Newest SEO Jobs

(Provided to Search Engine Land by SEOjobs.com)

  • The Lead SEO at Rival Digital is a strategic leader responsible for guiding our SEO team, driving organic growth for our home services clients, and evolving our SEO program. This role involves mentoring the team, implementing effective SEO strategies, and integrating SEO best practices across all operations. The Lead SEO will refine our SEO framework, […]
  • Job Description Invivoscribe is an industry pioneer, dedicated to Improving Lives with Precision Diagnostics®. Invivoscribe has been the global leader in driving international standardization of testing and accelerating patient access to the newest and best cancer treatments for over 30 years. Headquartered in sunny San Diego, California with locations across the world, we offer a […]
  • Position Overview As an SEO specialist, you will be responsible for optimizing our home service clients’ portfolios for search engines and driving traffic to their websites. You will work closely with our chief strategist and content team to develop and implement effective SEO strategies that align with clients’ business objectives, increase brand visibility, and improve […]
  • Company Description August Ash, Inc. exists to drive growth and innovation in every partnership by building and supporting complex website and digital marketing strategies. Guided by our core values of Care, Grow Grit, Good Nature, and Clarity, we guarantee honest answers to tough questions. Summary August Ash is seeking a Senior Digital Marketing Strategist to […]
  • Job Description Job Title: Web Designer & Digital Marketing Specialist Location: Phoenix, AZ / Hybrid Job Type: Full-Time Experience Level: Mid-Level (2–5 Years) About FirstLine Road Solutions Founded in 2022, FirstLine Road Solutions has quickly become the partner, employer, and acquirer of choice in the towing and roadside industry. Today, we support 20 independently operated […]
  • Our Snooze Story We are Snooze, the OG brunch leaders who have never stopped flipping the script on breakfast, powered by culinary creativity, unmatched hospitality, and a passion for our communities. Our Snoozers bring their authentic selves to work every day.  This allows us to serve our Guests through genuine care and radical hospitality. Joining Snooze […]
  • About CompoSecure CompoSecure, a GPGI business (NYSE: GPGI), is the leading manufacturer of Premium Metal Payment Cards and also offers best-in-class Authentication and Digital Asset solutions. The Company’s offerings combine elegance, simplicity, and security to deliver exceptional experiences and peace of mind, enabling trust for millions of people around the globe. For more information, please […]
  • Description: This role can sit in our Hayward, CA, Santa Clarita, CA, or Farmington, MI locations. Job Summary We are seeking a strategic and hands-on Digital Marketing Manager to own and run all aspects of our marketing campaigns from planning through execution and optimization. This role will lead our digital presence across paid, owned, and […]
  • Head of Digital Marketing   About the Company Top-tier organization in the consumer services industry Industry Consumer Services Type Privately Held   About the Role The Company is seeking a Head of Digital Marketing to spearhead the development and execution of comprehensive digital marketing strategies. The successful candidate will be tasked with enhancing brand awareness, […]
  • At MERGE, we are Built Different. We are a marketing and technology agency purpose-built for the intersection of health and wellness—where human impact matters most. By weaving storytelling through technology, we move beyond traditional engagement to Whole Human Marketing. This approach recognizes that humans are multidimensional and complex, and uses AI to ensure every brand interaction […]

Newest PPC and paid media jobs

(Provided to Search Engine Land by PPCjobs.com)

  • The Senior Manager, Paid Search will be the primary architect of our Search marketing strategy. This leader will help build a program rooted in incrementality, omnichannel lift, and algorithmic efficiency. They will lead a team responsible for our Search, Shopping, PMAX and App campaigns, integrating MMM insights into tactical execution, and leading data-driven optimizations to […]
  • The Global Paid Media Specialist is responsible for the strategic execution, optimization, and performance scaling of paid digital campaigns across international markets. This role goes beyond campaign management — it owns multi-country activation strategy, localized messaging alignment, budget allocation across regions, and performance optimization across Google, Meta, and additional digital platforms. It works directly with […]
  • Role Overview As our Creative Strategist, you will be the driving force behind the ideation, strategy, and optimization of paid media creatives across platforms (Facebook, Instagram). You’ll collaborate closely with designers, video editors, copywriters, and media buyers to turn insights into creative concepts that drive customer acquisition, loyalty, and brand affinity. This role requires both […]
  • Are you the type of person who immediately checks out a new product after seeing your favorite influencer showcase it on TikTok, Snapchat, or YouTube? Do you love diving into the world of social media advertising, particularly on Instagram? Are you someone who thrives in a role that blends creativity with data‑driven decision‑making? If so, […]
  • At UnitedHealthcare, we’re simplifying the health care experience, creating healthier communities and removing barriers to quality care. The work you do here impacts the lives of millions of people for the better. Come build the health care system of tomorrow, making it more responsive, affordable and optimized. Ready to make a difference? Join us to […]

Other roles you may be interested in

Digital Marketing Manager 10x Health System (Scottsdale, AZ)

  • Salary: $110,000 – $120,000
  • Measure and report on the performance of all digital marketing campaigns against goals (ROI and KPIs).
  • Document and streamline digital marketing processes to scale the team and improve operations.

Paid Ads/Growth Manager, Robert Half (Hybrid, Atlanta Metropolitan Area)

  • Salary: $65,000 – $85,000
  • Manage, optimize, and scale paid campaigns across Google Ads (Search, Display, YouTube) and Meta Ads (Facebook/Instagram).
  • Continuously refine targeting, bidding strategies, and creative to improve CPL, conversion rates, and overall ROAS.

SEO Manager, Clutch (Remote)

  • Salary: $60,000 – $75,000
  • Execute day-to-day SEO tactics across multiple client accounts, ensuring alignment with predefined campaign objectives.
  • Implement optimization strategies, including technical SEO audits and recommendations.

Marketing Manager – SEO & GEO, Care.com (Hybrid, Austin Texas)

  • Salary: $85,000 – $95,000
  • Organic Growth: Build and execute the SEO roadmap across technical, content, and off-page. Own the numbers: traffic, rankings, conversions. No handoffs, no excuses.
  • AI-Optimized Search (AIO): Define and drive CARE.com’s strategy for visibility in AI-generated results — Google AI Overviews, ChatGPT, Perplexity, and whatever comes next. Optimize entity coverage, content structure, and schema to ensure we’re the answer, not just a result.

Digital Marketplace Manager, Venchi (Hybrid, New York, NY)

  • Salary: $120,000 – $130,000
  • Define and execute channel-specific and cross-marketplace strategies, balancing brand positioning, commercial performance, and operational efficiency.
  • Manage Amazon advertising across Sponsored Products, Brands, and Display campaigns.

Advertising Media Manager, Vetoquinol USA (Remote)

  • Salary: $100,000 – $110,000
  • Develop and implement strategic advertising plans for Etail (Ecomm/Retail) accounts.
  • Analyzing advertising performance data with related ROAS & TACoS evaluations.

Programmatic Advertising Manager, We Are Stellar (Remote)

  • Salary: $75,000
  • Manage the day-to-day programmatic campaign approach, execution, trafficking optimization, and reporting across the relevant DSPs for your clients.
  • Build and present directly to client stakeholders programmatic campaign performance, analysis, and insights.

Marketing Manager, Backstage (Remote)

  • Salary: $100,000 – $140,000
  • Manage and optimize campaigns daily across Meta Ads, Google Ads, and other key partners.
  • Own forecasting, pacing, budget allocation, and optimization for high-scale monthly budgets.

Demand Generation Manager, Shoplift (Remote)

  • Salary: $100,000 – $110,000
  • Design and execute inbound-led outbound campaigns—reaching prospects who’ve shown intent (visited pricing page, downloaded resources, engaged with content) at precisely the right moment
  • Build and optimize Apollo sequences, LinkedIn outreach, and multi-touch campaigns that book qualified demos for AEs

Search Engine Optimization Manager, Confidential (Hybrid, Miami-Fort Lauderdale Area)

  • Salary: $75,000 – $105,000
  • Serve as a strategic SEO partner for client accounts, translating business goals into actionable search initiatives
  • Communicate SEO insights, priorities, and performance clearly to clients and internal stakeholders

Note: We update this post weekly. So make sure to bookmark this page and check back.

Google launches Ads DevCast Vodcast for developers

20 March 2026 at 22:28

As AI agents reshape how advertising platforms are used, Google is turning its focus to the developers behind those systems and creating content specifically for them.

What’s happening. Google’s Advertising and Measurement Developer Relations team has launched Ads DevCast, a bi-weekly vodcast and podcast hosted by Cory Liseno. The show focuses on technical deep dives across Google Ads, Google Analytics, Display & Video 360, and related tools.

Zoom out. This is a companion to Ads Decoded, hosted by Google Ads Liaison Ginny Marvin, which focuses on campaign strategy. Ads DevCast is explicitly built for developers and technical practitioners.

Driving the news. Episode 1 — “MCPs, Agents, and Ads. Oh My!” — centers on what Google calls the “agentic shift,” where AI agents are becoming primary users of advertising APIs.

Why we care. Ads DevCast gives developers a direct line to the engineers building Google’s ad tools, which should help them stay ahead of technical changes, discover new capabilities faster, and build more efficient integrations in an increasingly AI-driven ecosystem.

The big picture. AI is expanding who can work with ad tech systems. Google is seeing a shift from a narrow “Ads Developer Community” to a broader “Ads Technical Community,” where marketers can execute technical tasks without full development cycles.

What’s next. Ads DevCast is a pilot, and Google is collecting feedback to shape future episodes.

Bottom line. Google is positioning Ads DevCast as a tool to give developers a front-row seat to Google’s latest ads innovations, with practical insights to build, test, and adapt faster in an AI-first landscape.

Google tightens rules on out-of-stock product pages

20 March 2026 at 21:51

A new Google Merchant Center update changes how e-commerce sites must handle out-of-stock products, with direct implications for product approvals and ad performance.

What’s happening. Google now requires out-of-stock product pages to keep displaying a buy button, but the button can be neither active nor hidden. Instead, it must be visibly disabled and appear grayed out. In other words, users should be able to see the button, but not click it.

This marks a clear shift from common practices where retailers either left the “Add to Cart” button clickable or removed it entirely. Both approaches are now non-compliant.

How it works. In practical terms, the requirement is simple. The buy button must remain on the page, but its functionality needs to be turned off. Typically, this is done by applying a disabled state so the button becomes unclickable and visually subdued.
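In a server-rendered template, that requirement might be sketched like the following. This is a minimal illustration only: the function name, CSS class, and availability strings are assumptions for the example, not Google's specification.

```python
# Hypothetical sketch: render the buy button based on the availability
# value that is also submitted in the product feed.
def buy_button_html(availability: str) -> str:
    """Keep the button visible, but disable it for out-of-stock items."""
    if availability == "out of stock":
        # Visible but unclickable; the disabled state lets CSS gray it out
        return '<button class="buy buy--disabled" disabled>Add to cart</button>'
    # In stock, pre-order, and back order items keep an active button
    return '<button class="buy">Add to cart</button>'
```

The key point is that the same availability value drives both the page and the feed, so the two can't drift out of sync.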

The catch. The button change is only part of the update. Google also expects clear availability messaging on the product page, such as “in stock,” “out of stock,” “pre-order,” or “back order.” This information must match exactly with what is submitted in the product feed.

Any inconsistency between the page and the feed can lead to disapprovals.

The bigger shift. This update removes a long-standing workaround used by many retailers. Previously, it was possible to keep selling out-of-stock products by leaving the purchase button active. That approach is no longer allowed.

If a retailer still wants to accept orders for unavailable items, the product must now be labeled as “back order.” This status needs to be reflected consistently across both the landing page and the feed.

Bottom line. What looks like a small UI requirement is actually a meaningful policy change. Retailers will need to review how they manage out-of-stock products and ensure their pages and feeds are fully aligned to avoid disruptions.

First seen. This update was spotted by a Google Shopping specialist, who shared a how-to video on LinkedIn.

Dig deeper. About landing page requirements

Google Business Profile tests AI-generated replies to reviews

20 March 2026 at 21:35
Google AI reviews

Google is testing AI-generated review replies in Google Business Profile.

Why we care. Responding to reviews can impact conversions and trust. But generic AI replies could be risky and erode trust, especially on negative reviews where authenticity matters most. Response quality matters more than whether a business replies to reviews.

What it looks like. Here’s a screenshot:

The details. Google appears to be rolling out a limited test of Reply to reviews with AI inside Google Business Profile.

  • The feature generates suggested responses to customer reviews.
  • Users can review, edit, and manually submit replies.
  • Availability is inconsistent across accounts and reviews.
  • The feature has been spotted in the U.S., Brazil, and India, but not widely in Europe.

Early behavior. Some users report prompts focused on older, unanswered negative reviews.

  • In at least one test, users could trigger AI responses in bulk.
  • There are conflicting reports on automation — some users say bulk responses still require review; others report fully automated replies can be published without edits.

First seen. The feature was first shared on LinkedIn by Chandan Mishra, a freelance local SEO specialist, and amplified by Darren Shaw, founder of Whitespark.

Google confirms AI headline rewrites test in Search results

20 March 2026 at 20:56
Google rewriting titles

Google is testing AI-generated headline rewrites in Search results, describing it as a small, narrow experiment for now.

What’s happening. Google confirmed to The Verge (subscription required) that it’s testing AI-generated titles in traditional Search results, not just Discover.

  • The test is “small” and “narrow,” and not approved for broader rollout.
  • It impacts news sites but isn’t limited to them.
  • The goal is to better match titles to queries and improve engagement, Google said.

One example showed Google replacing original headlines with shorter or reworded versions, sometimes changing tone or intent (e.g., reducing “I used the ‘cheat on everything’ AI tool and it didn’t help me cheat on anything” to “‘Cheat on everything’ AI tool.”).

Why we care. Google Search is already sending fewer clicks. Now you also have to contend with Google generating entirely new headlines with AI, risking changes to meaning, brand voice, and click-through rates.

Dig deeper. Google changed 76% of title tags in Q1 2025 – Here’s what that means

What they’re saying. Sean Hollister, senior editor at The Verge, wrote:

  • “This is like a bookstore ripping the covers off the books it puts on display and changing their titles. We spend a lot of time trying to write headlines that are true, interesting, fun, and worthy of your attention without resorting to clickbait, but Google seems to believe we don’t have an inherent right to market our own work that way.”

Title links. According to the Google Search Central section on title links, originally published in 2021:

Google’s generation of title links on the Google Search results page is completely automated and takes into account both the content of a page and references to it that appear on the web. The goal of the title link is to best represent and describe each result.

Google said it uses these sources to “automatically determine title links”:

  • Content in <title> elements
  • Main visual title shown on the page
  • Heading elements, such as <h1> elements
  • Content in og:title meta tags
  • Other content that’s large and prominent through the use of style treatments
  • Other text contained in the page
  • Anchor text on the page
  • Text within links that point to the page
  • WebSite structured data

What to watch. Google called this one of many routine experiments, but that’s no guarantee it stays small. The Verge noted a similar “experiment” in Discover later became a full feature.

  • Any future launch may not rely on generative AI, but Google didn’t explain how that would work.

Reaction. After seeing this news, Louisa Frahm, SEO director at ESPN, wrote on LinkedIn:

  • “After 10+ years in news SEO, I’ve come to find that a headline is the most prominent element for attracting readers in timely windows, to provide a targeted synopsis that elevates your brand voice. If that vision gets altered and facts are misrepresented, long-term audience trust will be compromised.”

Could AI eventually make SEO obsolete?

20 March 2026 at 19:00
Could AI eventually make SEO obsolete?

AI won’t make SEO obsolete, but it’ll change how the work gets done. There’s a growing concern that as AI systems improve, they’ll replace the need for human SEO analysis entirely. Early experiments suggest otherwise.

While AI can assist with technical tasks and even generate usable outputs, it still depends heavily on detailed human input, structured data, and technical oversight to produce meaningful results.

The real shift is toward redistribution. AI is accelerating parts of the workflow, raising the bar for execution, and changing where human expertise matters most.

Why AI hasn’t made SEO obsolete

AI aims to reduce the need for semi-technical expertise. Where data is highly structured (e.g., coding a Python script), it has an advantage.

Even then, human expertise is still required. AI can generate scripts, but without detailed instructions and debugging, the output is often unusable.

Generative AI can produce working functions with strong prompts, but it still “thinks” like a machine. That’s why technical practitioners are best positioned to get the most from it.

Technical knowledge is also required for AI-assisted SEO tasks like generating product descriptions or alt text at scale. Even with tools like OpenAI’s API, you still need to transform and structure data into rich, usable prompts — for example, turning Product Information Management data into prompt-ready inputs.

AI depends on human instructions, and output quality reflects input quality. Thinking in structured terms — IDs, classes, and distinct entities — is key to getting reliable results. It’s what makes the output usable.
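As a small illustration of what "prompt-ready inputs" can mean in practice, here is a hypothetical sketch that flattens a structured product record into a detailed prompt. The field names are invented for the example, not a real PIM schema.

```python
# Hypothetical sketch: turning a Product Information Management (PIM)
# record into a structured, detailed prompt for a generative model.
def build_prompt(product: dict) -> str:
    """Flatten a structured product record into a prompt string."""
    # Serialize each attribute as "key: value" so nothing is lost
    attributes = ", ".join(f"{k}: {v}" for k, v in product["attributes"].items())
    return (
        f"Write a product description for '{product['name']}' "
        f"(category: {product['category']}). "
        f"Key attributes: {attributes}. "
        f"Tone: {product.get('tone', 'neutral')}."
    )

prompt = build_prompt({
    "name": "Trail Runner 2",
    "category": "running shoes",
    "attributes": {"weight": "240g", "drop": "6mm"},
})
```

The structure, not the model call, is where the SEO expertise lives: deciding which attributes matter and how to phrase constraints.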

That makes prompt creation a critical skill. Employers should factor in technical expertise when using AI to drive efficiency.

However, don’t celebrate too soon.

As AI evolves and absorbs more information, this advantage may be temporary. For now, AI still depends on human expertise to function — which is why SEO isn’t obsolete.

Where AI struggles without human input

Data is both AI’s strength and weakness.

Early generative AI models relied on curated data within their LLMs. OpenAI’s models couldn’t perform web searches up to and including GPT-4. After GPT-4, AI systems began relying less on internal data and more on web searches for fresh information.

Because the web isn’t curated and contains a lot of misinformation, this initially represented a step backward for most AI tools, including ChatGPT and Gemini. This shift also mirrors how traditional algorithms rely on raw information.

This raises a key question: Is more information always better for AI?

The open web contains both empirical data and subjective opinion, and AI often can’t distinguish between the two. Giving it access to uncurated data has arguably caused more errors and issues in its outputs.

Finding the right balance of data remains a challenge. How much data helps or harms performance, and how much curation is needed? While developers continue refining LLMs and connected systems, users still need to load up prompts with as much detail as possible to offset how AI sources and evaluates information.

These limitations highlight a core issue: without structured input and human judgment, AI struggles to produce reliable SEO insights.

Dig deeper: 6 guiding principles to leverage AI for SEO content production

Why full SEO automation is harder than it sounds

Basic AI tools can assist with SEO tasks, but full automation is far more complex than it sounds.

That said, AI platforms and technologies are evolving rapidly. The first wave of this evolution came as organizations began producing AI agent platforms like Make, N8N, and MindStudio.

These platforms provide a canvas for automating workflows, combining inputs, outputs, and AI-driven decision-making. Used well, they can turn from-scratch content creation into structured editorial processes, with efficiency gains that can be significant.

However, applying this to real-world SEO work is where complexity sets in. A full technical SEO audit pulls from multiple data sources and environments — crawl data, browser-level diagnostics, and desktop tools. 

While parts can be automated, stitching everything together into a reliable, end-to-end workflow is difficult and often requires custom infrastructure, API work, and ongoing maintenance.

Even with platforms like N8N, full end-to-end automation of complex SEO tasks remains challenging. Simpler, checklist-style audits can be automated, but deeper, more technical work often needs to be simplified to fit automation — which isn’t advisable.

In practice, fully automating SEO at depth requires tradeoffs — which is why human expertise is still critical.

Dig deeper: AI agents in SEO: A practical workflow walkthrough

AI tools are advancing — but not replacing SEOs

More recently, there’s been a wave of local AI applications that let you create your own “brain” on a laptop or desktop. These tools are often code editors with support for popular AI models, along with local structures for saving reusable skills, similar to Claude Projects or ChatGPT Custom GPTs.

Tools like Cursor and Claude Code allow you to connect models, generate code, and automate parts of workflows through prompts.

It’s possible to use these technologies to vibecode a system that automates a technical SEO audit. I attempted this. While the capability exists, building a system that matches the depth and quality of a manual audit could take months, especially when handling large volumes of data.

Initial issues included memory limitations, where AI struggled to retain both the data and its detailed instructions. In some cases, outputs were also misweighted — for example, flagging missing H1s as critical despite finding no instances.

These issues could be resolved over time, but they highlight that these tools aren’t automatic shortcuts. Making effective use of them still requires technical expertise, time, testing, and troubleshooting.

They lower the barrier to building AI-driven systems, but they don’t eliminate the need for technical expertise. They simply shift the work.

What would need to change for SEO to become obsolete

For SEO to become obsolete, AI would need to operate independently, reliably, and at scale — without human correction. Generative AI can only act with human input, and it struggles to differentiate between fact and fiction.

Some algorithms have reached their limits in terms of commercial viability. This is arguably why Google is trying to convince us that links are redundant before they truly are.

Consider AI as an evolution of algorithmic output. These systems can attempt to make analytical determinations based on input data. However, the idea that feeding AI more and more data is an unrestricted path to success is already running into significant limitations.

This doesn’t mean technical analysts are entirely safe. Humanity’s ambition for faster, more efficient insights will continue. Initially, AI will be seen as the solution to everything. If one AI falls short, another can critique its results.

However, AI requires significant processing power. The real challenge will be finding the balance between AI and simpler algorithms. Algorithms should handle basic tasks, while AI should be used for analysis and insights.

This balance between AI and algorithmic efficiency is still years — perhaps decades — away. Only then will AI truly test SEO professionals and create the potential for redundancies.

AI’s learning is hindered by the web’s misinformation, providing SEO professionals with temporary insulation. This advantage won’t last forever, but it offers a valuable head start.

Dig deeper: How AI will affect the future of search

AI adoption won’t make SEO obsolete overnight

There are also limitations tied to how society adopts AI. Many technological innovations — like the internet and the calculator — were initially considered “cheating.”

Calculators were banned from exam rooms, and the internet was seen as a shortcut compared to traditional research. Yet those perceptions didn’t last.

Most technologies, despite rapid advancement, aren’t adopted quickly due to cost and social factors. We value human perspective and often resist tools that threaten how we think or work.

The main barrier to AI replacing us is how we perceive it. As long as it’s seen as a threat to our ability to provide, it won’t fully replace human roles. That perception, however, will change over time.

As these technologies become normalized, adoption will follow. Governments will adapt, and expectations around human creativity will continue to evolve.

Algorithms and Google didn’t end human interaction on the web, and AI won’t eliminate contributions from people. In the medium to long term, adaptation is inevitable.

SEO and AI: Technical expertise still matters

  • AI integration with SEO: Contrary to fears, AI won’t make SEO obsolete. Instead, it will reshape how SEO is practiced. AI can automate routine tasks like generating product descriptions and alt text, but its effectiveness still depends on precise, technically sound input.
  • Importance of technical expertise: The ability to craft detailed, technically sound prompts is becoming more valuable. This ensures AI tools are used effectively and reinforces the role of experienced SEO professionals.
  • Data sensitivity in AI performance: AI performance varies significantly depending on the data it processes. Systems using curated datasets behave differently from those relying on open web data. This highlights the importance of data strategy and structured oversight.
  • Evolving roles in SEO: As AI advances, SEO roles are shifting. Professionals are more likely to focus on managing AI systems and refining outputs rather than being replaced by them.
  • Societal acceptance and adaptation: Widespread adoption of AI in SEO depends on how quickly society embraces these tools. As normalization and regulation evolve, so will the role of SEO professionals.
  • Future outlook: Despite AI’s capabilities, the creative, strategic, and complex aspects of SEO still require human insight. The future of SEO is a collaboration between human expertise and machine efficiency.

Dig deeper: How to start an SEO program from scratch in the AI age

Cloudflare CEO: Bots could overtake human web usage by 2027

20 March 2026 at 18:52
AI vs human internet traffic

AI bots could outnumber humans on the web by 2027, according to Cloudflare CEO Matthew Prince, as agent-driven browsing explodes alongside generative AI adoption.

  • Prince made the prediction at SXSW, warning that bots are already reshaping how the internet is used — and how it’s monetized.

Why we care. Search is shifting from human clicks to AI-generated answers. If bots become the web’s primary “users,” you’ll need to reshape your strategy to ensure AI systems can access, trust, and use your content.

The details. Prince said AI agents generate far more web activity than humans because they gather information differently. A person shopping might visit five sites. An AI agent could hit thousands.

  • “If a human were doing a task… you might go to five websites. Your agent… will often go to a thousand times the number of sites.”
  • “So it might go to 5,000 sites. And that’s real traffic, and that’s real load.”

He also noted the web’s baseline is shifting fast.

  • “For a long time, the internet was about 20% bot traffic.”
  • “We suspect that in 2027 the amount of bot traffic online will exceed the amount of human traffic.”

Prince said this growth isn’t spiking like COVID-era traffic. It’s rising steadily with no end in sight.

Between the lines. Prince compared AI to past shifts like mobile and social. The difference: users may no longer visit websites directly. Instead, they rely on AI interfaces that aggregate and answer.

  • “The business model of the internet was… create content, drive traffic, and then sell things… That was the business model.”
  • “That breaks down because… bots don’t click on ads.”
  • “Customers are trusting the output from the helpful robot. They’re not clicking through the footnotes.”

AI sandboxes. AI agents also change how computing works behind the scenes. Prince described a future where “sandboxes” — temporary environments for AI agents — spin up and shut down instantly, potentially millions of times per second.

  • “You can… as easily as you open a new tab in your browser… spin up new code which can then run and service the agents.”
  • “We think that there will be literally millions of times a second these sort of sandboxes… being created… and then torn back down.”

The result: sustained pressure on internet infrastructure.

  • “We’re seeing internet traffic grow and grow and grow. And we don’t see anything that’s going to slow it down or stop it.”

The business impact. Companies are already split on how to respond to AI agents. Prince pointed to diverging strategies across major retailers.

  • “There are three radically different strategies about how they are going to interact with the bots.”

At the core is a bigger risk: losing the customer relationship.

  • “The nature of bots is going to be that it disintermediates the relationship between you and your customer.”
  • “Agents… don’t care about brand.”

For publishers. Prince argued AI could both hurt and help media. While AI reduces direct traffic and breaks ad-based models, AI companies need unique, original data — especially local and hard-to-replicate information — and may pay for it.

  • “Traffic has always been a really bad proxy for value.”
  • “What they actually want is… unique local interesting information they can’t get elsewhere.”

He pointed to local media as an example.

  • “If you don’t have the Park Record, then you don’t get that information.”
  • “We may make more off licensing our content to AI companies than we do off digital advertising.”

For small businesses. Prince was more blunt. AI agents optimize for price, quality and efficiency — not brand loyalty or proximity.

  • “My bot doesn’t care.”
  • “My bot is going to figure out actually who is the best… and route that traffic.”

That could erode traditional advantages.

  • “The shortcuts of trust that small business had in the past… are going to be much more difficult.”
  • “The natural tendency of AI is towards that level of aggregation.”

What to watch. The next phase of the web will hinge on control and compensation. Prince said:

  • “There has to be some exchange of value.”
  • “We’ve got to figure out… what’s going to pay for it.”

Prince said the core question is still unresolved:

  • “What is the future business model of the internet?… I don’t know what it’s going to be, but it’s going to change.”

The SXSW interview. The Internet After Search


SEO’s new battleground: Winning the consensus layer

20 March 2026 at 18:00
SEO's new battleground- Winning the consensus layer

You could be ranking in Position 1 and still be completely invisible.

I know that sounds counterintuitive. But here’s what’s actually happening:

A potential customer opens ChatGPT or Perplexity and asks, “What’s the best [tool/agency/platform] for [your category]?” Your competitor gets mentioned. You don’t. Your No. 1 ranking did absolutely nothing to help you.

This is the new SEO reality, and it’s catching many smart marketers off guard.

LLMs synthesize consensus across multiple sources, rather than relying on a single source. This means you need corroborating mentions distributed across the web. The game has shifted from ranking to consensus, and if you don’t understand that difference, you’re already losing ground.

Let me break down what’s actually happening and, more importantly, what you can do about it.

From rankings to consensus: What changed and why

Traditional SEO had a clear logic: rank high, get clicks, drive traffic. In this retrieval-based system, Google found pages and users chose which ones to visit.

AI-driven search doesn’t work that way. Systems like Google’s AI Overviews, ChatGPT, and Perplexity are now constructing answers. They pull from dozens of sources, identify which claims appear consistently across credible publishers, and synthesize a single response. 

The data backs up just how significant this shift is: organic CTRs for queries featuring AI Overviews have dropped 61% since mid-2024. Even on queries without AI Overviews, organic CTRs fell 41%. Users are simply clicking less, everywhere.

The technical engine behind this is retrieval-augmented generation (RAG). The AI retrieves content from across the web, gathers potentially dozens of sources, identifies the claims that repeat most consistently across credible publishers, and generates a response based on that consensus.

Your goal isn’t just to publish a great page. It’s to be one of those sources. Repeatedly.

What the consensus layer actually is

Think of the consensus layer as the degree to which multiple AI systems produce consistent, repeatable outputs about your brand. It’s about pattern recognition at scale.

When AI systems encounter your brand described the same way across multiple credible sources, in the same category, with the same expertise, and with the same problems you solve, they build confidence. When they don’t see that pattern? You become a statistical outlier, and outliers get filtered out.

This happens because AI systems are engineered to prevent hallucinations. Their primary defense is corroboration: if multiple independent sources say the same thing, the AI assigns higher confidence to that claim. If only one source says it, the AI can become cautious or ignore it entirely.

This creates a rule most marketers haven’t fully internalized yet: isolated authority isn’t enough. You need distributed credibility.

I’ve seen this firsthand. A client ranking first for a competitive keyword, with solid traffic and strong domain authority, was invisible across ChatGPT. Why? Because that page existed in isolation. No corroboration, no distributed mentions, no external validation. 

As Will Scott wrote: “Brands aren’t losing visibility because they dropped from position three to seven. They’re losing it because they were never cited in the AI answer at all.”

Dig deeper: The infinite tail: When search demand moves beyond keywords

The signals that actually build consensus

So what signals do AI systems actually use? Here’s where to focus your energy.

Traditional authority is table stakes, not a finish line

Backlinks, domain authority, and topical depth remain foundational. But they’re no longer sufficient on their own. They get you in the game; consensus is what wins it.

Unlinked brand mentions matter more than most marketers realize

AI systems scan the web for brand references, even when those mentions aren’t linked. Unlinked mentions are growing in importance as signals for both traditional search and AI visibility. A mention in an industry publication with no link is still a consensus signal.

Nearly 9 out of 10 webpages cited by ChatGPT appear outside the top 20 organic results for the same queries, per a Semrush study. This tells you everything you need to know about how different this game is.

Publisher diversity signals broader credibility

Being mentioned repeatedly on the same domain doesn’t build consensus. Being mentioned across a range of credible, independent publishers does.

Diversity tells AI systems your authority isn’t contained to one corner of the web. It’s recognized broadly across your industry.

Community platforms are consensus gold

Reddit, Quora, and niche forums are becoming major consensus signals. AI systems increasingly pull from community discussions because they represent real user opinions and experiences. 

With Reddit dominating the SERPs, positive brand mentions in relevant subreddits contribute meaningfully to how AI systems perceive you. You can’t fake your way into genuine community trust; you have to earn it.

Entity clarity makes retrieval easier

Search engines use knowledge graphs to understand entities and how they relate to each other. If your brand is inconsistently described across platforms or your category is ambiguous, AI systems struggle to incorporate you into their answers. 

Structured data, schema markup, and JSON-LD are critical here. Google has explicitly stated that “structured data is critical for modern search engines.” The clearer your entity profile, the easier it is for AI to retrieve and cite you.
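As one hedged illustration, a minimal Organization entity profile in JSON-LD (schema.org vocabulary) might look like this. The brand name and URLs are placeholders, and real profiles typically carry more properties.

```python
# Minimal Organization JSON-LD sketch (schema.org vocabulary).
# All names and URLs below are placeholders for illustration.
import json

org = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Brand",
    "url": "https://www.example.com",
    "description": "Example Brand builds analytics software.",
    # sameAs links tie the entity to its profiles elsewhere on the web
    "sameAs": [
        "https://www.linkedin.com/company/example-brand",
        "https://en.wikipedia.org/wiki/Example_Brand",
    ],
}

json_ld = json.dumps(org, indent=2)
# Embedded on the page inside <script type="application/ld+json">...</script>
```

Consistent `name`, `description`, and `sameAs` values across pages are what give retrieval systems an unambiguous entity to latch onto.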

How to actually build consensus

Alright, let’s get tactical. Before you start building, you need to know where you stand.

Start with an LLM audit

Open ChatGPT, Perplexity, Gemini, and Google AI Overviews, and start asking questions the way your customers would. 

  • “What’s the best [tool/service] for [problem you solve]?” 
  • “Who are the leading [your category] providers?” 
  • “What do people say about [your brand name]?”

Pay attention to three things: 

  • Is your brand mentioned at all? 
  • If it is, is the information accurate and up to date? 
  • How are you being described relative to competitors? 

You may find outdated information, missing context, or, worse, a competitor owning the narrative in your category entirely.

This audit becomes your baseline. It tells you what gaps to close, what misinformation to correct, and where your consensus footprint is weakest. Only once you know that should you start building.

Establish your owned media foundation

Your site needs to be technically sound and semantically clear. Use structured data. Establish explicit entity definitions: who you are, what you do, and what problems you solve. Reinforce those same entities and relationships across multiple pages within your site.

Topic clusters (pillar pages supported by related subtopic content) create semantic reinforcement that signals depth and expertise. Without a strong foundation, nothing else sticks.

Treat earned media as consensus amplification

Press coverage, guest posts, podcast appearances, and expert citations distribute your authority across the web. More than links, digital PR is now about narrative control. 

One placement won’t move the needle. A sustained, coordinated presence across trusted publications will. Monitor your brand-to-links ratio; pursuing unlinked mentions alongside traditional link building is now the balanced strategy.

Publish original research

This is the highest-leverage consensus tactic most brands are underinvesting in. When you create genuinely novel data, such as an industry benchmark, a proprietary survey, or original research, other publishers reference it naturally, journalists cite it, and AI systems incorporate it into answers. Establish yourself as the source for benchmark data in your niche, and you’ll earn citations for years.

Invest in expert-led content

AI systems are trained on vast amounts of text, including articles, research, and interviews. When your team members are consistently positioned as recognized experts, quoted in articles, cited in reports, and contributing bylined pieces, they become recognized entities that AI systems trust. Optimize author profiles with structured data, consistent bylines, and entity markup to reinforce this.

Participate genuinely in communities

This doesn’t mean dropping links in Reddit threads. It means answering questions, contributing knowledge, and building a reputation where your audience already hangs out. 

When users recommend your brand organically because they find it genuinely valuable, that’s your strongest consensus signal.

Dig deeper: Why surface-level SEO tactics won’t build lasting AI search visibility

Measuring what actually matters now

Traditional rankings tell you where you stand in search results. They don’t tell you whether AI systems are citing you. You need new metrics, and as more SEOs are recognizing, success metrics are shifting from clicks and traffic to visibility and share of voice.

Start by systematically testing high-value queries across Google AI Overviews, ChatGPT, Perplexity, and Gemini. Note when your brand appears, how it’s described, and which sources get cited alongside you. 

Track share of voice in AI responses, how often your brand gets mentioned relative to competitors in AI-generated answers. If competitors are consistently appearing and you’re not, you’re losing the consensus battle regardless of how your rankings look.

Also monitor cross-domain mention density (how many unique domains reference your brand) and entity co-occurrence (how often your brand appears alongside relevant topics, competitors, and concepts). These give you a real picture of your consensus footprint and where the gaps are.
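These two metrics can be computed with a few lines of code once you have collected mention data. The sketch below is illustrative; the mention data and its shape (domain, co-occurring terms) are assumptions for the example.

```python
# Sketch of two consensus metrics from a list of brand mentions.
# Each mention is (domain, terms co-occurring with the brand); invented data.
from collections import Counter
from itertools import chain

mentions = [
    ("techblog.com", ["crm", "competitor-y"]),
    ("reddit.com",   ["crm"]),
    ("news.example", ["analytics", "competitor-y"]),
    ("techblog.com", ["crm"]),
]

# Cross-domain mention density: how many unique domains reference the brand
mention_density = len({domain for domain, _ in mentions})

# Entity co-occurrence: how often the brand appears alongside each term
co_occurrence = Counter(chain.from_iterable(terms for _, terms in mentions))
```

Tracked over time, rising density and stable co-occurrence with your category terms indicate a strengthening consensus footprint; flat numbers point to the gaps.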

The new SEO playbook

The brands winning in AI-driven search aren’t necessarily the ones with the best content or the highest domain authority. They’re the ones building distributed credibility, authority that appears consistently across owned media, earned media, and community platforms.

As Google’s Danny Sullivan said, “Good SEO is good GEO.” The fundamentals haven’t disappeared, but they’re now table stakes, not differentiators. The new formula is: authority + consensus + distribution.

Integrate SEO, digital PR, and community engagement into one cohesive strategy. Build a distributed network of authority (mentions, citations, and community validation) that takes time to construct and is nearly impossible for competitors to dismantle overnight.

That’s the visibility moat worth building, and the clock is ticking.

Dig deeper: Content alone isn’t enough: Why SEO now requires distribution

Adobe to shut down Marketo Engage SEO tool

20 March 2026 at 17:30

Adobe will shut down the SEO feature in Marketo Engage at the end of March 2026, according to its February 2026 release notes.

The tool will be deprecated on March 31, and you must export any existing SEO data before then. (Adobe’s release notes include links to the export instructions.) The SEO tile will be removed from the platform on April 1.

What happened?

Adobe’s Keith Gluck said deprecating low-use features lets the Marketo Engage team focus on other areas of the platform. For your SEO needs, Adobe announced in 2025 that it was acquiring Semrush, a full-featured SEO and visibility tool. (Reminder: Semrush owns Third Door Media, the publisher of Search Engine Land.)

The deprecation came as no surprise if you follow Marketo news closely. Reports suggest few people fully configured the SEO tool, and its features didn’t seem to be a priority for the Marketo Engage product team in recent years.

With LLMs rapidly changing the search landscape, it was time to say goodbye. The arrival of Semrush into the Adobe family provided the perfect opportunity.

Why your law firm’s best leads don’t convert after research

20 March 2026 at 17:00

If your law firm’s referrals aren’t converting, validation may be the problem.

Referred prospects don’t go straight from recommendation to contact. They research, compare, and verify what they were told — on your website, in search results, and through AI tools.

These are your highest-value leads — pre-sold through trusted recommendations and expected to be your easiest conversions. But when that validation falls short, even they lose momentum. 

This is the referral validation gap: the moments during online research when trust is broken rather than built. Here’s where referral validation fails and how to fix it.

While this article focuses on law firms, the same dynamics apply to any referral-based business.

The four types of referral validation failure

Referral loss follows predictable patterns — and once you can spot them, you can fix them.

  • Credibility gaps: When your digital presence doesn’t match the expectations set by the referral.
  • Specificity gaps: When your content doesn’t reflect the specific problem the prospect was referred for.
  • Authority gaps: When third-party or AI validation fails to confirm your expertise.
  • Friction gaps: When prospects are ready to act but encounter unnecessary barriers to conversion.

1. Credibility gaps

In under three seconds, a website visitor forms a first impression. If your site doesn’t immediately validate what the referrer said about you — if it looks outdated, generic, or fails to showcase the specific expertise they praised — that trust becomes conditional.

A referred prospect arrives expecting professionalism, confidence, and authority, only to encounter uncertainty. Thin attorney bios, generic claims (“experienced,” “trusted,” “results-driven”) without proof, or outdated design can all create hesitation.

The referral earned you consideration. Your digital presence determines what happens next.

The prospect’s reaction is simple: This doesn’t look like what I was expecting. That moment of doubt is often enough to end the process.

What you can do about it

Implement practice area-specific landing pages with targeted H1s, schema markup for your specialties, and prominent visual trust signals (credentials, case results, awards) above the fold. Ensure mobile page speed stays under two seconds with Core Web Vitals optimization.

2. Specificity gaps 

Referrals are almost always problem-specific. The website they’re referred to rarely is.

Imagine a prospect referred for a complex custody dispute lands on a homepage about “family law.” A business owner referred for a ground lease negotiation sees “commercial real estate services.”

Nothing is technically wrong. But nothing confirms the recommendation. When a site fails to mirror the exact issue that prompted the referral, the prospect starts to question it: Does this firm actually specialize in my problem, or was the referral overstated?

At the same time, prospects are actively looking for proof — case results, credentials, relevant experience. If that evidence is buried, disconnected, or requires more than two clicks to find, momentum drops quickly.

What you can do about it

Create practice area-specific case study pages with structured data markup. Implement FAQ schema tied to common referral scenarios. Ensure content directly reflects the search intent behind the referral, and use internal linking to guide visitors from homepage → specific expertise → proof points within two clicks.

3. Authority gaps

Referral prospects are asking questions like: “Is this firm actually good at complex custody cases?” or “Do they have experience with ground lease negotiations in New York?” — increasingly through AI search tools.

If AI tools can’t find credible, structured information on your site to validate the referral, they won’t confirm it. And if competitors provide clearer answers, those are the sources AI will surface. This creates an immediate form of negative validation. The prospect starts to question the recommendation: If they’re so good, why aren’t they showing up here?

If a competitor has invested in content that’s structured for citation, the AI will quote them, reference their work, and position them as the authority, even though the prospect came to you through a trusted referral. You can’t claim authority. AI systems will either confirm or contradict it.

What to do about it

Forward-thinking firms are now monitoring a new metric: AI search share of voice — the percentage of relevant AI-generated answers that mention or cite your firm compared to competitors. Start by:

  • Identifying the 10-15 questions prospects most commonly ask about your practice areas.
  • Running those queries regularly through ChatGPT, Perplexity, and Google AI Overviews.
  • Documenting which firms appear, how often, and in what context.
  • Tracking whether you’re cited as a source, mentioned, or absent entirely.
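Once those queries are logged, the share-of-voice calculation itself is trivial. The sketch below assumes a manually maintained log of which firms each AI answer mentioned; the query wording and firm names are placeholders:

```python
# Hypothetical log of monthly test queries: for each query, which firms
# were mentioned in the AI-generated answer.
query_log = [
    {"query": "best custody lawyer in Austin",        "firms_mentioned": ["YourFirm", "FirmB"]},
    {"query": "ground lease negotiation attorney",    "firms_mentioned": ["FirmB"]},
    {"query": "how to contest a will in Texas",       "firms_mentioned": ["YourFirm", "FirmC"]},
    {"query": "estate planning attorney near me",     "firms_mentioned": []},
]

def share_of_voice(log, firm):
    """Percentage of tracked AI answers that mention the firm."""
    if not log:
        return 0.0
    hits = sum(1 for entry in log if firm in entry["firms_mentioned"])
    return 100.0 * hits / len(log)

print(f"YourFirm: {share_of_voice(query_log, 'YourFirm'):.0f}%")  # 50%
print(f"FirmB:    {share_of_voice(query_log, 'FirmB'):.0f}%")     # 50%
```

The point of keeping a structured log rather than spot-checking is that it makes month-over-month movement visible: a firm that drops from 50% to 25% share of voice has a content problem long before referral conversions show it.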

If your firm’s content, credentials, and case results aren’t structured for AI parsing and citation, you’re invisible in these crucial validation moments regardless of how strong the initial referral was. Once you’ve identified where your competitors are outperforming you, create in-depth topic clusters around your specialties, and build authoritative content that answers the questions prospects ask AI tools. 

4. Friction gaps

Friction gaps occur after trust has already been established, but conversion still hasn’t happened. Common examples include:

  • No obvious next step above the fold.
  • Forms that are difficult to complete on mobile.
  • No immediate way to call, text, or book.

At this stage, prospects are ready to act. But any delay introduces doubt and gives them time to reconsider or move on. You’ve earned the referral. Your site validated your expertise. The prospect is ready to hire you — but can’t quickly figure out how to take the next step.

This is the final failure point in the referral validation gap: when a motivated, pre-sold prospect abandons because the conversion path is unclear, inconvenient, or unnecessarily complicated. You need to remove every obstacle between “I want to hire this firm” and “I’ve made contact.”

What to do about it

A referred prospect should be able to answer these questions within three seconds of landing on any page:

  • How do I contact this firm right now?
  • What happens when I do?
  • Is this going to be easy or painful?

Test it yourself: open your site on your phone and start a timer. Can you initiate contact within a few seconds without scrolling? Try it from a homepage, attorney bio, and practice area page. If the answer is no, you’re losing prospects at the finish line.

Your roadmap to close the referral validation gap

Closing the referral validation gap doesn’t require a complete digital overhaul on day one. Strategic, phased implementation will allow you to see quick wins while building toward comprehensive optimization. Let’s look at the steps you can take.

Quick wins: Remove immediate friction

These are some changes that require minimal investment but can immediately reduce referral abandonment:

  • Adding a prominent click-to-call button in the mobile header (and ensuring it’s visible without scrolling).
  • Testing form completion on mobile devices and cutting fields down to the essentials.
  • Ensuring page load speed stays under two seconds on mobile (test via PageSpeed Insights).
  • Verifying that “Contact Us” is visible on every page without scrolling.
  • Adding a secondary CTA option (for example, many prospects prefer “Schedule Consultation” over “Contact”).
  • Testing that your firm’s phone number is clickable on mobile across the entire site.

Medium-term: Build validation infrastructure

These initiatives require more investment but, over time, can generate a sustainable competitive advantage:

  • Creating dedicated landing pages for each significant practice area.
  • Structuring each page with: a specific H1 tag, a detailed service description, any relevant credentials, relevant case results, an FAQ section, and a clear CTA.
  • Implementing schema markup (e.g., LegalService, Attorney, and FAQPage) on each landing page.
  • Building out an internal linking strategy that guides visitors from homepage → specific expertise → proof points in two clicks maximum.
  • Developing 3-5 detailed case studies per practice area (these can be anonymized where required).
  • Writing blog posts that address the specific questions prospects ask during the research phase.
  • Ensuring all content includes author attribution with credentials to build E-E-A-T signals.
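As one sketch of the schema markup step above, an FAQPage payload can be generated as JSON-LD for embedding in a `<script type="application/ld+json">` tag on the landing page. The questions, answers, and wording here are placeholders, not recommended copy:

```python
import json

# Hypothetical FAQ content for a practice-area landing page.
faqs = [
    ("How long does a custody case take?",
     "Most contested custody cases take six to twelve months."),
    ("Do you offer free consultations?",
     "Yes, we offer a free 30-minute initial consultation."),
]

# Build a schema.org FAQPage object from the question/answer pairs.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

print(json.dumps(faq_schema, indent=2))
```

The same pattern applies to the LegalService and Attorney types: keep the structured data generated from the same source as the visible page content, so the markup never drifts out of sync with what prospects actually read.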

Long-term: Dominate AI search validation

These strategic initiatives can position your firm for sustained advantage in an AI-driven search environment:

  • Creating entity-based content that AI models can parse and cite (e.g., detailed attorney bios, practice area guides, or legal topic explanations).
  • Developing topic clusters: pillar pages for major practice areas with supporting cluster content that addresses related queries.
  • Optimizing content for the natural language queries that prospects ask AI tools.
  • Building citation-worthy resources such as comprehensive guides, state-specific legal explanations, and process walkthroughs.
  • Identifying 15-20 high-value queries prospects use to validate referrals.
  • Monitoring how your firm appears in ChatGPT, Perplexity, and Google AI Overview responses monthly.
  • Tracking competitor mentions and citation patterns.
  • Adjusting content strategy based on AI search visibility gaps.

But, most importantly, don’t let this roadmap overwhelm you. The firms that successfully close the referral validation gap don’t do it by accomplishing everything all at once. Instead, they start with a single, crucial decision: acknowledging that the gap exists. And then they take the first step to fix it.

Once you accept that your best leads are researching you — on your website and through AI tools — and making judgments based on what they find (or don’t find), your path forward for fixing that gap will become clear.

2026 is your firm’s inflection point

Prospects are getting their answers without ever visiting your website. The gap between digital presence and digital authority is widening — and for firms that wait, it becomes unbridgeable.

Closing the referral validation gap isn’t just about improving conversion rates. It means:

  • Capitalizing on your highest-value leads.
  • Reducing customer acquisition costs.
  • Building a compounding advantage.
  • Creating momentum in an AI-driven search environment.

Firms that master this will pull ahead. Those that don’t will watch their best leads slip away — one validation failure at a time.

A referral gets you consideration. Your digital presence determines what happens next. Closing the referral validation gap turns trust into conversion.
