Google PMax gets new exclusions, expanded reporting features

Ads control dashboard

Google is launching new Performance Max controls and reporting: audience exclusions, expanded reporting, and budget forecasting tools.

What’s new. Google announced a mix of “steering updates” and “actionable insights” for PMax:

  • First-party audience exclusions: You can exclude customer lists to shift spend toward net-new customer acquisition instead of repeat conversions.
  • Budget reporting: A new in-platform report projects end-of-month spend and shows how daily budget changes impact performance.
  • Full audience reporting: You get detailed breakdowns by demographics, including age and gender.
  • Network segmentation: You can segment placement reports by network, now under When and where ads showed.

Why we care. These updates help address concerns about PMax’s lack of control and transparency. Exclusions help you avoid wasting spend on existing customers, while improved reporting gives you clearer signals for optimization, budgeting, and brand safety decisions.

Google’s announcement. New Performance Max steering and reporting updates coming in 2026

Automated traffic is growing 8x faster than human traffic: Report

Human vs AI traffic

Automated traffic grew 23.5% year over year in 2025 — about eight times faster than human traffic, which rose 3.1%, according to HUMAN Security’s State of AI Traffic report.

  • AI-driven traffic appears to be a major contributor to that growth, with average monthly volume increasing 187% year over year, while traffic from AI agents and agentic browsers (e.g., OpenAI’s Atlas, Perplexity’s Comet) grew nearly 8,000% year over year.
  • Automated traffic is defined in the report as: “All internet traffic generated by software systems rather than human users, including traditional automation such as search engine crawlers, monitoring bots, and conventional scraping tools, as well as AI-driven traffic.”
  • This report follows Cloudflare CEO Matthew Prince’s prediction that bots could overtake human web usage by 2027.

Why we care. Search is increasingly shaped by more than human queries, crawling, and indexing. AI agents now participate in discovery, comparison, and transactions — within Google’s evolving results and across AI-driven interfaces.

The details. HUMAN groups AI-driven traffic into three broad categories:

  • Training crawlers collecting data for models. They still dominate at 67.5% of AI traffic, but their share is declining as scrapers and agents scale.
  • Real-time scrapers that feed AI search and answers. Scraper traffic grew nearly 600% in 2025, driven by AI-powered search and real-time answer engines.
  • Agentic AI systems that execute tasks autonomously. Smaller in share, but growing fastest and most disruptive.

AI agents behave more like users. These systems aren’t limited to reading content. They increasingly navigate funnels, log in, and transact. In 2025:

  • 77% of observed agent activity (requests) occurred on product and search pages.
  • Nearly 9% touched account-level interactions.
  • More than 2% reached checkout flows.

About the data. HUMAN analyzed more than one quadrillion interactions (requests/events) across its customer base in 2025, with aggregated, anonymized data from 2022 to 2025. It classified AI-driven traffic into training crawlers, AI scrapers, and agentic AI using user-agent strings, infrastructure signals, and observed behavior, noting limits in self-declared bot identity, which may undercount or misclassify some AI-driven activity.
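
The user-agent portion of that classification can be sketched in a few lines. This is a hedged illustration, not HUMAN's actual classifier: the token lists below are examples of real, publicly documented bot tokens mapped to the report's three categories, and a production system would also use the infrastructure and behavioral signals the report describes.

```python
# Illustrative sketch: bucket requests into the report's three AI-traffic
# categories by user-agent token. Token lists are examples, not HUMAN
# Security's methodology, which also weighs infrastructure signals and
# observed behavior.

TRAINING_CRAWLERS = {"GPTBot", "CCBot", "Google-Extended"}
AI_SCRAPERS = {"OAI-SearchBot", "PerplexityBot"}
AGENTIC_AI = {"ChatGPT-User", "Google-Agent"}

def classify_ai_traffic(user_agent: str) -> str:
    """Return a coarse AI-traffic category for a user-agent string."""
    if any(token in user_agent for token in TRAINING_CRAWLERS):
        return "training_crawler"
    if any(token in user_agent for token in AI_SCRAPERS):
        return "ai_scraper"
    if any(token in user_agent for token in AGENTIC_AI):
        return "agentic_ai"
    return "other"
```

As the report notes, self-declared identity is the weak link here: a bot that omits or spoofs its token lands in "other," which is why behavioral signals matter.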

Bottom line. Traffic is becoming less purely human, and discovery is no longer confined to search engines. Optimization now means deciding which machines can access, interpret, and act on your content.

The report. The 2026 State of AI Traffic & Cyberthreat Benchmark Report

Google-Agent user agent identifies AI agent traffic in server logs

Google-Agent

Google introduced a new user agent, called Google-Agent, that signals when AI agents act on users’ behalf, marking an early shift toward agent-driven web interactions.

What happened. Google added Google-Agent to its list of user-triggered fetchers on March 20 and has begun a gradual rollout.

  • The Google-Agent user agent identifies requests made by AI agents running on Google infrastructure, including experimental tools like Project Mariner.

How it works. Google-Agent appears in HTTP requests when an AI agent visits a site to complete a user-initiated task.

  • Example use cases include browsing pages, evaluating content, or taking actions such as submitting forms.
  • This differs from Googlebot and other crawlers, which run continuously in the background without direct user prompts.

User agent strings. Google shared the user agent string for its desktop agent:

Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Google-Agent; +https://developers.google.com/crawling/docs/crawlers-fetchers/google-agent) Chrome/W.X.Y.Z Safari/537.36

And the user agent string for its mobile agent:

Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; Google-Agent; +https://developers.google.com/crawling/docs/crawlers-fetchers/google-agent)

Why we care. This lets you identify agent-driven traffic in server logs. You can now distinguish traditional crawl activity from visits triggered by real users through AI agents. That should help you track agent-assisted conversions, understand emerging user behavior, and prepare for agentic search.

What they’re saying. According to Google’s announcement:

  • “The Google-Agent user agent is rolling out over the next few weeks, and will be used by Google agents hosted on Google infrastructure to navigate the web and perform actions upon user request.”

What to watch. Early volumes will be low as the rollout continues, but now is the time to establish a baseline. What to do:

  • Monitor logs for Google-Agent activity.
  • Make sure CDNs and WAFs aren’t blocking the published IP ranges.
  • Validate that key site actions, including forms and flows, work for automated agents.
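
The first step above, monitoring logs, can be sketched as a small script. This is a minimal example assuming an Apache/Nginx combined log format; the log path, format, and field positions are assumptions you should adjust for your stack.

```python
# Hedged sketch: scan access-log lines (combined log format assumed) for
# requests whose user-agent field carries the Google-Agent token, and
# count hits per path to establish an early baseline.
import re
from collections import Counter

UA_TOKEN = "Google-Agent"
# Capture: client IP, request path, quoted user-agent field.
LINE_RE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:\S+) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

def google_agent_hits(log_lines):
    """Return per-path counts of requests with the Google-Agent token."""
    hits = Counter()
    for line in log_lines:
        m = LINE_RE.match(line)
        if m and UA_TOKEN in m.group(3):
            hits[m.group(2)] += 1
    return hits
```

Because user agents can be spoofed, treat token matches as a starting point and verify suspicious traffic against Google's published IP ranges before trusting it.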

Dig deeper. Google’s releasing Google-Agent: Here’s what to know

SMX Now: Learn how brands must adapt for AI-driven search

AI Search Picks Winners: Here's the GEO Strategy Behind It

Visibility is no longer just about ranking. It depends on whether your content is discovered, evaluated, and selected in AI-driven search experiences.

We’re kicking off our new monthly SMX Now webinar series on April 1 at 1 p.m. ET with iPullRank’s Zach Chahalis, Patrick Schofield, and Garrett Sussman on how you must adapt.

The session introduces iPullRank’s Relevance Engineering (r19g) framework for executing Generative Engine Optimization (GEO) through an omnichannel content strategy. You’ll learn how AI search uses query fan-outs to discover and select sources, and how to structure content so it’s retrieved, surfaced, and cited.

It also emphasizes that GEO success isn’t universal. It requires testing, tailored strategies, and a three-tier measurement model spanning discovery, selection, and citation impact.

Save your spot

Search Engine Land is proud to be a media partner for iPullRank’s upcoming SEO Week event.

Report: Clickout Media turned news sites into AI gambling hubs

Parasite SEO

A company called Clickout Media is being called out for buying trusted news and niche sites, replacing them with AI-generated gambling content, and abandoning them after Google penalties. Some call this “parasite SEO,” but to me it sounds more like large-scale search spam.

What’s happening. The company acquired sports, gaming, and tech sites, then rapidly shifted them from editorial coverage to casino and crypto content, PressGazette reported.

  • Sites were stripped of original reporting, filled with AI-written articles, and used to push offshore gambling links, according to former employees.

How it works. The strategy relies on buying domains with existing authority, then exploiting their ability to rank in Google. Content typically followed a pattern:

  • Legitimate coverage continues briefly to preserve credibility
  • Gambling content is introduced and scaled
  • AI-generated articles and fake author profiles replace human writers
  • Revenue comes from affiliate deals with casino operators, sometimes tied to player losses

The impact. Several previously active publications now appear deindexed, with layoffs and closures following. In some cases, even charity websites were repurposed to host gambling content.

What they’re saying. Google prohibits publishing content at scale for the primary purpose of manipulating rankings. It refers to extreme cases like this as “site reputation abuse,” a violation that can trigger manual actions and removal from Google’s index and search results.

  • “While we aren’t able to comment on a specific site’s ranking on Search, our policies prohibit publishing content at scale for the primary purpose of manipulating search rankings,” Google said about this case.

Why we care. This isn’t SEO in any meaningful sense. It’s reputation abuse designed to game rankings at scale.

The report. The SEO parasites buying, exploiting and ultimately killing online newsbrands by Rob Waugh at PressGazette.

Google updates structured data for forum and Q&A content

Q&A forum content cards

Google expanded its structured data support for forum and Q&A pages, adding properties that help you signal reply threads, quoted content, and whether content is human- or machine-generated. The update aims to reduce how Google misreads discussion and Q&A content.

What changed. Google’s QAPage docs now support commentCount and digitalSourceType. DiscussionForumPosting docs now support sharedContent plus the same commentCount and digitalSourceType.

The details. In Q&A markup, you can use commentCount on questions, answers, and comments to show total comments even if not fully marked up. answerCount + commentCount should equal total replies of any type.

How it works. digitalSourceType lets you flag whether content comes from a trained model or simpler automation. Use TrainedAlgorithmicMediaDigitalSource for LLM-style output and AlgorithmicMediaDigitalSource for simpler bots. If omitted, Google assumes human-generated content.

What’s new for forums. sharedContent lets you mark the primary item shared in a post. Google accepts WebPage, ImageObject, VideoObject, and referenced DiscussionForumPosting or Comment, including quotes or reposts.
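
Putting the new properties together, a forum post's markup might look like the sketch below. All field values are illustrative, and the exact expected value format for digitalSourceType should be checked against Google's documentation before deploying.

```python
# Hedged sketch: JSON-LD for a forum post using the newly supported
# properties (sharedContent, commentCount, digitalSourceType).
# Values are illustrative, not a complete or validated markup example.
import json

post = {
    "@context": "https://schema.org",
    "@type": "DiscussionForumPosting",
    "headline": "Example thread title",
    "author": {"@type": "Person", "name": "example_user"},
    "datePublished": "2026-03-24T09:00:00+00:00",
    # New: the primary item this post shares (a page, image, video,
    # or a quoted/reposted DiscussionForumPosting or Comment).
    "sharedContent": {
        "@type": "WebPage",
        "url": "https://example.com/shared-article",
    },
    # New: total replies of any type, even if not all are marked up.
    "commentCount": 12,
    # New: omit for human-written posts (Google then assumes human-
    # generated); this flags LLM-style output. Verify the exact value
    # format in Google's docs.
    "digitalSourceType": "TrainedAlgorithmicMediaDigitalSource",
}

markup = json.dumps(post, indent=2)
```

On a QAPage, the same commentCount property would go on questions, answers, and comments, with answerCount plus commentCount summing to the total replies of any type.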

Why we care. This gives you more precise control over how Google reads modern community content — especially forum-heavy sites, support communities, UGC platforms, and Q&A sections. Google can better distinguish answers from comments, count partial threads across pagination, and identify when a post mainly shares a link, image, video, or quoted reply.

The documentation. Google updated its structured data documentation on March 24.

AI citations favor listicles, articles, product pages: Study

AI citation engine

AI search citations favor a small set of formats. Listicles, articles, and product pages drive over half of all mentions across major LLMs, according to new Wix Studio AI Search Lab research analyzing 75,000 AI answers and more than 1 million citations across ChatGPT, Google AI Mode, and Perplexity.

The findings. Listicles led at 21.9% of citations, followed by articles (16.7%) and product pages (13.7%). Together, these three formats made up 52% of all AI citations.

  • Articles dominated informational queries, cited 2.7x more than other formats.
  • Listicles captured 40% of commercial-intent citations, nearly double any other type.

Why intent wins. Query intent — not industry or model — most strongly predicts which content gets cited. This pattern held across industries, from SaaS to health.

  • Informational queries skewed heavily toward articles (45.5%) and listicles (21.7%).
  • Commercial queries were led by listicles (40.9%).
  • Transactional and navigational queries favored product and category pages (around 40% combined).

Why we care. This research indicates that you want to map content types to user goals rather than just creating more content. Articles educate, listicles drive comparison, and product pages convert. Aligning content format with user intent could help you capture more AI citations and increase visibility.

Not all listicles perform equally. Third-party listicles accounted for 80.9% of citations in professional services, compared to 19.1% for self-promotional lists. That seems to indicate LLMs prefer neutral, editorial comparisons over brand-led rankings.

Model differences. All models favored listicles, but diverged after that.

  • ChatGPT leaned heavily into articles and informational content.
  • Google AI Mode showed the most balanced distribution.
  • Perplexity stood out, with 17% of citations coming from discussions like Reddit and forums.

Industry patterns. Content preferences shifted slightly by vertical:

  • SaaS and professional services over-indexed on listicles.
  • Health favored authoritative articles.
  • Ecommerce spread citations across listicles, articles, and category pages.
  • Home repair showed the most even distribution across formats.

The research. The content types most cited by LLMs
