

Google previews WebMCP, a new protocol for AI agent interactions

11 February 2026 at 20:50

Google today announced an early preview of WebMCP, a new protocol that defines how AI agents interact with websites.

  • “WebMCP aims to provide a standard way for exposing structured tools, ensuring AI agents can perform actions on your site with increased speed, reliability, and precision,” wrote André Cipriani Bandarra from Google.

WebMCP lets developers tell large language models exactly what each button or link on a website does: rather than leaving agents to infer this from the page, a website can explicitly publish a clear “Tool Contract” that defines its available actions.

It runs on a new browser API, navigator.modelContext. Through that API, the website shares a structured list of tools — such as buyTicket(destination, date). The AI can then call those functions directly, making interactions faster, more accurate, and far more reliable.
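To make that concrete, here is a minimal sketch of what registering such a tool could look like on a page. WebMCP is still an early preview, so treat the details as assumptions: the registerTool() method name, the JSON-schema-style input definition, and the /api/bookings endpoint below are illustrative, not confirmed WebMCP syntax.

// Hypothetical sketch of exposing a buyTicket tool through navigator.modelContext.
// Method names, the schema format, and the /api/bookings endpoint are assumptions
// for illustration; check Google's WebMCP preview docs for the actual API surface.

type BuyTicketInput = { destination: string; date: string };

// navigator.modelContext is experimental, so the DOM type definitions don't know it yet.
const modelContext = (navigator as any).modelContext;

modelContext?.registerTool({
  name: "buyTicket",
  description: "Book a ticket to a destination on a given date.",
  inputSchema: {
    type: "object",
    properties: {
      destination: { type: "string" },
      date: { type: "string", format: "date" },
    },
    required: ["destination", "date"],
  },
  // The agent calls this handler directly instead of clicking through the UI.
  async execute({ destination, date }: BuyTicketInput) {
    const response = await fetch("/api/bookings", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ destination, date }),
    });
    return response.json();
  },
});

The point of a structured definition like this is that the agent never has to guess which button triggers the booking; it calls the published function with typed arguments.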

Structured interactions for the agentic web. WebMCP introduces two new APIs that let browser agents act on a user’s behalf:

  • Declarative API: Handles standard actions defined directly in HTML forms.
  • Imperative API: Supports complex, dynamic interactions that require JavaScript execution.

These APIs act as a bridge, making your website agent-ready. They enable faster, more reliable agent workflows than raw DOM manipulation.

Use cases. Google shared use cases that show how an AI agent can handle complex tasks for your users with speed and confidence:

  • Travel: Users can get the exact flights they want. Agents can search, filter results, and complete bookings using structured data that delivers accurate results every time.
  • Customer support: Users can create detailed support tickets faster. Agents can automatically fill in the required technical details.
  • Ecommerce: Users can shop more efficiently. Agents can find products, configure options, and move through checkout with precision.

How to access the preview. You can apply for the WebMCP preview here.

Why we care. Agentic experiences are shaping the future of search—and possibly SEO. Dan Petrovic called it the biggest shift in technical SEO since structured data. Glenn Gabe called this a big deal. It’s worth exploring these new protocols now.


Google & Bing don’t recommend separate markdown pages for LLMs

6 February 2026 at 16:24

Representatives from both the Google Search and Bing Search teams recommend against creating separate markdown (.md) pages for LLMs. The practice serves one piece of content to the LLM and a different piece of content to your users, which may technically be considered a form of cloaking and a violation of Google’s policies.

The question. Lily Ray asked on Bluesky:

  • “Not sure if you can answer, but starting to hear a lot about creating separate markdown / JSON pages for LLMs and serving those URLs to bots.”

Google’s response. John Mueller from Google responded saying:

  • “I’m not aware of anything in that regard. In my POV, LLMs have trained on – read & parsed – normal web pages since the beginning, it seems a given that they have no problems dealing with HTML. Why would they want to see a page that no user sees? And, if they check for equivalence, why not use HTML?”

Recently, John Mueller also called the idea stupid, saying:

  • “Converting pages to markdown is such a stupid idea. Did you know LLMs can read images? WHY NOT TURN YOUR WHOLE SITE INTO AN IMAGE?”

That sarcasm is, of course, aimed at the idea of converting your whole site to markdown files, which is a bit extreme, to say the least.

I collected many of John Mueller’s comments on this topic over here.

Bing’s response. Fabrice Canel from Microsoft Bing responded saying:

  • “Lily: really want to double crawl load? We’ll crawl anyway to check similarity. Non-user versions (crawlable AJAX and like) are often neglected, broken. Humans eyes help fixing people and bot-viewed content. We like Schema in pages. AI makes us great at understanding web pages. Less is more in SEO !”

Why we care. Some of us like to look for shortcuts to perform well in search engines, and now in the new AI search engines and LLMs. Generally, shortcuts only work for a limited time, if they work at all. Plus, these shortcuts can have unexpected negative effects.

As Lily Ray wrote on LinkedIn:

  • “I’ve had concerns the entire time about managing duplicate content and serving different content to crawlers than to humans, which I understand might be useful for AI search but directly violates search engines’ longstanding policies about this (basically cloaking).”