Stop chasing Reddit and Wikipedia: What actually drives AI recommendations

We’ve all seen the charts going viral on LinkedIn. They’re everywhere at this point. Multiple industry studies, even this research from Semrush, confirm that Wikipedia and Reddit are the top-cited domains across major LLM platforms — and CMOs are running with this data.

Top-cited domains on LLMs, per Semrush, October 2025

The response is predictable: Just search for any bottom-of-funnel (BOFU) software query, and you’ll find Reddit threads in the top-ranking positions. This is exactly why the market is currently flooded with “Reddit SEO” agencies:

Reddit SEO agencies on Google

Just stop.

Taking this macro context — or a few isolated, high-ranking SERPs — and pivoting your entire GEO strategy toward Reddit or Wikipedia is a massive strategic error for the majority of B2B brands.

Why CMOs are misguided by the Reddit hype

The algorithmic tide is running toward massive community forums and open-source encyclopedias. That shift is real — but how it’s being interpreted isn’t.

The charts driving this executive FOMO are mathematically accurate, but they’re strategically misleading. Applying them as a universal GEO playbook ignores why that aggregate data exists and why certain pages rank for high-intent queries.

Reddit is the primary target because it’s perceived as easier to influence. While the industry respects Wikipedia’s ironclad editorial guardrails, Reddit is often viewed as an open loophole.

This is a classic case of marketing whiplash, where teams abandon foundational principles to chase the shiny new object.

To understand why Reddit and Wikipedia are high-effort, low-upside channels for the vast majority of brands, you have to look at the context executives ignore.

Macro studies analyze hundreds of thousands of randomized keywords

These studies add up citations across a randomized database that covers everything from pop culture to generalized consumer advice.

As Alex Birkett points out:

  • “Wikipedia, Reddit, and YouTube are heavily cited by LLMs because they are massive websites with a topical footprint that spans into a million different areas.” 

By default, they’ll always get the most aggregate LLM citations.

High-ranking Reddit threads on BOFU queries can’t be reproduced

When you see a Reddit thread driving CTR for a specific BOFU software query, it’s tempting to view it as an SEO loophole that can be easily reverse-engineered. This is incorrect.

In reality, this is a scenario where the “voice of the customer” largely dictates who gets recommended.

This isn’t an SEO hack or a growth trick. It’s the culmination of years of actual human peer reviews and real discussion on a topic that has reached a definitive consensus. Your marketing team can’t microwave this historical, multi-year, authentic brand sentiment.

Claiming you need a Reddit or Wikipedia strategy because they are the most-cited domains overall is like claiming spaghetti carbonara is the most-eaten dish in Italy. Yes, it’s ubiquitous and popular, but just because it’s everywhere doesn’t mean you should put it on the menu at a high-end steakhouse. 

Dig deeper: Rand Fishkin proved AI recommendations are inconsistent – here’s why and how to fix it

The illusion of ‘hacking’ Reddit and Wikipedia for AI visibility

Even if you ignore the macro context and decide to aggressively pursue a Reddit or Wikipedia SEO strategy, you’ll quickly realize how LLMs actually process data. 

Hacking them for AI citations is an illusion built on a fundamental misunderstanding of what LLMs are looking for. When you look at the mechanics of AI citations, two massive roadblocks emerge.

Historical consensus can’t be microwaved

Thirsty SEO agencies will frequently pitch Reddit marketing services, promising to generate hundreds of upvotes and comments to trigger LLM visibility. But the data shows LLMs don’t care about manufactured virality.

Up to 80% of Reddit threads cited by AI have fewer than 20 upvotes, according to Semrush. More importantly, the average age of a cited post is roughly 900 days. LLMs are surfacing historical, established consensus, not yesterday’s growth hack. 

Patterns of Reddit posts cited by AI tools - Semrush

Wikipedia editors will just delete you

The exact same brutal reality applies to Wikipedia. A Princeton University study analyzing AI-generated Wikipedia content revealed exactly what happens when marketers try to “hack” the encyclopedia with generative tools. 

Researchers found that when users relied on AI to create self-promotional pages for businesses, the articles were measurably lower in quality, lacking proper footnotes and internal links.

The result?

Human moderators quickly identified the low-effort content, deleted the pages for “unambiguous advertising,” and actively banned users.

Paraphrasing destroys narrative control

Even if you successfully infiltrate a subreddit or a Wikipedia page without getting banned, you lose control over your product positioning. Benji Hyam notes that Reddit mentions are typically too short and lack the depth necessary for an LLM to associate your product with a specific problem and solution.

The Semrush data also proves this: AI tools don’t quote Reddit word-for-word. They blend and paraphrase discussions (showing a semantic similarity score of just 0.53). 

How closely AI responses mirror Reddit posts

Your carefully crafted value proposition will be mashed up with random, anonymous user comments, or stripped down to dry, encyclopedic neutrality, diluting your brand narrative entirely.
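That 0.53 similarity score is worth making concrete. Semrush presumably measures this with embedding models, but even a simple bag-of-words cosine comparison (a rough sketch, with hypothetical product names and text) shows how little of your exact wording survives a paraphrase:

```python
from collections import Counter
import math

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between two texts using simple word-count vectors.
    (Illustrative only -- real tools use embedding models, not raw tokens.)"""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    norm = math.sqrt(sum(c * c for c in va.values())) * math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

# Hypothetical Reddit comment vs. the AI's blended restatement of it.
reddit_comment = "Honestly AcmeCRM is the only tool that handled our pipeline without constant babysitting"
ai_answer = "Users report that AcmeCRM manages sales pipelines reliably with minimal manual upkeep"

score = cosine_similarity(reddit_comment, ai_answer)
print(round(score, 2))  # low score: the meaning survives, the exact wording does not
```

The two sentences say roughly the same thing, yet share almost no surface vocabulary, which is exactly why a carefully worded value proposition dropped into a thread rarely comes out the other side intact.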

Posting on Reddit isn’t an SEO strategy — it’s shouting through a bus window, hoping to join the conversation. At best, it’s a short-term tactic. At worst, it actively damages your brand.

The hidden risks of astroturfing

The lack of ROI is only half the problem when it comes to building a Reddit or Wikipedia presence. The much larger issue is the active harm it can inflict on your brand’s image.

Brands that treat these platforms as loopholes for AI citations fundamentally misunderstand their architecture.

As Eli Schwartz points out, trying to replicate decades of genuine human conversation with templated brand messaging isn’t just ineffective — it’s a massive reputational hazard.

Reddit communities are aggressively moderated

Subreddits and wiki pages are policed by passionate human moderators and veteran Wikipedia editors. They’ve seen every variation of corporate infiltration. 

A new account dropping a link, manufacturing enthusiasm, or violating Wikipedia’s strict conflict of interest (COI) guidelines is flagged, reverted, and banned almost immediately. Sometimes, this is accompanied by a public callout (featured on subreddits like r/hailcorporate), causing more brand damage than the campaign was ever worth.

LLMs ingest deleted spam and banned accounts

This is the most critical and misunderstood risk. Reddit sells its data directly to companies like Google and OpenAI. Wikipedia’s entire edit history is completely open source.

LLMs aren’t just scraping the public-facing websites. They’re receiving the entire firehose of data (including deleted posts, reverted wiki edits, and banned accounts). When your agency’s fake comments or promotional product descriptions get removed by moderators, those AI models still see the manipulation.

Astroturfing creates a permanent negative trust signal

Because the AI models have full visibility into the moderation pipeline, links or mentions flagged as inauthentic carry negative weight. By attempting to game the system, you’re essentially training the AI to associate the brand with spam and coordinated manipulation.

Dig deeper: How to build an organic Reddit strategy that drives SEO impact

Where AI actually looks for ground truth

Once you accept that hacking Reddit or Wikipedia is both ineffective and dangerous, you have to look at where LLMs are actually pulling their answers from when a buyer is ready to make a purchase. When you filter for high-intent, BOFU prompts, the “Reddit/Wikipedia is everywhere” narrative falls apart.

Using AI visibility platforms like Scrunch AI exposes Reddit’s and Wikipedia’s true influence on specific target categories. For one B2B client, tracking 300+ custom prompts generated thousands of LLM responses, but just two specific Reddit threads were responsible for the vast majority of citations.

Reddit citations for a B2B client

The Wikipedia data was even more revealing.

For high-intent software queries, the encyclopedia barely registered. When AI tools cited Wikipedia, they were almost exclusively scraping broad, top-of-funnel category definitions, or pulling background facts from a specific company’s history page.

Wikipedia citation for a B2B client

Data from Grow and Convert shows the same thing. For trucking software queries, LLMs consistently cited domains like PCS Software and TruckingOffice.

Trucking software queries - AI citations

For project management queries, the AI cited specialized software review sites and niche blogs.

Project management queries - AI citations

This is a far cry from the overwhelming dominance promoted by SEO/GEO research studies.

If you’re chasing platforms simply because they cover massive topical geography, you’re making a painful error. You don’t need to be visible everywhere. You only need to be visible in the specific digital neighborhood that influences your flagship category.

Dig deeper: ‘Search everywhere’ doesn’t mean ‘be everywhere’

How to actually earn AI recommendations: Owned content and niche citations

Winning in AI search requires optimizing for targeted influence rather than aggregate metrics. The most effective GEO strategy abandons massive topical geography and focuses entirely on the pillars you can actually control.

Publish deep, human-written owned content

Your website remains your most powerful asset. To be recommended, you must provide the specific, granular depth the AI needs to understand your value. Your key product and solution pages need to explicitly cover:

  • Who the product is for.
  • How it’s used.
  • The specific pain points it solves.
  • Its core benefits. 

This depth is exactly what gives you a chance at showing up for the highly specific, long-tail queries a customer types into an AI when evaluating products.

Execute targeted citation outreach

Use AI visibility tools to identify the specific, niche domains that currently influence your flagship categories. Once you know which industry blogs, review sites, and peer publications the LLMs are actually citing for your BOFU queries, execute targeted outreach to earn your place on those exact lists.
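The analysis itself is simple once you have the citation data. As a minimal sketch, assuming your AI visibility tool can export each prompt's answer with its cited URLs (the field names and example domains here are hypothetical), you can rank the domains that actually influence your category:

```python
from collections import Counter
from urllib.parse import urlparse

def top_cited_domains(responses, n=3):
    """Rank the domains LLM answers cite most often across tracked prompts."""
    counts = Counter(
        urlparse(url).netloc.removeprefix("www.")  # normalize www. vs bare domain
        for resp in responses
        for url in resp["citations"]
    )
    return counts.most_common(n)

# Hypothetical export: one record per tracked BOFU prompt.
responses = [
    {"prompt": "best trucking software", "citations": [
        "https://www.example-tms.com/blog", "https://nichereviewsite.com/pricing"]},
    {"prompt": "tms for small fleets", "citations": [
        "https://nichereviewsite.com/features"]},
    {"prompt": "dispatch software reviews", "citations": [
        "https://www.example-tms.com/compare", "https://www.example-tms.com/tms"]},
]

print(top_cited_domains(responses))
# e.g. [('example-tms.com', 3), ('nichereviewsite.com', 2)]
```

The output is your outreach shortlist: the handful of niche review sites and industry blogs the LLMs already trust for your flagship category, rather than the aggregate giants from the macro studies.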

Dig deeper: How paid, earned, shared, and owned media shape generative search visibility

If you want a Reddit or Wikipedia strategy, respect their ecosystems

Reddit and Wikipedia carry real authority, and earning trust there is valuable independent of AI visibility. If you choose to invest in them, it must be a long-term play, not a marketing hack.

  • Engage authentically on Reddit: Answer questions, provide unique insights, and participate in discussions where your buyers actually hang out. Build street cred before recommending your own tools.
  • Build a branded subreddit for transparency: Create an official space for your team to share expertise, host AMAs, and answer product questions openly.
  • Monitor conversations for product insights: Use the platform to spot emerging pain points and shifts in sentiment before they hit traditional search engines.
  • Leave Wikipedia to the experts: If your brand genuinely deserves a Wikipedia page, it will be created by independent editors using reliable secondary sources. Don’t try to write your own product entry.

The path to AI visibility runs through your own domain and the highly specific digital neighborhoods your buyers trust. AI engines reflect the authority you already have. If you want the algorithm to recommend your brand, then you have to do the work to actually be recommendable.
