You typed your company name into ChatGPT. Or worse: you asked for a recommendation in your industry. And your site appeared nowhere. Your competitors did.

This is not a bug. It is not about ad spend either. If ChatGPT, Gemini, or Perplexity do not cite your site, it is because your content does not meet the criteria these engines use to select their sources.

The good news: it is fixable. And often faster than you think. Here are the 6 most common reasons — and how to solve them.

The test you should do right now

Before going further, run this 5-minute diagnostic. Open ChatGPT, Gemini, and Perplexity, then ask them these three questions:

  • The recommendation question: "What is the best [your service/product] in [your city/the US]?"
  • The expertise question: "How do I choose a [your service/product]?"
  • The direct question: "What does [your company name] do?"

If your site does not appear in any of the answers, you have an AI visibility problem. If your competitors appear and you do not, the problem is urgent.

Note the names that come up in the answers. These are your GEO competitors — and they are not necessarily the same as your SEO competitors.

Reason #1: your content is not extractable

This is the most common cause. AI engines look for answers that are ready to cite: a clear sentence, a sharp definition, a self-contained paragraph they can embed directly in their response.

If your site starts with "Welcome to [company], leaders in innovation since 2005...", the AI has nothing to extract. It moves on to the next source.

The typical problem:

  • Homepages that are 100% promotional with no informational content
  • Long introductions before getting to the point
  • Vague text with no definitions or concrete information
  • Content focused on "we" instead of answering customer questions

How to fix it:

Take your 5 most important pages. For each one, make sure the first 100 words contain a direct answer to the question your visitor is asking.

Example for an accounting firm:

❌ "Our firm has been passionately serving clients for over 20 years with all their financial needs."

✅ "A CPA in New York helps small and mid-size businesses with accounting, tax filings, and financial planning. Here's how to choose the right one."

The second version is the one AI will cite.
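If you want to automate this check across several pages, here is a minimal sketch using only Python's standard library: it collects the first 100 visible words of a page's HTML, so you can see exactly what an AI crawler extracts first. The sample page below is hypothetical.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.words = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.words.extend(data.split())

# Hypothetical page source for illustration.
PAGE = (
    "<html><body><h1>CPA in New York</h1>"
    "<p>A CPA in New York helps small businesses with accounting and tax filings.</p>"
    "</body></html>"
)

extractor = TextExtractor()
extractor.feed(PAGE)
first_100 = " ".join(extractor.words[:100])
print(first_100)
```

Read the output out loud: if those 100 words do not answer a real customer question, neither will the AI's citation.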

Reason #2: you are blocking AI bots without knowing it

This is the simplest problem to diagnose — and the most absurd when you discover it. Many sites block AI crawlers in their robots.txt file without even realizing it.

Check now: go to your-website.com/robots.txt and look for these lines:

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

If these lines are present, you are explicitly forbidding ChatGPT, Claude, and Perplexity from accessing your content. They simply cannot read you.

How to fix it:

Remove these restrictions from your robots.txt. If you use WordPress, check that your SEO plugin (Yoast, Rank Math) has not added these blocks automatically.
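To verify the fix without waiting for a crawl, you can test your robots.txt rules locally with Python's standard-library `urllib.robotparser`. The robots.txt content below is a hypothetical example that blocks only GPTBot:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks GPTBot but allows everyone else.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot"]

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for bot in AI_BOTS:
    allowed = parser.can_fetch(bot, "https://your-website.com/")
    print(f"{bot}: {'allowed' if allowed else 'BLOCKED'}")
```

With these rules, GPTBot shows as blocked while ClaudeBot and PerplexityBot fall through to the permissive `*` rule. Paste your own robots.txt into `ROBOTS_TXT` to check your real configuration.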

Go further by creating an llms.txt file at the root of your site. This file, specifically designed for AI crawlers, tells them which pages are most important and how your site is structured.
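As a rough sketch, an llms.txt file follows a simple Markdown convention: a title, a one-line summary, then sections of annotated links to your key pages. The company name and URLs below are placeholders:

```markdown
# Acme Accounting

> CPA firm serving small and mid-size businesses in New York:
> accounting, tax filings, and financial planning.

## Services
- [Tax filing](https://your-website.com/services/tax-filing): Federal and state returns for SMBs

## Guides
- [How to choose a CPA](https://your-website.com/guides/choose-a-cpa): Criteria, pricing, red flags
```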

Full technical guide in llms.txt, robots.txt, and AI crawlability.

Reason #3: no structured data

Structured data (Schema.org in JSON-LD) is the language that engines understand without ambiguity. Without it, the AI must guess what your page contains: are you a company or a blog? An article or a product page? A restaurant or a software tool?

AI engines prefer sites that make their job easier. A site with properly implemented structured data has a significant advantage over one without markup.

Check now: go to Google's Rich Results Test and enter your URL. If the result shows "No rich results detected," you have no structured data.

How to fix it:

Implement these three schemas as a priority:

  • Organization on your homepage — states who you are, your logo, contact info, social profiles.
  • FAQPage on pages that contain Q&A — this is the most natural format for AI engines.
  • Article on your editorial content — specifies the author, date, and topic.

The JSON-LD format is added to your page's source code without modifying the visible content.
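For illustration, here is what an Organization schema might look like as a JSON-LD block in your page's `<head>`. The company name, URLs, and logo path are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Acme Accounting",
  "url": "https://your-website.com",
  "logo": "https://your-website.com/logo.png",
  "sameAs": [
    "https://www.linkedin.com/company/acme-accounting"
  ]
}
</script>
```

Once added, re-run the Rich Results Test to confirm the markup is detected without errors.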

Step-by-step implementation in Schema.org and AI: the practical guide to being understood by LLMs.

Reason #4: no verifiable evidence

AI engines are designed to avoid propagating unverified information. If your site makes claims without proving them, it will be judged less reliable than a competitor that sources its assertions.

The typical problem:

  • "We have hundreds of satisfied clients" — how many exactly?
  • "Our solution improves productivity" — by how much, measured how?
  • "Experts recommend our approach" — which experts, in what publication?

How to fix it:

Review your key pages and replace every vague claim with concrete evidence:

  • Precise numbers: "312 companies served since 2019" rather than "hundreds of clients"
  • Named sources: "according to a 2025 McKinsey study" rather than "according to experts"
  • Measurable results: "+34% organic traffic in 4 months for [client]" rather than "significant results"
  • Dates: "updated March 2026" rather than "regularly updated"

Every verifiable data point is a trust signal for AI engines.

Reason #5: your site does not exist outside of itself

AI engines cross-reference sources. When they evaluate a site's credibility, they look for external confirmation: is your brand mentioned elsewhere on the web? Do other sites cite you as a reference?

If your company only exists on its own website — no press mentions, no forum profiles, no citations in third-party articles — the AI has no way to confirm your expertise.

How to fix it:

Build your external presence methodically:

  • Reddit — it is the #1 platform cited by LLMs in 2026. Identify 5-10 relevant subreddits for your industry. Answer questions with expertise and authenticity, without direct promotion. A single useful, well-sourced comment can be picked up by an AI.
  • Guest posts and industry press — pitch expert articles to trade publications in your field. Every article published on a third-party site that mentions your brand strengthens your authority in the eyes of AI.
  • Professional directories — register on recognized directories in your industry. These structured mentions are easily identifiable by AI crawlers.
  • Google Business Profile — even if you do not have a physical storefront, a verified profile with reviews strengthens your online credibility.

The goal is not to be everywhere, but to have credible, consistent mentions on sources that AI engines recognize.

Reason #6: your content is outdated

AI engines favor recent content. If your last blog post is from 2023, if your pages mention "2024 trends," and if no update date is visible, the AI will consider your information unreliable.

This is an especially common problem for businesses that invested in content a few years ago and never updated it.

How to fix it:

  • Identify your high-potential pages — which pages drive traffic or cover topics critical to your business?
  • Update them — refresh the numbers, dates, and examples. Remove outdated references.
  • Display the update date — add a visible "Last updated: [date]" on each refreshed page.
  • Set up a schedule — update pillar pages quarterly, important articles monthly.

A 2023 article updated in March 2026 with fresh data will be treated as recent content by AI engines.
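For illustration, pairing the visible date with `datePublished` and `dateModified` in your Article schema makes the freshness signal machine-readable as well. The headline and dates below are placeholders:

```html
<p>Last updated: March 12, 2026</p>
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to choose a CPA in New York",
  "datePublished": "2023-05-02",
  "dateModified": "2026-03-12"
}
</script>
```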

The difference between a real audit and asking ChatGPT directly

You might think: "Why not just ask ChatGPT if my site is well optimized?"

The answer is simple: ChatGPT does not see your site the way a technical audit does. When you ask it the question, it gives you a generic opinion based on what it knows (or thinks it knows) about your URL. It does not scan the actual DOM, does not check your robots.txt, does not count your Schema.org tags, and does not measure the extractability of your content.

An automated GEO audit does the opposite: it analyzes your page's actual source code, objectively measures each citation-worthiness criterion, and gives you a quantified score with recommendations prioritized by impact.

It is the difference between asking a friend "does my resume look good?" and having it reviewed by a recruiter with a scoring rubric.

7-day action plan

If you want results fast, here is a day-by-day plan:

  1. Day 1 — Diagnostic. Run the 3-question test (recommendation, expertise, direct name) on ChatGPT, Gemini, and Perplexity. Run a GEO audit on Detekia. Record your baseline score.
  2. Day 2 — Crawlability. Check your robots.txt. Remove AI bot blocks. Create your llms.txt file.
  3. Day 3 — Extractability. Rewrite the first 100 words of your 5 most important pages. Add a direct answer at the top of each page.
  4. Day 4 — Structured data. Implement Organization on the homepage, FAQPage on your FAQ page, Article on your editorial content.
  5. Day 5 — Evidence. Review your key pages. Replace every vague claim with a number, a source, or a concrete example.
  6. Day 6 — Freshness. Update your 3 most important pieces of content. Add visible update dates.
  7. Day 7 — External presence. Create a Reddit account and answer 3 relevant questions in your industry. Verify your Google Business Profile.

After 7 days, run the audit again. You should see a measurable improvement in your score. The effects on AI citations will manifest in the following weeks.

Frequently asked questions

Does paying for Google Ads help get cited by AI?

No. AI citations are independent of advertising. AI engines select their sources based on content quality criteria, not ad budget. This is actually an opportunity: a small site with excellent content can be cited as much as a large corporation.

My site ranks #1 on Google but is absent from ChatGPT. Why?

Ranking well in SEO is a necessary foundation, but not sufficient. AI engines add extra criteria: content extractability, structured data, AI bot crawlability, neutral tone. Your site can be optimized for Google without being optimized for generative engines.

How long before AI cites me?

Technical fixes (robots.txt, Schema.org) take effect in 2 to 4 weeks. Content improvements (extractability, evidence) are reflected in 4 to 8 weeks. Building external authority (mentions, Reddit) takes 2 to 3 months. The sooner you start, the sooner you will be visible.

Does this work for all industries?

Yes. Whether you are a consultant, e-commerce business, SaaS, contractor, or professional services firm, the citation-worthiness criteria are the same. What changes is the competitiveness: in an industry where few competitors are GEO-optimized, the gains are fast and significant.

Start by measuring

You cannot fix what you do not measure. The first step is knowing where you stand: what your GEO Score is, which criteria are failing, and where to begin.

Analyze your site for free on Detekia — score out of 100, 8 criteria, recommendations prioritized by impact, in under 60 seconds. No signup required.