From db8e07eb52addbe498ca51f78d7e85edced614cb Mon Sep 17 00:00:00 2001
From: "mintlify[bot]" <109931778+mintlify[bot]@users.noreply.github.com>
Date: Mon, 13 Apr 2026 21:54:35 +0000
Subject: [PATCH] Document robots.txt AI agent access for GEO and SEO pages

Generated-By: mintlify-agent
---
 guides/geo.mdx   | 30 ++++++++++++++++++++++++++++++
 optimize/seo.mdx |  4 ++++
 2 files changed, 34 insertions(+)

diff --git a/guides/geo.mdx b/guides/geo.mdx
index ad4faf682..4702570ae 100644
--- a/guides/geo.mdx
+++ b/guides/geo.mdx
@@ -139,6 +139,36 @@ Mintlify automatically generates an `llms.txt` file for your documentation. LLMs
 
 You can view your LLMs.txt by appending `/llms.txt` to your documentation URL.
 
+### Allow AI agents in robots.txt
+
+Your `robots.txt` file controls which bots can crawl your site. If it blocks AI user agents, tools like ChatGPT, Claude, and Perplexity cannot read your documentation and will not cite it in answers.
+
+Mintlify's auto-generated `robots.txt` allows all crawlers by default. If you use a [custom robots.txt file](/optimize/seo#custom-sitemaps-and-robotstxt-files), make sure it does not block AI agents. The most common AI user agents include:
+
+- `GPTBot`, `OAI-SearchBot`, `ChatGPT-User` (OpenAI)
+- `ClaudeBot`, `Claude-User` (Anthropic)
+- `PerplexityBot` (Perplexity)
+- `Google-Extended` (Gemini)
+
+A robots.txt that blocks all crawlers also blocks AI agents:
+
+```txt Bad — blocks all AI agents
+User-agent: *
+Disallow: /
+```
+
+To block specific scrapers while allowing AI agents, target only the bots you want to restrict:
+
+```txt Good — allows AI agents
+User-agent: BadBot
+Disallow: /
+
+User-agent: *
+Disallow: /private/
+```
+
+Run [`mint score`](/cli/commands#mint-score) to check whether your site's `robots.txt` allows AI agents. The `robotsTxtAllowsAI` check passes when no AI user agents are blocked.
+
 ## Test how AI tools represent your docs
 
 Regularly test whether AI tools are citing your documentation accurately.
diff --git a/optimize/seo.mdx b/optimize/seo.mdx
index 799ed8f5d..8d1ddddb5 100644
--- a/optimize/seo.mdx
+++ b/optimize/seo.mdx
@@ -316,6 +316,10 @@ To include hidden pages in search indexing, add `seo.indexing` to your `docs.jso
 
 To add a custom `sitemap.xml` or `robots.txt` file, create a `sitemap.xml` or `robots.txt` file at the root of your project. Adding a custom file overrides the automatically generated file of the same name. If you delete a custom file, the default file automatically applies again.
 
+<Note>
+  If your custom `robots.txt` blocks AI user agents like `GPTBot`, `ClaudeBot`, or `PerplexityBot`, AI tools cannot crawl your documentation and will not cite it in answers. See [Allow AI agents in robots.txt](/guides/geo#allow-ai-agents-in-robotstxt) for details.
+</Note>
+
 ## Disable indexing
 
 To prevent search engines from indexing a page, add `noindex: true` to the [frontmatter](/organize/pages) of the page.
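
As a reviewer's note on the patch above: the documented robots.txt examples can be sanity-checked with Python's standard-library `urllib.robotparser`. This is a minimal sketch, not part of Mintlify's tooling; the agent list mirrors the one documented in the patch, and the page URL is a placeholder.

```python
from urllib.robotparser import RobotFileParser

# AI user agents named in the documentation above.
AI_AGENTS = [
    "GPTBot", "OAI-SearchBot", "ChatGPT-User",   # OpenAI
    "ClaudeBot", "Claude-User",                  # Anthropic
    "PerplexityBot",                             # Perplexity
    "Google-Extended",                           # Gemini
]

def blocked_agents(robots_txt: str, page: str = "https://example.com/docs") -> list[str]:
    """Return the AI user agents that robots_txt disallows for `page`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [agent for agent in AI_AGENTS if not parser.can_fetch(agent, page)]

# The "Bad" example blocks every AI agent:
print(blocked_agents("User-agent: *\nDisallow: /"))

# The "Good" example blocks only BadBot, so no AI agents are affected:
print(blocked_agents("User-agent: BadBot\nDisallow: /\n\nUser-agent: *\nDisallow: /private/"))
```

The first call returns the full `AI_AGENTS` list; the second returns an empty list, matching the patch's claim that targeting only specific scrapers leaves AI agents unblocked.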