Robots.txt
Learn how robots.txt controls which AI crawlers can access your website. Configure permissions for GPTBot, Google-Extended, and other AI agents.
What Is Robots.txt?
Robots.txt is a text file placed at the root of your website that tells web crawlers — including AI bots — which pages they can and cannot access. It's the first file AI crawlers check before indexing your content.
For AI visibility, robots.txt has become critical because AI-specific crawlers (GPTBot for ChatGPT, Google-Extended for Gemini, ClaudeBot for Claude) each look for directives addressed to their own user-agent name. If you aren't managing these explicitly, you may be blocking AI from discovering your business without realizing it.
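As a minimal sketch of how per-crawler directives work (the /checkout/ path is a placeholder, not taken from any real site), a generic rule can keep all crawlers out of one area while GPTBot follows only the group addressed to it:

# Placeholder path used for illustration only
User-agent: *
Disallow: /checkout/

User-agent: GPTBot
Allow: /

A crawler that finds a group matching its own name, like GPTBot here, obeys only that group; a crawler with no named group falls back to the rules under User-agent: *.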
A properly configured robots.txt is the difference between being visible to AI assistants and being completely invisible.
Real Example: AI-Optimized Robots.txt
User-agent: GPTBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: ClaudeBot
Allow: /

This configuration explicitly allows major AI crawlers to access all pages. Many websites unknowingly block these bots with blanket 'Disallow' rules, making themselves invisible to AI platforms.
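The opposite, and the most common way sites end up invisible, is a blanket block like the sketch below: because no AI crawler has its own group here, every bot falls back to the wildcard rule and is shut out.

# Blanket block: applies to any crawler without its own group
User-agent: *
Disallow: /

If a rule like this exists for good reasons, adding explicit groups for GPTBot, Google-Extended, and ClaudeBot, as in the example above, restores AI access without changing what other crawlers may fetch.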
Why It Matters for Your Business
Your robots.txt is the gatekeeper for AI access. A misconfigured file can block your business from appearing in ChatGPT, Google AI Overviews, and other AI platforms — costing you visibility to millions of potential customers.
See How Robots.txt Affects Your AI Visibility Score
Run a free CitationIQ™ scan on your business — we check robots.txt and 20+ other AI visibility factors.