Robots.txt Tester & Validator
Paste your robots.txt content, pick a user-agent, and test whether a URL path is allowed or blocked. Includes AI crawler support.
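The same allow/block check can be sketched with Python's standard-library robotparser; the robots.txt content and paths below are illustrative examples, not output from this tool:

```python
# Minimal sketch of a robots.txt allow/block check using the
# Python stdlib (the rules and paths here are made-up examples).
from urllib import robotparser

robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# GPTBot is blocked everywhere; other bots only from /private/.
print(rp.can_fetch("GPTBot", "/blog/post"))     # False
print(rp.can_fetch("Googlebot", "/private/x"))  # False
print(rp.can_fetch("Googlebot", "/blog/post"))  # True
```

Note that a user-agent matches its own `User-agent` group when one exists; the `*` group applies only to bots without a specific group.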
robots.txt must be at the root domain: example.com/robots.txt. Subdirectory files are ignored by crawlers.
Use specific User-agent directives for GPTBot, ClaudeBot, and others to prevent AI training on your content.
Always add a Sitemap directive pointing to your XML sitemap. This helps search engines discover your pages.
Use Crawl-delay to slow aggressive bots. Note: Googlebot ignores Crawl-delay; use Search Console instead.
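The tips above combined into one example robots.txt (the domain, sitemap URL, and blocked paths are placeholders, not recommendations for your site):

```
# Block common AI training crawlers entirely
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

# Default rules for all other bots
User-agent: *
Disallow: /private/
Crawl-delay: 10

# Help crawlers discover your pages
Sitemap: https://example.com/sitemap.xml
```

Remember this file only works at the root, e.g. https://example.com/robots.txt.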
LogBeast identifies GPTBot, ClaudeBot, and 100+ bot signatures in your server logs.
Explore LogBeast →