Robots.txt Tester
Test any URL against any crawler.
Paste your robots.txt and a list of URLs. Pick a user-agent — Googlebot, GPTBot, ClaudeBot, PerplexityBot, or any custom bot. See exactly which rule decided each URL.
Results
- https://example.com/admin/dashboard: Blocked by rule "Disallow: /admin/" in group "*".
- https://example.com/admin/public/login: Allowed by rule "Allow: /admin/public/" in group "*".
- https://example.com/blog/post: Allowed; no rule in group "*" matched, so the default is allow.
How to use
How to use the Robots.txt Tester.
- Step 1: Paste your robots.txt
Either paste it manually, or click 'Fetch live robots.txt' to pull it from the first URL's origin.
- Step 2: Add URLs to test
One URL per line. The tool tests all of them in bulk against the same user-agent.
- Step 3: Pick a user-agent
Choose Googlebot, GPTBot, PerplexityBot, ClaudeBot, or any of 17+ presets, or paste a custom UA string.
- Step 4: Read the verdict
Each URL shows ALLOWED or BLOCKED, plus the exact rule line and user-agent group that decided it.
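If you'd rather script the same checks, Python's standard library ships a robots.txt parser. A minimal sketch of the paste-then-test workflow (note that `urllib.robotparser` follows the original first-match specification rather than Google's longest-match tie-breaking, so its verdicts can differ from this tool's on overlapping Allow/Disallow rules):

```python
from urllib.robotparser import RobotFileParser

# Step 1: the robots.txt content, pasted inline here instead of fetched.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /admin/public/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Steps 2-4: bulk-check URLs against one user-agent and print verdicts.
for url in ("https://example.com/admin/dashboard",
            "https://example.com/blog/post"):
    verdict = "ALLOWED" if parser.can_fetch("GPTBot", url) else "BLOCKED"
    print(verdict, url)
# prints:
# BLOCKED https://example.com/admin/dashboard
# ALLOWED https://example.com/blog/post
```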
FAQ
Questions about the Robots.txt Tester.
Google evaluates all matching rules in the most-specific user-agent group, then picks the one with the longest pattern. On ties, Allow beats Disallow. This tool implements the same algorithm.
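The longest-match rule above can be sketched in a few lines of Python. This is a toy illustration, not the tool's actual parser: it treats patterns as plain prefixes and ignores the `*` and `$` wildcards a real robots.txt matcher must support.

```python
def decide(rules, path):
    """Return (verdict, deciding_pattern) for `path`.

    `rules` is a list of (directive, pattern) pairs from the chosen
    user-agent group. Toy sketch: patterns are treated as plain
    prefixes, ignoring '*' and '$' wildcards.
    """
    # Key each matching rule by (pattern length, is_allow): longer
    # patterns win, and on equal length True > False makes Allow
    # beat Disallow, exactly the tie-break described above.
    best = max(
        ((len(p), d == "allow", d, p)
         for d, p in rules if path.startswith(p)),
        default=None,
    )
    if best is None:
        return "allowed", None  # nothing matched: default allow
    _, _, directive, pattern = best
    return ("allowed" if directive == "allow" else "blocked"), pattern


group = [("disallow", "/admin/"), ("allow", "/admin/public/")]
print(decide(group, "/admin/dashboard"))     # ('blocked', '/admin/')
print(decide(group, "/admin/public/login"))  # ('allowed', '/admin/public/')
print(decide(group, "/blog/post"))           # ('allowed', None)
```

These three calls reproduce the three verdicts shown in the Results section above.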
