Freelancer Tamal
Robots.txt Tester

Test any URL against any crawler.

Paste your robots.txt and a list of URLs. Pick a user-agent — Googlebot, GPTBot, ClaudeBot, PerplexityBot, or any custom bot. See exactly which rule decided each URL.

Results
  • Blocked
    https://example.com/admin/dashboard
    Blocked by rule "Disallow: /admin/" in group "*".
  • Allowed
    https://example.com/admin/public/login
    Allowed by rule "Allow: /admin/public/" in group "*".
  • Allowed
    https://example.com/blog/post
    No rule in group "*" matched — default allow.
How to use

How to use the Robots.txt Tester.

  1. Step 1
    Paste your robots.txt

Paste it manually, or click 'Fetch live robots.txt' to pull the file from the first URL's origin.

  2. Step 2
    Add URLs to test

    One URL per line. The tool tests all of them in bulk against the same user-agent.

  3. Step 3
    Pick a user-agent

    Choose Googlebot, GPTBot, PerplexityBot, ClaudeBot, or any of 17+ presets — or paste a custom UA string.

  4. Step 4
    Read the verdict

    Each URL shows ALLOWED or BLOCKED, plus the exact rule line and user-agent group that decided it.

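The user-agent selection in step 3 follows the most-specific-match convention: the longest group token that matches the crawler name wins, with "*" as the fallback. A minimal sketch in Python (the function and data shapes are illustrative, not the tool's actual code):

```python
def pick_group(groups, crawler):
    """Pick the robots.txt group whose user-agent token best matches
    `crawler` (e.g. "Googlebot-Image"). The longest matching token wins;
    the wildcard "*" group is the fallback.

    groups: dict mapping a lowercase user-agent token -> list of rules.
    """
    name = crawler.lower()
    # Tokens that are a prefix of the crawler name, e.g. "googlebot"
    # matches "googlebot-image".
    matches = [tok for tok in groups if tok != "*" and name.startswith(tok)]
    if matches:
        return max(matches, key=len)  # most specific group
    return "*" if "*" in groups else None
```

For example, with groups for "googlebot", "googlebot-image", and "*", the crawler "Googlebot-Image" lands in "googlebot-image", "Googlebot-News" falls back to "googlebot", and "GPTBot" falls back to "*".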
FAQ

Questions about the Robots.txt Tester.

How does the tester decide whether a URL is allowed or blocked?

Google evaluates all matching rules in the most-specific user-agent group, then picks the one with the longest pattern. On ties, Allow beats Disallow. This tool implements the same algorithm.
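That longest-match rule can be sketched in a few lines of Python. This is a simplified model: patterns are treated as literal path prefixes, ignoring the `*` and `$` wildcards and percent-encoding that a full RFC 9309 implementation handles (names here are illustrative):

```python
def decide(rules, path):
    """Apply the longest-match rule within one user-agent group.

    rules: list of (kind, pattern) tuples, e.g. ("disallow", "/admin/").
    Returns (verdict, winning_pattern); ("allowed", None) when nothing
    matches, i.e. the default-allow case.
    """
    best_key, best = None, None
    for kind, pattern in rules:
        if pattern and path.startswith(pattern):
            # Longer pattern wins; on equal length, Allow beats Disallow.
            key = (len(pattern), kind == "allow")
            if best_key is None or key > best_key:
                best_key, best = key, (kind, pattern)
    if best is None:
        return "allowed", None  # no rule matched: default allow
    kind, pattern = best
    return ("allowed" if kind == "allow" else "blocked"), pattern
```

With the rules from the Results example above, `/admin/dashboard` is blocked by `/admin/`, while `/admin/public/login` matches both rules and is allowed because `/admin/public/` is the longer pattern.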

Need a full technical SEO audit?

See services
Free audit
Book a call