Robots.txt Unreachable in GSC: What It Means
If URL Inspection says robots.txt is unreachable, don't panic. Check the live robots.txt file, DNS, CDN, server errors, and the Search Console robots.txt cache before changing anything else.
Robots.txt unreachable is alarming because it can make the live URL test fail on every page. It does not mean every page is broken, though: it means Google could not fetch the one file it checks before deciding whether crawling is allowed at all.
Short answer
Robots.txt unreachable means Google could not access /robots.txt during the test or crawl. Because Google cannot confirm crawl rules, it may pause crawling. Check the robots file, HTTP status, DNS, CDN, firewall, SSL, and Search Console robots cache before editing pages.
Real questions this answers
- Why does URL Inspection say robots.txt unreachable?
- Why does live test fail on every page?
- Should I change my pages when robots.txt is unreachable?
- How do I know if this is a real robots problem or a temporary GSC issue?
Most likely causes
- The robots.txt URL does not return 200 OK consistently.
- DNS or SSL problems affect Googlebot or the Inspection Tool from some locations.
- A CDN, firewall, bot rule, middleware, or hosting edge blocks Googlebot.
- The file was temporarily unavailable and Search Console has not refreshed the robots status yet.
- A redirect chain that is too long, or an invalid response, prevents Google from retrieving the file.
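The causes above can be sketched as a small triage helper. This is a hypothetical function, not anything Google provides; the name, parameters, and the five-redirect threshold are illustrative assumptions:

```python
# Hypothetical triage helper: maps a robots.txt fetch outcome to the
# most likely cause from the list above. Thresholds are illustrative.
def diagnose_robots_fetch(status=None, redirects=0, dns_ok=True, tls_ok=True):
    if not dns_ok:
        return "DNS problem: Googlebot may not resolve your host"
    if not tls_ok:
        return "SSL/TLS problem: certificate or handshake failure"
    if status is None:
        return "No response: a CDN, firewall, or bot rule may block Googlebot"
    if redirects > 5:
        return "Redirect chain too long: serve robots.txt directly with 200"
    if status == 200:
        return "Fetch OK: likely a stale Search Console robots status"
    if 500 <= status < 600:
        return "Server error: Google pauses crawling on 5xx robots.txt"
    return f"Unexpected status {status}: robots.txt should return 200 OK"

print(diagnose_robots_fetch(status=503))
# Server error: Google pauses crawling on 5xx robots.txt
```

Feeding in what you observe from a manual fetch (status code, redirect count, whether DNS and TLS succeeded) points you at the first thing to investigate.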
What to fix first
- Open https://yourdomain.com/robots.txt and confirm it returns 200 OK with a plain-text body (Content-Type: text/plain).
- Test with Googlebot-like user agents and from more than one network if possible.
- Keep robots.txt simple: one User-Agent: * group is often enough for public sites.
- Exclude robots.txt, sitemap.xml, and llms.txt from auth middleware, redirects, and private route logic.
- Use Search Console robots.txt report to request a recrawl, then retry Live Test later.
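Before deploying a simplified robots.txt, you can confirm a single User-Agent: * group behaves the way you expect using Python's standard urllib.robotparser, entirely offline. The domain and rules below are placeholders:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt with one User-Agent: * group, as suggested above.
# The Disallow rule and domain are illustrative placeholders.
robots_txt = """\
User-Agent: *
Disallow: /private/
Sitemap: https://yourdomain.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Any crawler, including Googlebot, should be allowed on public pages
# and blocked only under /private/.
print(parser.can_fetch("Googlebot", "https://yourdomain.com/"))           # True
print(parser.can_fetch("Googlebot", "https://yourdomain.com/private/x"))  # False
```

This only validates the rules themselves; it does not test whether Google can reach the file, which is what the fetch checks above are for.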
Tools to use
- curl -I https://yourdomain.com/robots.txt for status and headers.
- Search Console robots.txt report for Google cached fetch status.
- VisRank scan to confirm crawlability, robots, sitemap, and noindex signals.
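As a quick sanity check on curl -I output, a few lines of Python can flag the two things that matter most: a 200 status and a text/plain content type. The helper name and the sample headers are illustrative, not from any tool:

```python
# Hypothetical checker for `curl -I` style response headers.
def check_robots_headers(raw_headers):
    lines = raw_headers.strip().splitlines()
    status_ok = " 200" in lines[0]  # e.g. "HTTP/2 200" or "HTTP/1.1 200 OK"
    content_type = ""
    for line in lines[1:]:
        key, _, value = line.partition(":")
        if key.strip().lower() == "content-type":
            content_type = value.strip().lower()
    return status_ok and content_type.startswith("text/plain")

sample = """HTTP/2 200
content-type: text/plain; charset=utf-8
cache-control: max-age=3600
"""
print(check_robots_headers(sample))  # True
```

A redirect (301/302) or an HTML error page in place of the robots file both fail this check, which matches the symptoms Search Console reports as unreachable.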