If you’re seeing the error message “Blocked by robots.txt” in Google Search Console, you’re not alone — and you’re not the only one confused by it. This common yet critical technical SEO issue can prevent search engines like Google from crawling and indexing your website pages properly. And when your pages aren’t indexed, they won’t show up in search results, which means lost traffic, lower rankings, and fewer conversions.
But what exactly does this error mean? Why would Google be blocked from crawling your content? And how can you fix it, especially if you’re running your website on WordPress, Shopify, Blogger, or another platform?

In this detailed guide, we’ll break down:
What “Blocked by robots.txt” means
“Indexed, though blocked by robots.txt,” explained
Fixing robots.txt errors in WordPress, Shopify, Blogger, etc.
Troubleshooting internal or sitemap robots.txt blocks
Technical SEO checklist for crawlability issues
Editing robots.txt safely without hiding important pages
Whether you’re a site owner, an SEO expert, or someone using tools like Ahrefs, Search Console, or RankMath, this guide will help you understand and fix every angle of this error, including the frustrating “crawl allowed error, robots.txt blocked” conflict.
Let’s get started with understanding what robots.txt does and how it can impact your SEO if not configured correctly.
🚫 What Does “Blocked by Robots.txt” Mean?
The robots.txt file tells search engine crawlers which pages or sections of your site should not be crawled. If a page is blocked by robots.txt, search engines cannot read its content, and in most cases it will not be indexed, even if it’s included in your sitemap. (Google can still index a blocked URL it discovers through links, which is exactly what the “indexed, though blocked” warning describes.)
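For context, here’s what a typical, healthy robots.txt looks like. The domain and paths below are placeholders, not recommendations for your site:

```
# Applies to all crawlers
User-agent: *
# Keep low-value sections out of the crawl (hypothetical paths)
Disallow: /cart/
Disallow: /search/

# Point crawlers at the sitemap
Sitemap: https://yoursite.com/sitemap.xml
```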
Common keywords:
- What is blocked in robots.txt
- What does blocked by robots.txt mean
- Page cannot be indexed: blocked by robots.txt
- indexed, though blocked by robots.txt
🔍 Why Is My Website Blocked by Robots.txt?
Here are the most common reasons:
- Your robots.txt file accidentally disallows important URLs
- A sitemap is listed, but the URLs in it are disallowed (the “sitemap blocked by robots.txt” case)
- You’re using plugins that restrict crawling (common on WordPress)
- Your Shopify theme has default blocks in its robots.txt file
- You’re testing your site and forgot to allow crawling
Also seen as:
- Website blocked by robots.txt
- Internal blocked robots.txt
- blocked robots.txt file

🛠️ How to Fix “Blocked by Robots.txt” – Step-by-Step
Step 1: Check Your robots.txt File
Visit yoursite.com/robots.txt, or check the robots.txt report in Google Search Console (under Settings), which replaced the old robots.txt Tester.
Look for lines like:
User-agent: *
Disallow: /
This blocks your entire site. Change it to:
User-agent: *
Disallow:
Keywords used: block all robots, robots.txt file
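You can also pull the file from a terminal to see exactly what crawlers receive (replace the domain with your own):

```
# Print the live robots.txt as crawlers see it
curl -s https://yoursite.com/robots.txt

# Show only the blocking rules
curl -s https://yoursite.com/robots.txt | grep -i "disallow"
```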
Step 2: Review Sitemap Access
Ensure your sitemap itself isn’t disallowed and that it’s declared in robots.txt. For example:
Sitemap: https://yoursite.com/sitemap.xml
Make sure the sitemap file and the URLs it lists are all crawlable.
Keywords used: sitemap blocked by robots.txt
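To audit this at scale, a short script can test every sitemap URL against your live robots.txt. Here’s a minimal sketch using only Python’s standard library; it assumes a single flat sitemap.xml (not a sitemap index), and the domain is a placeholder:

```python
import xml.etree.ElementTree as ET
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser

SITE = "https://yoursite.com"  # placeholder: use your own domain

# Fetch and parse the live robots.txt
robots = RobotFileParser(f"{SITE}/robots.txt")
robots.read()

# Collect every <loc> entry from the sitemap
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
with urlopen(f"{SITE}/sitemap.xml") as resp:
    tree = ET.parse(resp)
urls = [loc.text for loc in tree.findall(".//sm:loc", ns)]

# Any hit here is the "sitemap blocked by robots.txt" conflict
for url in urls:
    if not robots.can_fetch("Googlebot", url):
        print("In sitemap but blocked:", url)
```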
Step 3: Test in Google Search Console
Use the URL Inspection Tool to check if a page is still being blocked.
Fixable errors include:
- Google blocked by robots.txt
- Google Search Console blocked by robots.txt
Step 4: Review Themes and Plugins
WordPress
Go to Settings > Reading, and ensure the “Discourage search engines from indexing this site” box is unchecked.
Also, check SEO plugins (like Yoast) that may block categories, tags, or other post types.
Keywords used:
- WordPress is blocked by robots.txt
- indexed, though blocked by robots.txt WordPress
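For reference, a stock WordPress install serves a virtual robots.txt along these lines (exact output varies with your WordPress version and plugins). If yours shows Disallow: / instead, a setting or plugin is overriding it:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```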
Shopify
Shopify generates a default robots.txt file. You can override it by creating a robots.txt.liquid template:
- Go to Online Store > Themes > Edit Code
- Add a new template, choose robots.txt (this creates robots.txt.liquid in the templates folder)
- Customize your rules and remove unnecessary blocks
Keywords used:
- blocked by robots.txt Shopify
- Shopify blocked robots.txt
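Shopify’s documentation describes robots.txt.liquid as a template that renders the default rules, which you can filter. Here’s a sketch that keeps the defaults but drops one hypothetical Disallow rule; treat the exact Liquid objects as something to verify against Shopify’s current docs:

```liquid
{%- comment -%} Render the default robots.txt, skipping one unwanted rule {%- endcomment -%}
{% for group in robots.default.groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {%- unless rule.directive == 'Disallow' and rule.value == '/blogs/' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```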

Step 5: Fix “Indexed Though Blocked by Robots.txt”
If your page is indexed but blocked, Google may still show the page in results without a description.
To fix:
- If you want the page to rank: remove the block in robots.txt so Google can crawl and render it.
- If you don’t want it indexed: add a noindex tag in the page’s meta, and still remove the robots.txt block. Google must crawl the page to see the noindex tag, so a blocked page can stay indexed indefinitely.
Keywords used:
- How to fix “Indexed, though blocked by robots.txt”
- indexed, though blocked by robots.txt
- indexed, though blocked by robots.txt, blogger
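If you take the noindex route, the tag goes in the page’s <head>. A minimal example; remember it only works if robots.txt allows the page to be crawled:

```html
<!-- Removes the page from the index once Google recrawls it.
     robots.txt must NOT block this page, or Googlebot never sees the tag. -->
<meta name="robots" content="noindex">
```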
🚀 Want to fix crawlability issues faster? These tools and resources will help you stay productive, safe, and ahead in SEO.
1. Backup Before Editing Robots.txt
💡 Pro Tip: Always back up your website before editing the robots.txt file. A reliable option is the Samsung T7 Portable SSD: fast, compact, and secure. It ensures you can restore your site quickly if anything goes wrong.
2. Learn Technical SEO in Depth
📖 If you’re serious about mastering technical SEO and crawlability issues, the updated The Art of SEO (2024 Edition) is one of the most comprehensive guides available. A great resource for staying ahead in 2026.
3. Build a Productivity-Friendly SEO Setup
🖥️ Handling robots.txt fixes often means juggling Google Search Console, sitemaps, and crawl reports at the same time. An ASUS ProArt Display Monitor makes multitasking easier with sharp visuals and wide workspace.
4. For SEO Mentors & Trainers
🎙️ If you also create SEO tutorials or train teams, clear audio is key. The Blue Yeti USB Microphone delivers professional-quality sound, perfect for webinars, screen recordings, or client sessions.
✅ Technical SEO Checklist for “Blocked by Robots.txt”
Use this checklist to troubleshoot the issue effectively:
| Task | WordPress | Shopify |
| --- | --- | --- |
| Check robots.txt at yourdomain.com/robots.txt | ✅ | ✅ |
| Remove Disallow: / unless intentionally blocking | ✅ | ✅ |
| Ensure sitemap is not disallowed | ✅ | ✅ |
| Use Search Console’s URL Inspection Tool | ✅ | ✅ |
| Review SEO plugin settings (Yoast, RankMath) | ✅ | ❌ |
| Review theme settings | ✅ | ✅ |
| Create/edit robots.txt.liquid file | ❌ | ✅ |
| Remove “Discourage search engines” setting | ✅ | ❌ |
| Check Google Index for impacted pages | ✅ | ✅ |
| Add noindex if needed (alternative method) | ✅ | ✅ |
📌 Related Queries & Edge Cases
- Blogger blocked by robots.txt
- crawl allowed error: not blocked by robots.txt
- Ahrefs blocked by robots.txt
- internal blocked by robots.txt
- blocked by robots.txt: how to fix
- How to solve being blocked by robots.txt
- blocked by the robots.txt file
🔄 Final Thoughts
If your website is blocked by robots.txt, you’re likely missing out on valuable search engine traffic. This is a critical SEO issue—but it’s also one that’s easy to fix when you know where to look.
Whether you’re using WordPress, Shopify, Blogger, or any other CMS, take the time to audit your robots.txt setup. If needed, reach out to your developer or SEO expert to help resolve complex blocking rules.
📩 For advanced technical SEO help, contact infowonbolt@gmail.com or visit Wonbolt.com to fix your indexing issues today.
Don’t let a robots.txt file block your business growth. Fix it today—get found, get traffic, and grow online!

FAQ: Indexing Issues
Q1: What does “Indexed, though blocked by robots.txt” mean?
It means Google found the URL (via links or sitemaps) but couldn’t crawl it due to a disallow rule in robots.txt. It indexed the URL without seeing its content.
Q2: How to fix “Indexed, though blocked by robots.txt” in WordPress?
Use Yoast or Rank Math to manage the robots.txt file, unblocking necessary pages. Also, allow indexing in WordPress reading settings.
Q3: How to fix “Blocked by robots.txt” in Shopify?
Shopify doesn’t allow direct edits to its robots.txt, but you can override the defaults with a robots.txt.liquid template (see the Shopify steps above). Canonical tags and SEO apps can also help manage indexing; for advanced cases, consult Shopify developer tools.
Q4: Why is Googlebot blocked in robots.txt?
Your site’s robots.txt might unintentionally disallow Googlebot or a specific user agent. Check your file and remove unnecessary disallow directives.
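This is easy to miss because rules are grouped per user agent: the generic group can look harmless while a later group singles out Googlebot, which follows the most specific group that matches it. A hypothetical file that produces exactly this symptom:

```
# Everyone else may crawl everything
User-agent: *
Disallow:

# ...but this group overrides the one above for Google only
User-agent: Googlebot
Disallow: /
```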
Q5: How to fix a URL blocked by robots.txt?
Go to your robots.txt file and remove or modify the line that disallows that URL. Test using Google Search Console’s URL Inspection Tool.
Q6: My sitemap contains URLs blocked through robots.txt. Should I worry?
Yes. A sitemap should list indexable URLs. Remove blocked URLs from your sitemap to avoid confusion and indexing issues.
Q7: Is my site blocked by robots.txt? How can I check?
Visit yourdomain.com/robots.txt, or use Screaming Frog or the URL Inspection Tool in Google Search Console to check.
Q8: What to do if some important page is blocked by robots.txt?
Remove the disallow rule for that page in the robots.txt file and resubmit the URL in Google Search Console.
Q9: How to fix the “Submitted URL blocked by robots.txt” error?
Make sure the submitted page is allowed to be crawled in your robots.txt. Edit the file accordingly and validate the fix in GSC.
Q10: Can Google index a page even if it’s blocked by robots.txt?
Yes, especially if the page has backlinks. Google may index the URL without content, which affects SEO.
Q11: Can Screaming Frog help detect pages blocked by robots.txt?
Absolutely. Use Screaming Frog’s “Blocked by robots.txt” report to identify and fix crawl issues efficiently.
Q12: How to check if a URL is blocked by robots.txt?
Use Google’s URL Inspection Tool, or crawl the site with a tool like Screaming Frog, to confirm the block status. If the page should be crawlable, allow it in robots.txt, re-inspect the URL in GSC, and request reindexing if needed.
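For a quick local check outside of GSC, Python’s standard library can answer the same question (the domain and URL are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Placeholders: point these at your own site
parser = RobotFileParser("https://yoursite.com/robots.txt")
parser.read()  # downloads and parses the live file

url = "https://yoursite.com/blog/some-post/"
for agent in ("Googlebot", "*"):
    verdict = "allowed" if parser.can_fetch(agent, url) else "BLOCKED"
    print(f"{agent}: crawl {verdict}")
```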
Q13: How to fix “Indexed though blocked by robots.txt” in Google Search Console?
To fix the “Indexed though blocked by robots.txt” issue in Google Search Console, follow these steps:
✅ 1. Understand the Message
This warning means Google indexed your page (because it discovered it from external/internal links), but couldn’t crawl it due to restrictions in your robots.txt file. This can result in incomplete indexing or outdated content.
✅ 2. Identify the Blocked URL
In Search Console:
- Go to Pages → Why pages aren’t indexed
- Look for the status: “Indexed, though blocked by robots.txt”
- Click the URL to inspect details.
✅ 3. Check Your robots.txt File
Visit yourdomain.com/robots.txt and look for Disallow rules that might be blocking important pages, such as:
User-agent: *
Disallow: /
or:
Disallow: /example-page/
✅ 4. Fix the robots.txt Rules
Edit your robots.txt file (in your file manager or theme settings) and remove or modify the Disallow rule.
Example:
Before (blocked):
Disallow: /blog/
After (allowed):
Allow: /blog/
(Deleting the Disallow line entirely also works; with no matching rule, crawling is allowed by default.)
✅ 5. Use URL Inspection Tool
Go back to Google Search Console → use the “Inspect URL” tool → Click “Request Indexing” to re-crawl the page.
✅ 6. WordPress-Specific Fixes
- Go to Settings → Reading
- Uncheck the box: “Discourage search engines from indexing this site”
- Use an SEO plugin like Rank Math or Yoast SEO to manage robots.txt
✅ 7. Shopify-Specific Fixes
Shopify restricts editing robots.txt by default, but you can override it:
- Go to Online Store → Themes → Edit Code
- Create a robots.txt.liquid file in the templates folder
- Adjust Disallow rules carefully
✅ 8. Squarespace-Specific Fix
Squarespace doesn’t let you edit the robots.txt file.
To fix this:
- Remove password protection from pages
- Ensure noindex is not applied in SEO settings
✅ 9. Confirm with Tools
Check if the issue is resolved using:
- Google Search Console – URL Inspection
- Screaming Frog → Crawl → Check for blocked status
- the robots.txt report in Google Search Console (which replaced the robots.txt Tester)
✅ 10. Wait for Reindexing
After fixing and submitting the URL, Google may take a few days to reprocess it. Monitor indexing updates via Search Console.
Q14: What if my sitemap is blocked by robots.txt?
Update your robots.txt file to allow search engines to access your sitemap (e.g., remove any disallow rule covering /sitemap.xml).
Let Wonbolt.com handle your robots.txt issues and technical SEO headaches. From auditing to fixing sitemap and crawl errors, our SEO specialists are ready to optimize your site for maximum visibility.
📧 Reach out: infowonbolt@gmail.com