
Indexed, though blocked by robots.txt



I've been facing this issue for a while and there seems to be no solution. I submitted a sitemap of my website to Google Search Console, which was successful, but every time I request that Google index my website, it ends with the warning "Indexed, though blocked by robots.txt", meaning the site isn't indexed by Google. (My site is https://mastercarsreview.com/)

Has anyone faced this problem? How did you fix it?

Thank you so much in advance. I really need help with this. (I've sent several emails to both Squarespace and Google; nothing changed.)


This is completely normal, and you can ignore the message. Your site has been indexed by Google.

Squarespace uses a robots.txt file to ask Google not to crawl certain pages because they're for internal use only or display duplicate content. For example, you would not want Google to index the /config/ URL that you use to administer your website.
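For illustration, a robots.txt of this kind might look like the sketch below (a hypothetical example showing the general format, not Squarespace's actual file):

```
# Hypothetical robots.txt blocking internal-use pages from all crawlers
User-agent: *
Disallow: /config/
Disallow: /search

Sitemap: https://example.com/sitemap.xml
```

Each `Disallow` line tells compliant crawlers not to fetch URLs under that path; it does not, by itself, remove a URL from the index, which is exactly why GSC phrases the message as "Indexed, though blocked by robots.txt".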

For more detailed information see Understanding Google SEO emails and console errors.

World-class Squarespace Developer and Squarespace Circle Leader with a strong reputation on the Squarespace Forum.
I'm trusted by hundreds of Squarespace users worldwide, including designers, freelancers, business owners and big agencies.

I can tackle anything from small tweaks to full websites and I'm available for short and long bookings (min. 1 hour).
Book via my website giving as much notice as possible. Note that I'm currently booked until the end of May 2020.
Prebuilt Squarespace Extensions for Squarespace 7.0 and 7.1: Enquiry Form Extension, Date Picker Extension and Age Verification Extension
Custom Squarespace Extensions: Tell me about the functionality you need

**NEW** Our popular Wishlist extension is now Squarespace 7.1 compatible.


Just leave them; that's not going to hurt your site in any way. What the new GSC is reporting is simply what's happening with your site, along with some information to help you improve it. Those pages are blocked by the robots.txt file, and Googlebot respects robots.txt and does not crawl them, but they can still end up indexed when those URLs are linked to from other pages on your site. Google will also index them if they are linked from an external source. It's best to click "Fix Coverage issues".
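If you want to check for yourself which of your URLs a crawler is actually blocked from, Python's built-in `urllib.robotparser` applies robots.txt rules the same way a compliant bot would. A minimal sketch, using a hypothetical robots.txt (fetch your site's real file from `https://yoursite.com/robots.txt` to test the actual rules):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration
robots_txt = """\
User-agent: *
Disallow: /config/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A public page remains crawlable...
print(parser.can_fetch("Googlebot", "https://example.com/blog/my-post"))  # True
# ...while an internal page is blocked from crawling. A blocked page can
# still be indexed if other pages link to it, which is what GSC reports.
print(parser.can_fetch("Googlebot", "https://example.com/config/"))  # False
```

Note that `can_fetch` only answers the crawl question; whether a blocked URL also appears in the index depends on links to it, exactly as described above.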

