Ghady Posted January 1, 2020

Hello, I've been facing this issue for a while and there seems to be no solution. I submitted a sitemap of my website to Google Search Console successfully, but every time I ask Google to index my website, it ends with the warning "Indexed, though blocked by robots.txt", meaning the site isn't indexed by Google. (My site is https://mastercarsreview.com/) Has anyone faced this problem? How did you fix it? Thank you so much in advance; I really need help with this. (I've sent several emails to both Squarespace and Google; nothing changed.)
Solution paul2009 Posted January 1, 2020

This is completely normal, and you can ignore the message. Your site has been indexed by Google. Squarespace uses a robots.txt file to ask Google not to crawl certain pages because they're for internal use only or display duplicate content. For example, you would not want Google to index the /config/ URL that you use to administer your website. For more detailed information, see Understanding Google SEO emails and console errors.
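To illustrate the mechanics behind this answer: Python's standard `urllib.robotparser` can show how a `Disallow` rule like the one described above blocks crawling of a path such as /config/. The rules and URLs below are illustrative only; the site's real rules live at its /robots.txt and may differ.

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules resembling what a hosted platform might serve;
# the actual file is whatever the site returns at /robots.txt.
rules = """\
User-agent: *
Disallow: /config/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The admin path is disallowed, so a compliant crawler won't fetch it...
print(rp.can_fetch("Googlebot", "https://example.com/config/"))    # False
# ...while ordinary content pages remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
```

Note that robots.txt only controls *crawling*; as discussed further down the thread, a blocked URL can still end up in the index if other pages link to it.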
JohnVoung Posted January 16, 2020

Just leave them, because this is not going to hurt your site in any way. What the new GSC is reporting is simply what's happening on your site, along with information to help you improve it. Those pages are blocked by the robots.txt file, and Googlebot respects robots.txt and does not crawl them, but they can still be indexed if those URLs are linked to from other pages on your site. Google will also index them if they are linked to from an external source. It's best to click "Fix Coverage issues".
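A short aside on this point: the standard way to keep a page out of Google's index entirely is a `noindex` robots meta tag in the page's HTML head, and it only works if the page is *not* blocked by robots.txt, because Googlebot must be able to crawl the page to see the tag. This is background context, not something Squarespace necessarily exposes per page.

```html
<!-- Placed in the <head> of a page that should never appear in search results.
     The page must be crawlable (not disallowed in robots.txt) for this to take effect. -->
<meta name="robots" content="noindex">
```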
ArneEvertsson Posted December 20, 2020

Why doesn't Squarespace make sure those pages are not indexed? That would remove the error message.