Product Page Crawl/Index blocked by Robots.txt

Hi – I suspect these product URLs might be getting blocked because they are in the .../search-all/... directory, and .../search is blocked by default in the Squarespace robots.txt file. Obviously the intention there is to block indexing of URLs generated by the built-in website search functionality, but I suspect your products are getting caught up in that too.
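That prefix behavior is easy to check locally. Here's a minimal sketch using Python's standard-library robots.txt parser; the rule and URLs are illustrative stand-ins, not Squarespace's actual robots.txt:

```python
# Demonstrate that a "Disallow: /search" rule is a prefix match,
# so it also covers paths like /search-all/... (illustrative example).
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /search",
])

# Blocked: path begins with "/search"
print(rp.can_fetch("*", "https://example.com/search?q=hats"))       # False
print(rp.can_fetch("*", "https://example.com/search-all/product"))  # False

# Allowed: path does not begin with "/search"
print(rp.can_fetch("*", "https://example.com/shop/product"))        # True
```

Because robots.txt rules match on path prefixes, no wildcard is needed for "/search" to catch "/search-all/..." URLs as well.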

I'm not aware of a way to modify the robots.txt file contents on Squarespace, so if that was the case and it's important for your business to have these product URLs indexed by Google, you might need to adjust your URL structure. If you need support there, let me know.

Hope that helps!

- Ed

--

Blue Hills Digital
Support with website optimization, SEO and finding a digital marketing strategy that works.

Latest resource: Squarespace SEO Services


Thanks for the reply. I believe the "/search" entry in the robots.txt file refers to the actual search function that Squarespace provides, since it doesn't include any indicator that leading or trailing characters can match. Also, it's only happening with some of the products, not all; some products with "/search-all" in the URL are indexed just fine...
