
Password-protected page showing as a broken internal link


G-man


Site URL: https://www.5theis.com/distributor-login

I've got a password-protected page that SEMrush is flagging as 80 broken internal links because it appears in the main menu. The page itself loads fine in a browser, but it returns a 401 Unauthorized status, so the crawler counts every menu link to it as broken.

So I'd simply like to add a line to robots.txt telling crawlers to ignore that page. However, I saw a comment in an online forum saying Squarespace doesn't let us edit robots.txt. Any suggestions for fixing this issue?
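For reference, if robots.txt were editable, the rule would look something like this (assuming /distributor-login, from the site URL above, is the page's slug):

```
User-agent: *
Disallow: /distributor-login
```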


Yeah, you do have a few options here. This page outlines them pretty well: https://support.squarespace.com/hc/en-us/articles/360022347072-Hiding-pages-from-search-engine-results

The easiest option is to flip the switch on the page's SEO settings tab to tell search engines not to index it. That option sits below the fold on the pop-up panel, which is kinda annoying and may be why you didn't see it.

The other option is to inject a meta tag into the page header, like this:

<meta name="robots" content="noindex">

Hopefully the SEMrush crawler will obey that directive even though robots.txt doesn't block the page. One caveat: the code-injection approach can confuse Google Search Console, because the page still appears in the auto-generated sitemap.xml while the meta tag tells Google not to index it, so GSC may keep sending you warnings.
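If you want to sanity-check that the tag actually made it into the served HTML, the check a crawler performs is simple to sketch. This is a minimal, simplified illustration (real crawlers use a full HTML parser, and this regex assumes the common name-then-content attribute order):

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the page's robots meta tag includes a noindex directive."""
    # Look for <meta name="robots" content="..."> anywhere in the markup.
    # Simplification: assumes the name attribute comes before content.
    match = re.search(
        r'<meta\s+name=["\']robots["\']\s+content=["\']([^"\']*)["\']',
        html,
        re.IGNORECASE,
    )
    return bool(match) and "noindex" in match.group(1).lower()

# Hypothetical page source for illustration:
page = '<html><head><meta name="robots" content="noindex"></head><body></body></html>'
print(has_noindex(page))  # → True
```

In practice you'd run this against the page's view-source output to confirm the injected tag is really being served.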


Many thanks for replying. The very first thing I did was set the page's SEO module to exclude it from search results. I didn't try the meta tag at first, thinking it would be redundant given that setting. I've just added the meta tag, and SEMrush still reports "80 internal links are broken", all from that one page being in the main menu.

It seems the only way to keep SEMrush from crawling it is to add it to robots.txt. Guess I'm SOL.

Since we can't edit robots.txt, do you think SS Support would consider adding it?
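One likely reason the meta tag didn't help: broken-link checkers classify links by the HTTP status code of the response, not by meta tags in the body, so a 401 still counts as broken no matter what the page says. A minimal sketch of that classification logic (an illustration, not SEMrush's actual code):

```python
def link_status(status_code: int) -> str:
    """Classify a link target the way a broken-link checker typically does."""
    # The decision is made on the HTTP status alone: a noindex meta tag in the
    # body cannot rescue a 4xx response, because 401 Unauthorized is still an
    # error-class status even when the page renders fine for logged-in users.
    if 200 <= status_code < 300:
        return "ok"
    if 300 <= status_code < 400:
        return "redirect"
    return "broken"

print(link_status(401))  # → broken
```

That's why a password-protected page keeps showing up in broken-link reports: the crawler gets a 401 and never even reads the meta tag.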


When you go to [domain]/sitemap.xml, does it show the page you are trying to hide/noindex? If it does, then turning it off is (obviously) not working, and Squarespace Support should be able to resolve it. My experience with some third-party tools is that even with SEO turned off (for example, disabling categories and tags for a blog), they still report duplication and canonical errors, and I just ignore those. Some pages are also flagged with AMP errors by those third-party crawlers, which I ignore as well.
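Checking the sitemap can also be scripted. A small sketch using Python's standard-library XML parser, with a hypothetical sitemap snippet standing in for the real [domain]/sitemap.xml output:

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap content for illustration; in practice you'd fetch
# the real [domain]/sitemap.xml and paste or load its contents here.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/distributor-login</loc></url>
</urlset>"""

# Sitemaps use a namespace, so findall needs a namespace map to match <loc>.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in ET.fromstring(sitemap_xml).findall(".//sm:loc", ns)]

# True here means the page is still listed and the noindex toggle isn't
# being reflected in the generated sitemap.
print(any(u.endswith("/distributor-login") for u in urls))  # → True
```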

https://www.google.com/webmasters/tools/robots-testing-tool?hl=en will show you how Google reads your robots.txt, and I would go by that rather than by what SEMrush says.

