
Google Search Console - Valid/Excluded Page Coverage Question



Site URL: https://thedailyassist.com

Hello everyone!

I recently launched my website and submitted a sitemap for GSC to crawl and index my pages. It's been a few days, and the Coverage tab says that only 4 pages are marked as "valid" and 13 as "excluded". Is there any reason why this would happen?

The majority of pages that are marked as "excluded" say "Discovered - currently not indexed" and "Crawled - currently not indexed". 

I am slightly confused because when I do a site search in Google (site:thedailyassist.com), some of the pages listed as "excluded" are returned as results in my site search, which tells me the pages are indexed. Does the coverage tab take time to update? Or re-crawl? 
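One way to sanity-check the numbers above is to compare the URLs the sitemap actually declares against what the Coverage report counts. A minimal sketch of reading a sitemap's `<loc>` entries; the XML here is a hypothetical stand-in for the real sitemap file, and the `/resources` path is an assumption, not a page confirmed in this thread:

```python
# Sketch: list the URLs a sitemap declares, so they can be compared
# against the Coverage report. Parses an inline copy of the XML rather
# than fetching it, so it runs offline.
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://thedailyassist.com/</loc></url>
  <url><loc>https://thedailyassist.com/resources</loc></url>
</urlset>"""

def sitemap_urls(xml_text: str) -> list[str]:
    """Return every <loc> URL in a sitemap, honoring the sitemap namespace."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(sitemap_urls(SITEMAP_XML))
```

If the sitemap lists 17 URLs and GSC reports 4 valid + 13 excluded, the totals at least agree and the question is purely about indexing status, not missing sitemap entries.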

Thanks for any help you can give me!

[Attached screenshot: Screen Shot 2021-09-17 at 2.11.42 PM.png]


Hi @makeithumme - I'm guessing the valid pages are the homepage, resources, features and subscribe right? And the three blog posts are not indexed yet?

If so, first thing to check is the last crawl date (top right of the screenshot you shared, but outside the frame).

Also, for a new site it's not uncommon for new posts to be slow to get indexed. There are a few tricks you can use to try to speed this up.

  • Use the Search Console "Request Indexing" feature to ask for a specific URL to be indexed. There are no guarantees on timeframe, but it doesn't hurt to do this whenever you publish new content.
  • Start generating some inbound traffic to the non-indexed URLs from other sources: share the link on social sites, or in a forum post or two (Reddit, etc.). These don't count as official backlinks, but sometimes the activity is enough to nudge Google's crawlers to pay attention to the new URLs.
  • If you have a Google property (like a Google My Business profile), create a post there that links to your non-indexed URL. I've had success doing this a few times to get a new post indexed when the crawlers seemed to have missed it.
  • Generate some actual backlinks from other websites: if you can get others in your niche to link to you, that will help too.

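As a side note on the "Request Indexing" step above: the button itself has no public API, but the Search Console URL Inspection API can report a URL's indexing state programmatically, which answers the "is it really indexed?" question more reliably than a site: search. A minimal sketch; the blog-post URL is hypothetical, and the actual API call at the bottom is illustrative only, since it needs OAuth credentials and a verified property:

```python
# Sketch: build the request body for the Search Console URL Inspection
# API, which reports a URL's indexing state (it does NOT force indexing).
# The live API call is commented out because it requires credentials.

def make_inspection_body(url: str, site_url: str) -> dict:
    """Request body for the urlInspection.index.inspect method."""
    return {"inspectionUrl": url, "siteUrl": site_url}

body = make_inspection_body(
    "https://thedailyassist.com/some-blog-post",  # hypothetical URL
    "https://thedailyassist.com/",
)
print(body)

# With real credentials (illustrative only):
# from googleapiclient.discovery import build
# service = build("searchconsole", "v1", credentials=creds)
# result = service.urlInspection().index().inspect(body=body).execute()
# print(result["inspectionResult"]["indexStatusResult"]["verdict"])
```

Running one inspection per excluded URL would tell you which ones Google considers indexed regardless of what the (sometimes stale) Coverage report shows.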
Let us know if you see a change!

- Ed

--

Blue Hills Digital
Support with website optimization and finding a digital marketing strategy that works.

Latest resource: Website Audit Checklist for 2021 [Template and 15-Step Guide]

2 hours ago, edharris said:
[quoted reply above]

Thanks for the reply! It looks like the last crawl was on 9/13, which is definitely when I first made the site public and originally submitted the sitemap. How often does the coverage report get updated? And was I right in assuming my pages are indexed and findable on Google, even though some of them fall under the "Excluded" tab?

