CaroleDC

Member
  • Posts: 8
  • Joined
  • Last visited

Everything posted by CaroleDC

  1. Only 3 of my blog posts are "duplicate content", so why aren't the others being indexed?
  2. That I could understand, but there are plenty of blog posts that *aren't* duplicate content and still aren't being indexed.
  3. Yes, my sitemap contains each and every blog post. According to Google Search Console, Google is aware of them all since I uploaded the sitemap but it still isn't indexing them. I don't know why and I don't know what else to do to get them indexed.
  4. I don't know how to use Search Console to get currently unindexed pages indexed. I added my sitemap in November, but it hasn't helped get any of my blog posts indexed.
  5. I've been wanting to improve my website traffic, so I've been looking at SEO. I've got descriptions etc. set for each page and for my blog posts. Yet when I Google my website domain, most of my blog posts are missing! No wonder my website doesn't get much traffic on my blog. We're talking about over a year's worth of content, so there has certainly been enough time for search engines to pick them up. My SEO settings on my blog make all posts/pages in my blog available to search engines. Does anyone have any suggestions about what's going on and what I can do to correct this? Edit: Bing has more results for my blog than Google, but still not all.
  6. I was also hoping to drip release content to members and can't seem to find any settings to do this. Have I missed something?
  7. Hi everyone, I'm currently setting up a course with multiple modules on my Squarespace website. I would like to find a way for members to be able to track their progress so they can easily pick back up where they left off - is there a way to do this? Thanks
  8. Site URL: https://www.caroledianecoaching.co.uk

     Hi everyone. I'm trying to speed up my website, as it's been pointed out to me that it can take up to 15 seconds to become interactive when running website grader tools or site speed checks. Pingdom has highlighted issues with DNS lookups, HTTP requests, Expires headers and file compression (as you can see in the screenshot I've attached). Could you help me understand what I can do to address these?

     I don't know how to reduce the number of HTTP requests - any advice, anyone? I haven't got much custom code at all on my site, after having removed most of it as it was unused, so I don't know where to look for this. I have a similar issue with the DNS lookups. How can I go about reducing these?

     I've looked into it, and I can't add Expires or Cache-Control headers, because with Squarespace I don't have access to the file these would go into. (I did try to put Cache-Control into the site coding, but obviously it made no difference.)

     For file compression, I have reduced the sizes of my images and could reduce them further if it would help. My YouTube video has a custom thumbnail, which is a small JPG file, so my understanding is that that shouldn't be a problem. I have looked into gzip, but I am not particularly well versed in things like this, and it seems that I would need access to a file that I just don't have, because of how Squarespace works...

     I found Squarespace sooo helpful for getting everything set up, but now that I'm running into problems trying to optimise it, I'm beginning to wonder if I made the right trade-off lol.
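For context on the file-compression warning in the post above: gzip typically shrinks text responses (HTML, CSS, JS) by a large factor, which is why speed tools flag its absence. A minimal Python sketch illustrating the savings (the sample HTML below is made up for illustration, not taken from the site):

```python
import gzip

# Hypothetical repetitive HTML, standing in for a typical blog page.
html = (
    "<div class='post'><h2>Blog title</h2>"
    "<p>Some paragraph text for the post body.</p></div>" * 50
).encode("utf-8")

# gzip-compress the page body, as a server would before sending it.
compressed = gzip.compress(html)

print(f"original:   {len(html)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"ratio:      {len(compressed) / len(html):.2%}")
```

On Squarespace you cannot configure this yourself; the point of the sketch is only to show why the audit cares: repetitive markup compresses extremely well, so serving it uncompressed wastes most of the transfer.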