modsquare
Circle Member · 39 posts
Everything posted by modsquare

  1. @GlynMusica Yeah, there is a lot of additional data that would be needed to make a definitive evaluation as to the "why." All of this data was gathered, updated, and refreshed just a few hours before my post on January 5th. All of these websites are ones I'm intimately acquainted with, having worked on them for a minimum of a year and, in one case, as long as five years. Some of these examples blog once a week, others every two months. But I felt they were similar enough in quality and the amount of SEO time I put into them that the major difference was the version of Squarespace. Without a doubt, Google is becoming more particular about what it indexes, and I'm glad about that. Searchers don't need 18 million results for a search; they just need the best ones that answer their query. Again, this was more or less a call for anecdotal evidence from the community, to ask if others were seeing anything similar. What I was seeing was so drastic that I was looking for some basic correlation before continuing my investigation.
  2. One theory I have is that Fluid Engine might be causing some issues with Google's rendering. I've seen other CMS platforms like ShowIt and Pixieset (I do a fair amount of photography website SEO) suffer SEO-wise due to what I suspect is their layout engine and the way JavaScript serves their pages. I know Google is supposed to handle JavaScript fairly well, but there is an unmistakable difference compared to Squarespace 7.0 and WordPress websites. That theory doesn't fully explain the normal 7.1 pages doing well, though...unless the majority of those were created in the traditional editor before Fluid Engine was available.
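For anyone who wants to test the rendering theory on their own site, here's a rough sketch (Python standard library only; the helper names are mine): save the raw HTML a crawler receives before any JavaScript runs, then check whether a phrase from the rendered page actually appears in it. If it doesn't, that content is being drawn in client-side.

```python
# Sketch: does a phrase from the rendered page appear in the raw,
# pre-JavaScript HTML? If not, the content likely depends on
# client-side rendering. All names here are hypothetical.
from html.parser import HTMLParser


class _TextExtractor(HTMLParser):
    """Collect the visible text nodes from an HTML document."""

    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)


def phrase_in_raw_html(raw_html: str, phrase: str) -> bool:
    """True if `phrase` is present in the server-delivered HTML text."""
    parser = _TextExtractor()
    parser.feed(raw_html)
    text = " ".join(parser.chunks)
    return phrase.lower() in text.lower()
```

Fetch the page with `curl` (no browser, so no JavaScript execution) and run a key heading or sentence through this; it's crude, but it quickly flags pages whose content only exists after script execution.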
  3. I would love to get feedback from the community on an issue I am seeing. Yesterday, out of frustration at getting some blog pages indexed on one of my 7.1 client websites, I dug a bit deeper. I noticed a significantly lower indexation rate and fewer search results for blog pages built in 7.1 versus 7.0. The normal website pages appear to be fine, but the blog page performance is much different.

I decided to make a comparison to see if I could spot an overall trend. I manage about 15 different SQSP client sites, but only 8 of those put enough effort into blogging to make a decent comparison. Five of those eight sites are on 7.0, and three are on 7.1. I compared the total number of blog posts in SQSP versus the number of those posts indexed by Google and the number that actually get some traffic in a Google SERP (via Google Search Console). The three 7.1 websites have 69 blog posts among them; only 36% of those are indexed, and only 17% show up in the SERPs, according to GSC. The five 7.0 websites have 402 blog posts among them; 98.5% of them are indexed, and 96.5% show up in the SERPs, according to GSC.

I realize that page quality is a big factor in whether a page is indexed and receives impressions. However, the vast majority of the 7.1 posts are of equal quality to the 7.0 posts, and I've optimized them the same way. The other interesting thing is that the normal pages are fine; it's only the blogs doing this. I see the normal 7.1 pages fully indexed and receiving normal impression rates. Some 7.1 pages get impressions for hundreds of queries, and then the blog posts get impressions for only 1 or 2.

I consider myself a fairly experienced SEO guy - I've been doing this for about 15 years, at least eight of them on Squarespace - so I've seen enough to feel I am reading the data correctly here. Does anyone else out there who does SEO regularly for Squarespace clients on both platforms have any similar data to share?
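If you want to run the same comparison on your own portfolio, the arithmetic is simple; here's a sketch (the function name and the sample numbers in the example are mine, not from any real client):

```python
def indexation_rates(sites):
    """sites: list of (total_posts, indexed, in_serps) tuples, one per site.
    Returns aggregate (% of posts indexed, % of posts appearing in SERPs)."""
    total = sum(t for t, _, _ in sites)
    indexed = sum(i for _, i, _ in sites)
    in_serps = sum(s for _, _, s in sites)
    return (round(100 * indexed / total, 1), round(100 * in_serps / total, 1))


# Hypothetical example: two sites, 150 posts between them.
print(indexation_rates([(100, 98, 96), (50, 49, 48)]))  # -> (98.0, 96.0)
```

The per-site counts come from GSC's page-indexing and performance exports; the point is just to aggregate them the same way across both platform versions so the comparison is apples to apples.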
  4. Hello Mak! The code that is injected is fine and isn't overriding your page titles/meta descriptions. That code is called structured markup (or schema); it helps search engines understand specific things, in this case information about your business. I did a quick scan of your site, and it is actually set up and displaying the page titles and meta descriptions the way you want them. What you are seeing is the new way Google works: it now uses artificial intelligence to rewrite page titles and meta descriptions based on what it thinks the searcher is looking for, rather than what you want them to see. And yes, this is super frustrating when you have a specific way you want it to look and read. So it's nothing you're doing wrong; it's just how Google now does things, and at the moment there don't appear to be many ways around it. Edited for punctuation.
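For context, the kind of injected structured markup described above is usually a JSON-LD block. Here's a minimal sketch, built in Python for illustration; the business details are placeholders, not Mak's actual data:

```python
# Sketch: build a LocalBusiness JSON-LD <script> tag, the kind of
# structured markup Squarespace injects for business information.
# All values passed in are placeholders.
import json


def local_business_jsonld(name: str, url: str, telephone: str) -> str:
    """Return a <script> tag carrying LocalBusiness JSON-LD markup."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "url": url,
        "telephone": telephone,
    }
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"
```

A block like this sits in the page head and never overrides the `<title>` tag or meta description; it's a separate channel of information for search engines.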
  5. Yeah, you do have a few options here. This page outlines them pretty well: https://support.squarespace.com/hc/en-us/articles/360022347072-Hiding-pages-from-search-engine-results The easiest option is to flip the switch on the SEO settings tab of the page to tell search engines not to index it. This option is below the fold on the pop-up screen, which is kind of annoying and may be why you didn't see it. The other option is to inject some code into the page, like this: <meta name="robots" content="noindex"> Hopefully the SEMrush crawler will obey this directive despite what the robots.txt file says. Note that the code injection option often confuses Google Search Console, because the page still appears in the automatically generated sitemap.xml file while you're also telling Google not to index it, so it will keep sending you warnings.
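If you go the code-injection route, it's worth verifying that the directive actually made it into the delivered page. A quick sketch using only Python's standard library (the helper names are mine):

```python
# Sketch: check a page's HTML for a <meta name="robots"> tag whose
# content includes "noindex". Hypothetical helper, not a real API.
from html.parser import HTMLParser


class RobotsMetaFinder(HTMLParser):
    """Flag any <meta name="robots" content="...noindex..."> tag."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True


def page_has_noindex(html: str) -> bool:
    finder = RobotsMetaFinder()
    finder.feed(html)
    return finder.noindex
```

Fetch the live page source and run it through this; if it returns False, the injection didn't land where crawlers can see it.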
  6. Hello! I'm seeing that your XML file is working now. Are you seeing the same thing on your end? I have nothing but anecdotal evidence to support this, but I believe that Squarespace's XML sitemap files are not generated dynamically; at best they are updated daily. I suspect this is why Squarespace was telling you that you just needed to wait. But it's good that you checked, as I discovered an issue a few months ago where the XML was not regenerating properly; Squarespace opened a ticket and fixed a few things.
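One way to sanity-check the daily-refresh theory is to look at the newest `<lastmod>` date in the sitemap itself; if it's stale, the file hasn't regenerated. A rough sketch with the standard library (the function name is mine):

```python
# Sketch: find the most recent <lastmod> value in a sitemap.xml,
# as a quick staleness check. Hypothetical helper, stdlib only.
import xml.etree.ElementTree as ET

# The standard sitemap namespace, per the sitemaps.org protocol.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def newest_lastmod(sitemap_xml: str) -> str:
    """Return the most recent <lastmod> date string in the sitemap."""
    root = ET.fromstring(sitemap_xml)
    dates = [el.text for el in root.iter(NS + "lastmod") if el.text]
    return max(dates) if dates else ""
```

ISO-format dates compare correctly as strings, so `max()` gives the latest one. Compare it against when you published or edited the page in question.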
  7. I have a client who has been having a sitelinks issue, so your post caught my attention. You are correct that there are almost no controls anymore over what Google displays in sitelinks. We used to at least be able to ask for a link to be removed if we didn't want it there, but even that option was dropped a few years ago. So Google tries its best to determine what the most important links on your site are, and in your example it recognized that the main navigation links, which are front and center, were the most important. I don't know that I would add a hidden link and try to recenter everything. In general, Google really frowns on hidden text that is only seen by search engines, and these days it basically crawls sites as a user would see them, so it might not even help. I can offer 2 suggestions. (1) Put a second shop link in the center navigation. I know it might drive you crazy and ruin some aesthetics, but if it is that important and it does the job, then it might be worth it. (2) Add another shop link to the footer of each page in an effort to show Google, through internal links, that the shop page is important to you. Sorry I don't have a better answer!
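To confirm suggestion (2) is actually reflected in the markup, here's a rough way to count how many links on a page point at the shop URL (standard library only; the names are mine):

```python
# Sketch: count <a> tags on a page whose href matches a target path,
# e.g. to verify a footer shop link is really in the delivered HTML.
from html.parser import HTMLParser


class LinkCounter(HTMLParser):
    """Count anchors whose href equals the target path (trailing slash ignored)."""

    def __init__(self, target: str):
        super().__init__()
        self.target = target
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "") or ""
            if href.rstrip("/") == self.target:
                self.count += 1


def links_to(html: str, target_path: str) -> int:
    counter = LinkCounter(target_path.rstrip("/"))
    counter.feed(html)
    return counter.count
```

Run each template's HTML through this before and after adding the footer link; the count going up confirms the internal-link signal is in place for crawlers.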