Posts posted by GlynMusica

  1. Google no longer uses the TITLE tag to generate the headline shown in its listings the way it did before. This has been going on for, from memory, about 18 months now. If you read Google's Webmasters blog you will find a post detailing the change. Do not expect to be able to define what comes out in the SERPs. The change is most likely driven by a desire to give the advertising blocks on the page, where keywords are inserted, more visibility and stand-out appearance.

    Meta description: very difficult. Generally and historically Google would sniff a page for the keyword in the user's search query and then show a text snippet around that text. This still happens, but to a lesser extent now. Microsoft used to be very good at actually using the meta description.

    Meta description length does not have to stay within the realm of 150-160 characters because, as per the previous paragraph, there is no longer a direct correlation between the listing and the meta description. The 150-160 character rule was set because that was the length of the space available in a Google listing underneath the SERP title. It really had very little to do with SEO as a ranking mechanism, but it was easy to check, and so most tools recommended it for that purpose.

    Therefore you can quite happily ignore this constraint, knowing that at least you have more words to insert into a meta description - giving you keyword presence and some kind of control over it - instead of Google sniffing your page and randomly pulling out text it thinks will support the search query.

    Good luck.

  2. On 1/24/2024 at 12:20 PM, TayloredData said:

    Hi @GlynMusica that's not the case. You can't load GTM on the secure checkout page (the page where the user pays) since it's a third party payment system, but you can load GTM on the confirmation page where the purchase values are.

    Here's a screenshot of my purchase confirmation page with the GTM debugger open. You can see that I'm able to push the purchase data to the data layer:

    [screenshot: purchase confirmation page with the GTM debugger open, showing purchase data pushed to the data layer]

    That's cool, thanks for that update.

    I found a really cool tool the other day for helping with debugging GTM, as version 1 and version 2 render slightly differently.

    Take a look at a Chrome extension called datalayerchecker. It can help surface the specific names of the variables that you need to sniff on pages like this.

    Assume that the trigger for that is a "contains" condition on /order, plus the data layer variables.
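As a sketch of what such an inspector typically surfaces: a purchase push into the data layer on the order confirmation page. The event name and field names below are assumptions - verify the real ones on your own site with the extension.

```javascript
// Hypothetical example of a purchase push on an order confirmation
// page. The event name and ecommerce field names are placeholders -
// check the actual names your site uses with a data layer inspector.
var dataLayer = dataLayer || [];
dataLayer.push({
  event: 'purchase',
  ecommerce: {
    transaction_id: 'ORDER-1234', // placeholder order reference
    value: 49.99,
    currency: 'USD'
  }
});
```

A GTM Custom Event trigger on that event name, combined with a page-path "contains /order" condition, is then enough to fire the GA4 tag.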

    G.

  3. Hi there:

    site:https://www.kind-living.com/journal/

    Your journal is being indexed.

    If you want deeper pages on your website to be found faster:

    a) ensure that you link between blog posts
    b) add a content block to your homepage that links to your latest blog post.

    Discovered but 'not indexed' statuses tend to mean you should improve your internal linking, or that there is another page on your website that is more relevant than the page you are trying to get indexed - for example because it repeats or closely resembles its content. That might not be the specific problem in your case, but that is what it generally means and why a) and b) above generally fix it.

    Go back 25 years and Google had an add URL page on their search engine where you would submit websites to it. Now you need to have a sitemap and some patience.

    I would not suggest you focus on backlinks; instead, invest your time in writing content for your blog. Google is killing off organic SEO - watch their video on Generative Search and ask yourself where your organic visibility sits in this new search context - so you will do better working on creating content for people rather than for search engines.

    I also think you should periodically - once a month - check your site health, but don't spend time chasing every item to see whether it has updated or not. You can easily find this information by looking at your Squarespace statistics, which are a far better measure of your content quality and of how on point it is.

    Have a nice day.

    G

  4. This has been going on for a few years now.

    Yes, Squarespace needs to make changes to the way they run code on their platform. Will they do it? Who knows. It seems that their core audience may be at the level of being happy simply to have a website rather than looking at the technical details of CWV.

    What does annoy me is that I will be paying more in PPC costs simply because some parts of the page experience that can easily be modified on competing platforms are off-limits for optimisation.

    I expect that at some point in the future Squarespace will push the button on a new version that makes good on all the optimisations needed for speed, but I also think that we will have to rebuild our websites from the ground up on that new template to make it work.

    C'est la vie - this is one of the downsides of packaged solutions such as SS. And do bear in mind that on my other website the slowest resources to load are now Google's own!

  5. This is a nightmare, so I'll explain what happens, as I've had this happen to a client in the past. Google Ads looks for code that might be considered malicious on the pages that are running Google Ads. Google Ads represents an immediate way to gain exposure to Google users, and as long as there is budget, there is the capacity to scale and hit lots of Google users - so there is good reason for attackers to target it.

    The experience we had was related to the third-party code of a provider that, for whatever reason, Google had decided was malicious. What did we do? Pretty much everything that has been documented here: we found no code hacks in the PHP files, and we found absolutely nothing compromised in the source code, jQuery, or other external libraries (for this you should use the Lighthouse Chrome extension to see whether any libraries are considered a security risk).

    If you have done the above and still find nothing, you should remove any third-party code from your website. For example, if you deliver your code using Google Tag Manager (which is great for deployment), just remove the GTM container.

    In our situation the problem had nothing to do with anything on our side; it was a third-party provider that had tried to do something overly creative with their code that Google didn't like. Very frustrating on our side, as we went through the pain of everything described here, and they then went on to tell us that they had been aware of it and had been trying to fix it (before it affected their clients!).

    My advice is therefore to remove any code you have added to your pages via the code injection section and then resubmit your Google Ad. I might even reach out to Squarespace support beforehand to verify whether the CDN that Squarespace uses for the affected website has been cleared, just to be absolutely certain that there are no straggling pages left online that might cause future pain before you send a request for review to Google.

    I'd then write to whoever provides the third-party code and ask them to confirm that they have not been having similar problems with their clients, before you send their code to your developer for auditing.

    I would also make sure I have 2FA set up on every account that has any ability to add code, either via GTM or via the website. For Google accounts, follow all the security settings. I would also recommend never using a VPN, no matter how secure it is declared to be, when connecting to these accounts - and that goes for any public WIFI network too, where people can just sit and run Wireshark and potentially steal your cookie. I've seen that happen too.

    Good luck.

    G.

  6. First off, I would let Google crawl the website and let it decide, according to its ranking system, which pages are relevant for the keywords.

    By connecting your website to search console you will be able to glean extra information about the keywords and the page that is being associated with those keywords. That is Google's way of telling you which page is the most relevant for the keyword.

    Consider that a category page is a container holding many products, while a product page has more specific information. You might decide there are ways to improve the SEO content of the product pages to capture more searches. My advice here would be to go to a website like Amazon and look at the filters they make available for similar products; you can also use their channels as a source of inputs for your own optimisation.

    Generally Google wants to index everything and then make up its own mind on relevance. I've seen people cook their websites by being too restrictive with their site structure, using page codes and server redirects. There may be a case for them, but first see what Google thinks.

    In answer to your question: yes, having two canonicals shouldn't be done. In an e-commerce setting you might use canonicals legitimately if you had products appearing under two different URL structures due to them being assigned to two different categories. In this way you would avoid duplication issues, although I would first wait and see which of the two pages Google deems the most relevant for the search query, or do extensive keyword research first to isolate which of the two categories you would give priority to.

    Hope this helps. Have fun!

    G.

  7. Go to web.dev (or GTmetrix) and run a performance test on your pages, then follow the tips for how to make them better. These are performance-related issues that Google raises against your website and which you can intervene to make better.

    There will only be so much you can do because SquareSpace has some inherent optimisation issues that it has not fixed.

    When you have made changes to your website, go into the errors and request Google to Validate them. Then wait 30 days and see if they are fixed.

    Google will use Page Experience metrics to affect things like PPC cost and coverage in their search indexes. So these are very important to fix, and frankly the fact that Squarespace hasn't is a pretty bad oversight on their part.

    In March Google will replace FID with INP among their Core Web Vitals metrics. That will bring a whole new host of optimisations necessary for websites.


  8. Instead of SEMRUSH, install the Google Lighthouse extension in Google Chrome and use that. If you use Chrome a lot and have extensions, make sure you enable the Lighthouse extension to run in Incognito; you can then run mobile and desktop tests. You will also get much better information about how to fix each problem, and know that the fix targets one of the biggest traffic sources for websites.

    You can also add some preconnect hints in code injection, based on an analysis of the network files that your specific SS site pulls in at runtime.
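As a sketch of what that could look like: a small helper that turns the third-party origins you find in the network panel into preconnect tags for the header code injection. The origin list below is an assumption - build yours from your own site's runtime requests.

```javascript
// Hypothetical helper: turn a list of third-party origins into
// <link rel="preconnect"> tags to paste into the header code
// injection. The origins below are example placeholders.
function preconnectTags(origins) {
  return origins.map(function (origin) {
    return '<link rel="preconnect" href="' + origin + '" crossorigin>';
  });
}

var tags = preconnectTags([
  'https://fonts.gstatic.com',
  'https://www.googletagmanager.com'
]);
```

Keep the list short - preconnect only helps for origins the page actually requests early.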

    Good luck.

  9. I imagine you want to use GTM for event firing and nothing more, as it is way easier to do this inside GTM than via the GA4 control panel; it also gives you the ability to reuse triggers for other analytics such as Facebook, and simply more control.

    We have done this in the past with UA and events where the UA configuration was easier for a web-agency to implement on the website and we wanted to track specific conversion events.

    If you have a native configuration on Squarespace that is already sending the page view, then don't fire it in GTM as well - disable the page view event there and just trigger the events. This will limit your problems. Also check in GA4 that you don't have double events firing.
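For the event side, a minimal sketch of pushing a standalone conversion event into the data layer so a GTM Custom Event trigger can pick it up, without firing a second page view. The event and field names are made up for illustration.

```javascript
// Hypothetical: push a conversion event for GTM to pick up with a
// Custom Event trigger. No page_view is fired here - the native
// Squarespace configuration already handles that.
var dataLayer = dataLayer || [];
dataLayer.push({
  event: 'newsletter_signup', // placeholder event name
  form_location: 'footer'     // placeholder field
});
```

The same trigger can then feed both the GA4 event tag and a Facebook tag, which is the reuse advantage mentioned above.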

    Be sure that you are adding cross-domain tracking values if the GTM tracking needs to go across other domains; don't just set it in GA4.

    Should be fine. Test... obviously!

  10. There are loads of unused JavaScript libraries that SS loads by default and which cannot be removed or optimised. This mostly affects MOBILE performance scores.

    You can use things like PRELOAD and DNS-PREFETCH to make optimisations.

    This is something that has been discussed for a good two years now. My guess is that eventually SS will make the changes, but they will only be available on new templates, and you will therefore have to rebuild everything on those templates.

    Nothing you can do about that.

  11. Because everyone is using a very similar keyword data pool, you should be looking at website performance speed and other Core Web Vitals metrics as a priority for low-competition (fewer than 5M Google Search results) keywords. Google "Core Web Vitals" and follow the recommendations.

  12. What are you trying to do here? It is not clear.

    GTM code will work across all Squarespace pages except checkout, because Squarespace doesn't fire GTM code on the checkout page, so there is no way to trigger anything there. Even if you could trigger it, you would need to have the data layer variables set up to be able to capture and send the e-commerce value on the page view.

    GA4 is event-driven; while it does create a page view for tracking some base-level metrics, GA4 will capture those events that you set it up to capture. We already did this with UA for the past 5 years, so the conversion path is not so hard.

    If you explain in simple terms what you are trying to achieve, I can help you.

    Have a good day.

  13. This is now fixed.

    Clicking the three dots on that entry in Google and then clicking "Cached" will give you the date the page was last indexed and a copy of what Google indexed.

    My guess is that had you done that with your previously posted message, you would have seen that it had indexed the under-construction page.

    I can see that on 28th February Google went and visited the page, which is now showing correctly.

    🙂

  14. A purchase conversion will only be transmitted if:

    a) You can fire GTM or GTAG on the checkout page.

    b) You know the variables that Squarespace uses to write the sale information to the data layer for transmission to GA4.

    You may be able to set up a goal in GA4 that captures a purchase if it is possible to add that code to the page where the checkout happens; or maybe after payment there is a thank-you page which you can define and then set as a destination URL to trigger a purchase event. If you have an average order value, you may then be able to hard-code the value of a goal conversion into your goal (I've not checked in GA4, but you could in Google UA).
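A minimal sketch of that thank-you-page approach, assuming a hard-coded average order value. The gtag helper below mimics the one the standard GA4 snippet defines; the value and currency are made-up placeholders.

```javascript
// Hypothetical sketch: on the thank-you page, fire a purchase event
// with a hard-coded average order value. gtag mimics the helper the
// standard GA4 snippet defines; the numbers are placeholders.
var dataLayer = dataLayer || [];
function gtag() { dataLayer.push(arguments); }

gtag('event', 'purchase', { value: 45.0, currency: 'GBP' });
```

This only gives you an approximate revenue figure, but it is better than no conversion signal at all.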

    This will bring some information into Google Analytics.

    I hope that can help.

  15. Note that since August 2021 Google has been rewriting SERP titles, so what you have in your TITLE tag may not appear in the SERP as the blue headline.

    This is likely to give their paying advertisers more opportunity to stand out in the search results, by letting them include keywords in the main highlighted piece of text of their ads.

    Basically, pretty much forget having control; but as Ziggy says, if your site has a certain critical mass of traffic you may be able to prevent "site links" from being shown.

    G.

  16. Sorry, but I would not do that. I would ditch the cover page and implement hreflang properly and manually across the website. There are some scripts that can be used to inject code so that language versions are swapped in the menus, to avoid having two language menus appearing at the same time on a page.

    You can use FAQs to bump up optimisation on a page once you have established what Google thinks the page should be ranking for. You can use Google Search Console to get a sense of this.

    Here is an example of the code I was talking about
    https://www.bradgood.net/articles/multi-language-content-on-any-squarespace-template

    You might need to set up redirects within the website to remap pages to the new structure, and HREFLANG will help ensure that the right Google index is indexing the right version of the website.
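As a sketch of the tags such an injection script ends up producing: a helper that builds one hreflang link per language version plus an x-default. The URLs and language codes below are placeholders - mirror your real page structure.

```javascript
// Hypothetical helper: build the hreflang <link> tags to inject for
// each language version of a page. URLs and language codes are
// placeholders.
function hreflangTags(alternates) {
  return Object.keys(alternates).map(function (lang) {
    return '<link rel="alternate" hreflang="' + lang +
           '" href="' + alternates[lang] + '" />';
  });
}

var tags = hreflangTags({
  'en': 'https://www.example.com/en/about/',
  'it': 'https://www.example.com/it/chi-siamo/',
  'x-default': 'https://www.example.com/'
});
```

Each language version of the page needs the full set of tags, including one pointing back at itself, for the annotations to be valid.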

    See our website in the signature where we do all of this on our live site.

    Good luck.

    G.

  17. 7 hours ago, luxannie said:

    I have a basic business name.com and wondering if it still makes sense to buy other domains with keyword rich.coms and do the 301 forward to the main site.  Will it help catch and direct the people searching those keywords or not really? I think its more about quality content now, but is this still helpful to do?  Is it worth the $20 year/domain or does it hurt seo?  I'm super green and trying to learn the basics. thanks!

    Basically this is going to do very little for you.
