Everything posted by GlynMusica

  1. Greetings!
- When you moved the website, did your agency remap the old page URLs to the new site structure?
- Have you added a sitemap to Google Search Console and made sure it is verified properly?
- Have you added your sitemap to Bing Webmaster Tools?
- Have you added schema markup to the pages using code injection?
- Have you made sure that your website is set up to force HTTPS?
- Did you keep the same web domain?
- Have you waited at least 2-3 months after a site redesign for those types of deep links to appear, while the search engines re-evaluate your website and understand its semantic structure?
- Have you optimised your website for Core Web Vitals?
If not all of the above conditions are met, then these are the basic boxes you now need to tick. OR: you pay for advertising, in which case you can ignore all the above and control exactly how you are displayed in the search engines. Hope this helps.
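For the schema-markup point above, here is a minimal sketch of what a code-injection snippet might look like. The Organization type and every value are placeholders for illustration, not details from the thread:

```html
<!-- Minimal JSON-LD added via code injection
     (Settings > Advanced > Code Injection, header).
     All values below are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png"
}
</script>
```

You can paste the rendered page into Google's Rich Results Test to confirm the markup is picked up.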
  2. I think that will be borderline impossible without some extreme custom coding and a good developer. SS will have to fix this eventually. The iOS rollout is happening next week.
  3. I assume you mean Google. The next time the Google spider visits, it will be pointed to the new page; if you have a sitemap in Google Search Console, this will happen faster.
  4. Consider that every piece of content you produce has a cost. There were companies that spent millions on getting likes, and then Facebook removed them. Whether it's Google or Facebook, the goal is the same: affiliation of your business, where they take a cut. If you want to see what I mean, go and take a look at all the services that may NOT be sold via Facebook Shops. My advice, therefore, is to always prioritize your own business: if you are sharing content that has value on a social network (as someone has mentioned with their Google blog strategy), make the post link to the nuggets on your own website domain, where you can commercialize the visitor in some way. If you are doing it as a passion, that's different; I am talking exclusively about businesses that use social as part of their strategy. Also, know the demographic you want to attract. Every platform is slightly different, although not as diverse as their media packs would have you believe. Social networks are a vehicle for however you define them to be as part of your business. Define that first, then match the social network with your goals. Good luck.
  5. You can get a SS website to deliver on all the Core Web Vitals, but it needs some really advanced code optimization. We have a 943ms load time and a cumulative layout shift of 0.05 now; using advanced code injection with preconnect and preload headers, combined with serving things like jQuery through non-Google CDNs, has shown that we can tick nearly all the boxes on this latest hoop-jumper. Here's a helper: https://web.dev/preconnect-and-dns-prefetch/ I am actually pretty sure that SS has been doing some work already, because there were a lot of unused JavaScript files loading in my theme by default, and some of these no longer seem to be loading (still quite a few that are, though, at least on the homepage). I would not expect Squarespace to make an announcement on this, because it would be like saying "hi everyone, we've optimised our code delivery and fixed the fact that we were loading a whole lot of unnecessary JavaScript libraries that our programmers didn't catch". What company would issue that press release? Hope that helps.
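The preconnect/preload part of the above could look something like this in the header code injection. The hostnames and font path are placeholders; point them at whatever third-party origins and assets your site actually loads:

```html
<!-- Resource hints for the header code injection.
     Hostnames and paths below are placeholders. -->
<link rel="preconnect" href="https://cdnjs.cloudflare.com" crossorigin>
<link rel="dns-prefetch" href="https://cdnjs.cloudflare.com">
<link rel="preload" href="/assets/fonts/main.woff2" as="font" type="font/woff2" crossorigin>
```

Preconnect warms up the TCP/TLS handshake to the third-party origin before the browser discovers it needs a resource from there; preload pulls a critical asset forward in the load order.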
  6. This might help https://support.squarespace.com/hc/en-us/articles/205815508-Using-Mailchimp-with-Squarespace
  7. It's a very weird policy, this one. Or is it more of a disclaimer that you might actually look every so often but are not guaranteeing it? My advice would be for SS to find a way to create a feature tracker that people can submit to; who knows how many features and enhancements you are missing because you don't have this. For this particular thread: you always need to fully test and check functionality on systems before buying into them, otherwise you can find yourself out of luck trying to customize them. Even if the features might seem important or missed, the owner company might have different goals to diversify their product offering which take precedence. Don't you think that is a nuts thing to say?
  8. Congrats. I would not wait for organic results; better instead to go and tell people with relevant websites (education institutions come to mind) about your great resource. Consider that in many verticals a first-place organic result is going to bring you very little traffic, because paid search results are pushing all the free traffic off the page. That doesn't mean there are no opportunities, and it might be that in your own area of the keyword web (I've not checked) you still have the opportunity to get free traffic. I'm not saying you won't get free traffic; typically we see as much as 50% of all traffic being organic. It's just that isolating those results, which are now heavily personalized for users (yes, you can remove this when doing search analysis), makes it very difficult to actually see growth and measure against specific keywords. I'd almost say you should ignore Google, as you have a resource that lends itself very favorably to off-search types of promotion. In answer to your specific question: if the website is brand new, you can expect to see your pages stabilize in 1-2 months; if you are re-optimising existing pages, you might have to wait up to 6 months for them to properly stabilize. Games work well on social media; I'd be looking at Facebook, or perhaps LinkedIn if you can afford it. Good luck. G.
  9. SS needs to do some work on this. It carries over to member areas too, where if a person misses the approval for marketing email there is no way to send it again, and no way for the person to request it again.
  10. Correct: currently, by entering the FB pixel ID in the SS field area of the admin section, you are only going to fire the browser event. This is what you would see. If you have successfully put the payload plus the SS field in place, what you would expect to see is Browser + Server, but because you are not triggering with the same ID you would have duplicate events which could not be de-duplicated. Facebook Events Manager would tell you this. The best thing to do for testing is as follows:
1. Remove the FB pixel from SS, then wait a day so you have no data for a single day's date range.
2. Go to Business Manager and make sure all the diagnostic messages and errors are set to fixed.
3. Implement both the CAPI + pixel code so they are fired at the same time. Send a few events to Facebook and check that you get Browser + Server.
The problem I had was trying to understand how you create a random ID that you then assign and send via both the pixel and CAPI; this is key for it all to work and avoid dupes. Hope this helps.
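The random-ID idea in the last paragraph can be sketched as follows. `makeEventId` is a hypothetical helper, not a Facebook-provided function; the `ViewContent` event is just an example. The same ID would go out as `event_id` in the server-side CAPI payload:

```javascript
// Sketch: generate one random ID per conversion and send it with BOTH
// the browser pixel and the CAPI call, so Facebook can de-duplicate
// the pair. makeEventId() is a hypothetical helper, not a Facebook API.
function makeEventId() {
  // 16 random hex characters plus a timestamp keeps IDs unique enough
  // for de-duplication purposes.
  const rand = Array.from({ length: 16 }, () =>
    Math.floor(Math.random() * 16).toString(16)
  ).join('');
  return `evt_${Date.now()}_${rand}`;
}

const eventId = makeEventId();

// Browser side: the pixel's fourth argument carries the eventID.
if (typeof fbq === 'function') {
  fbq('track', 'ViewContent', {}, { eventID: eventId });
}

// Server side (CAPI): send the same value as `event_id` in the JSON
// payload you POST to the Conversions API endpoint.
```

Events Manager matches the browser `eventID` against the server `event_id` (plus the event name) to de-duplicate.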
  11. Thanks for sharing that message. Unfortunately, when Google says "jump", planet Earth now says "how high?"
  12. I'd be interested to see that code when you get it working. To your questions:
1. I am not sure on this, but I think each event would be triggered in a different way. Typically you would set up a standard event (such as Purchase; for testing I would recommend you simply use a ViewContent event, which you can trigger on all pages and means you will get event data through more quickly) to fire on a given action, such as clicking a button.
2. In Events Manager, under the sources you will see Connection Method: Browser (pixel) + Server (CAPI). When you see that coming through, mouse over the graph and you will see the events by connection method; if there are identical events they will automatically be de-duplicated. For that de-duplication to take place you need to fire the pixel and send the CAPI conversion event using the same event ID. So if you have the FB pixel loaded into the SS options, you should probably take that out, as you lose the ability to set a unique event ID across two events that are not being controlled from the same source. In other words, you somehow get SS to create a unique ID for a user session and then send that event ID with both the pixel and the CAPI event. I'm not sure if that helps you or simply confirms what you already know (difficult to tell from your questions), but please keep me looped in 🙂
  13. Greetings. In a form, when in editor mode, go to the Advanced settings tab and set a redirect URL, which you can use as a goal or event trigger in GA 🙂 Post-Submit Redirect: if provided, the user will be redirected upon form submission.
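Besides setting up a destination goal on the redirect URL, you could fire an explicit event on the thank-you page. A hedged sketch, assuming gtag.js is already installed on the site; the event name and category are made up for illustration:

```html
<!-- Placed on the page the form redirects to. Assumes gtag.js is
     already loaded; event name/category are illustrative. -->
<script>
  if (typeof gtag === 'function') {
    gtag('event', 'form_submission', { event_category: 'contact' });
  }
</script>
```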
  14. I would write to them via email support, not virtual chat. And if you are reading this and haven't done so, do so too.
  15. I would not use SEMrush for validating markup. Better to use one of these: https://search.google.com/test/rich-results or https://www.jsonschemavalidator.net/
  16. There is something weird going on with your website. I can see it renders in the browser, so when you go to it there is no problem looking at it, but when accessing it with a specialised tool you can see that there is a CLIENT ERROR. This does not happen normally. In the first instance I would open a support ticket. In the second, I would create a Google Search Console account, verify the web domain you own, and then add the sitemap to it. Then wait and see if you get any messages in your Search Console account about it. If you have added any custom code to the website, I would think about checking or removing that and then running the test below again to see if that solves the problem. The tool below you can download for free.
  17. Yes. Although chances are Google will still index it and simply not show it and then tell you in Google Search Console that the page has a problem because Google can't index it! If you want to hide a page from Google you can put a password protection on it to be absolutely secure. Or don't put it online at all 🙂
  18. Yes, it can be ignored unless you want to associate a value with your signups. For example, a company might charge you a fee for each subscription, so in this case you could assign a value via these extra parameters, which would come through as a value. Say the company charges you £5 a lead, and you value that lead at £10: you send the parameters with values, which allows you to assign a value and have it appear in Facebook. Not sure if it is also used as part of lifetime-value metrics.
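As an illustration of the value parameters, the £10-per-lead example above could be sent like this. `buildLeadParams` is a hypothetical helper, not a Facebook API; only `fbq('track', 'Lead', …)` and the `value`/`currency` fields are standard pixel usage:

```javascript
// Illustrative only: attaching a value to a lead event so it shows up
// as revenue in Facebook. The £10 value mirrors the example above;
// buildLeadParams() is a hypothetical helper.
function buildLeadParams(leadValue, currency) {
  return { value: leadValue, currency: currency };
}

const leadParams = buildLeadParams(10.0, 'GBP');

if (typeof fbq === 'function') {
  // Standard Lead event with value parameters attached.
  fbq('track', 'Lead', leadParams);
}
```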
  19. I'd be surprised if a Core Web Vitals value is the sole reason for your traffic loss, but I'm not Google, and I know how they do like to get people to jump through hoops. It seems more probable that either someone has started paying for ads or some other design change on the Google SERPs is the cause. I'm not sure of the extent to which you can verify this. You might also, if you use Google Analytics, look at which type of traffic shifted at the dates above; this can help you isolate whether it's desktop/mobile etc. There is only so much you can do with SS optimisation; we're now left pretty much with assets we can't optimise: unused JS bundled into the themes (it forms part of a shared codebase, even though it is not used in the live site), and likewise some fonts loaded via CSS. "Reduce the impact of third-party code: third-party code blocked the main thread for 2,470 ms" - these are all SS assets that cannot be accessed, for example. I have written to SS to make them aware of this and hope that it will get a look-in at some point. This is particularly pertinent if you have some scientific way of showing that the drop is caused by such things, in which case I would write to them too. For your own specific case I would be grateful if you could post any updates here.
  20. It's not an issue. While GA4 is claimed to be production-ready, there have been data problems, so take the advice of the industry: run it alongside your existing UA and start to get to know it. We are looking at a migration path that we expect will not become compulsory for 12 months at the very earliest, and probably the timeline will be much longer. The migration from UA to GA4 will be accelerated by things like cookie-less tracking via browsers, so it's not wrong to be looking at it now. If you want to run GA4 on SS, the best way is going to be as a stand-alone code injection, or via GTM if you are using that. I would not be relying on GA4 as a primary analytics measurement tool. This is the setup with our clients, and over the next 12 months we will migrate it. I hope this helps frame it. G.
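Running the two side by side with a single gtag.js load could look like this in the header code injection; both measurement IDs are placeholders:

```html
<!-- GA4 running in parallel with Universal Analytics.
     Both IDs below are placeholders. -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX-1"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'UA-XXXXXXX-1');  // existing Universal Analytics property
  gtag('config', 'G-XXXXXXXXXX');  // new GA4 property, collecting alongside
</script>
```

Each `gtag('config', …)` line sends the same hits to a different property, so UA keeps being the system of record while GA4 accumulates history.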
  21. Just to respond on tracking: if you are technically minded, you can fire all tracking using Google Tag Manager and event parameters. You load that via code injection and then trigger events to be sent to Google Analytics. It's free, but it might be too technical, as I said. As the iframe is off-site, you can only embed tracking into that code, and if that is not supported you won't have any luck, unfortunately. However, if you were able to redirect form submissions to a thank-you page where you could install Google Analytics, such as on your SS website, you'd be able to calculate things like conversion rates using custom metrics. Good luck
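The GTM-event approach above can be sketched as follows. The event and field names are illustrative; a GTM trigger listening for the custom event would forward it to Google Analytics. The environment guard just keeps the sketch runnable outside a browser:

```javascript
// Sketch: push a custom event into Google Tag Manager's dataLayer from
// code injection. A GTM trigger on 'form_submit' can then relay it to
// Google Analytics. Event/field names are illustrative.
const root = typeof window !== 'undefined' ? window : globalThis;
root.dataLayer = root.dataLayer || [];

function trackEvent(eventName, fields) {
  // GTM watches dataLayer.push; the 'event' key is what triggers fire on.
  root.dataLayer.push(Object.assign({ event: eventName }, fields));
}

// Example: call this from the form's submit handler.
trackEvent('form_submit', { formName: 'contact' });
```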
  22. 1. There's a setting in Squarespace site settings that will let you force HTTPS (HSTS secure): https://support.squarespace.com/hc/en-us/articles/205815898-Understanding-SSL-certificates#toc-choose-ssl-settings
2. Read this, as the preferred-domain setting in Google Search Console is no longer there: https://searchengineland.com/google-search-console-drops-preferred-domain-setting-318356
3. Add both the http and https versions to your GSC account, and add the sitemap to the https property as well.
Then wait for the Google spider to come around again; it should take up to a month. Make some changes on your pages (add a bit of text) and it may come around sooner 🙂 G.
  23. @tuanphan hello, here's an example. I have a piece of jQuery on our blog that tidies things up in some way; maybe you have something like this for the shop? Thanks