Posts posted by GlynMusica

  1. Nice post.

    The problem is that these methodologies are being used by everyone, and the datasets being accessed are limited, even if every vendor claims they come from their own unique source. Even Google's own keyword suggestion tool will only show you decent data if you are spending money in Google Ads.

    Add to that the fact that nearly every commercial position (i.e. a keyword with commercial value) is taken up by paid-advertising slots, followed by organic positions so far down the page as to be almost worthless for ranking and lead-generation purposes. Then factor in personalised listings, and the whole thing becomes unmeasurable, because results vary with each user's past search history.

    I think analysing the search market for your segment is the most important thing to do, and then isolating the elements of ranking pages as a clue to what people actually want to find for a search phrase. SEO has long-term value, but I might be more inclined to run a short-term paid-search campaign for the keywords the "tools" say are important, then look at the engagement metrics of those audiences to narrow the keyword choice, and feed that back into the SEO work. You could do this on a different web domain, as it is just R&D.

    Bear in mind that Google Ads now uses what are called close variants, so the keyword you type in might not actually be the keyword the advertiser targeted. I would also make sure the website ticks all the Core Web Vitals boxes.

    What I would also say is that if you are launching a website from scratch, try to get as much of the optimisation done as possible so that everything is in place on the first crawl. If you have an existing website and are significantly re-optimising page elements, expect up to four months to pass before you can reliably consider the pages re-indexed and settled.

    This is not a reply aimed at the previous posters, who also bring valuable views, but I hope this can help others.

    G.


  2. That Wix page is hardly superb: Google is basically saying that Core Web Vitals will negatively affect ranking, i.e. organic traffic. Content is always important, but if your lead generation relies on Google organic, then the most amazing piece of content simply won't appear anywhere it will be seen. We have no way of knowing how a good piece of content is weighed against whatever penalty Google might apply if that same content has poor CWV scores. It's all conjecture, but maybe Wix knows something we don't.

    Our own website (tested in incognito) scores higher than 93 on every Lighthouse audit on both mobile and desktop, except performance, which is 68 on desktop and 19 on mobile. There is nothing we can do about that, as these resources are locked up in parts of the Squarespace CMS that are not accessible for optimisation.

    I had heard that they were doing something about this; it has certainly been a curveball for all website developers.

    You can use preconnect and prefetch resource hints to improve things a bit, as well as hosting some of the standard scripts locally.
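    As a minimal sketch, these hints are just <link> tags you can paste into Squarespace's header code injection. The origins below are made-up examples, not a definitive list for any given template:

```javascript
// Build <link> tags for resource hints that can be pasted into a
// site's header code injection. The origins below are illustrative
// examples only; use the third-party origins your own pages load.
function resourceHint(rel, origin) {
  // crossorigin is needed on preconnect when the origin serves
  // fonts or other CORS-fetched assets.
  const cross = rel === 'preconnect' ? ' crossorigin' : '';
  return `<link rel="${rel}" href="${origin}"${cross}>`;
}

const hints = [
  resourceHint('preconnect', 'https://fonts.gstatic.com'),
  resourceHint('dns-prefetch', 'https://www.googletagmanager.com'),
].join('\n');

console.log(hints);
```

    Preconnect opens the TCP/TLS connection early; dns-prefetch only resolves the hostname, so it is cheaper but saves less time.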

    Everyone that has commented here should write to support.


  3. The SEMrush spider is stupid! Third-party tools are generally pretty lame and subject to scanning issues. It's best to use dedicated crawl-analysis software, and Screaming Frog is the best for that.

    My advice is to add your website to Google Search Console and include a site map.

    However, if you google this:

    site:https://www.oakpw.com + inurl:blog?offset OR inurl:blog?author

    you will see that Google is not indexing any of the URLs you mention above.

    🙂

  4. It's about having the control to fire the events you want against the conversion types you want. I've not used Squarespace's event triggering, since you can define events in GTM and have complete control; maybe someone else can comment.
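    The usual GTM pattern is to push a named event into the dataLayer from the page, then match that name with a Custom Event trigger in GTM. A minimal sketch (the event name and parameter are hypothetical examples):

```javascript
// GTM reads events from the global dataLayer array; pushing an
// object with an `event` key fires any Custom Event trigger with
// that name. 'newsletter_signup' is a made-up example event name.
var dataLayer = dataLayer || [];

function trackConversion(eventName, params) {
  dataLayer.push(Object.assign({ event: eventName }, params));
}

// e.g. call this from a form's submit handler:
trackConversion('newsletter_signup', { formLocation: 'footer' });
```

    In GTM you would create a Custom Event trigger matching 'newsletter_signup' and attach it to whichever conversion tag you want fired.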

  5. Hi there.

    I was wondering if there is a non-commercial solution for creating a vertical timeline in Squarespace? I have seen that there are some commercial options available, but I have also read in this forum about some of the problems with these plugins, so I am looking for a homegrown solution inside Squarespace.

    Has anyone found a clever way to create one of these?

     

    Thanks


  6. Hi @David_GM

    1. The 'hustle' going on in this space at the moment is as follows. Because ad-blockers were already blocking conversions before the iOS 14.5 rollout, implementing CAPI will recover conversions you were historically getting no data for. Lots of companies are selling CAPI as a solution, but in fact it's not a solution to iOS 14.5 itself, which kills JavaScript cookies; that's why Google is deploying server-side GTM. I believe you get a longer conversion window when switching to server-side GTM, but I've not had to look at this yet.

    2. Not sure, I haven't tried it. Realistically we are talking about iOS 14.5 devices, and while there is a lot of noise about this, you might find that your conversions aren't overly affected. You can verify this by sampling Google Analytics device data and modelling what you expect the loss to be: look at a year's worth of conversions, cross-check the iOS version in use for each, and model what the likely volume is for your business under the current situation.

    3. Not sure why Campaign Manager is necessary as a solution. Essentially Squarespace simply needs to provide the ability to add the access token created in Facebook Ads Manager, and to tag up the CMS with the various standard events that would be triggered and sent to Facebook. Thankfully the standard conversion events are very descriptive about what they should be used for, so pretty much any developer should be able to assign the events correctly.
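    Because the standard event names are fixed strings, a small guard helps catch typos, which would otherwise silently become custom events. A sketch (the fbq stub here stands in for the real pixel function loaded on the page):

```javascript
// Facebook's standard conversion events have fixed names; a typo
// (e.g. 'purchase' instead of 'Purchase') silently becomes a custom
// event. This guard only forwards recognised names.
const STANDARD_EVENTS = new Set([
  'ViewContent', 'AddToCart', 'InitiateCheckout', 'Purchase',
  'Lead', 'CompleteRegistration', 'Contact', 'Subscribe',
]);

const calls = [];
const fbq = (...args) => calls.push(args); // stand-in for the real pixel function

function trackStandard(name, params = {}) {
  if (!STANDARD_EVENTS.has(name)) {
    throw new Error(`'${name}' is not a standard event`);
  }
  fbq('track', name, params);
}

trackStandard('Purchase', { value: 49.0, currency: 'EUR' });
```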

    Consider that most of the digital industry is on the back foot with these changes, which Facebook rolled out just after Christmas. Even if you could do server-side before, it was only recently that Ads Manager was updated with the ability to set the hierarchy of events you want reported, now that Apple has said it will send just one.

    4. I wouldn't remove iOS from your audiences unless you can be sure they are all on affected devices. If we saw performance drops, we'd just stop targeting specific device versions. You might want to run a parallel audience of Android-only users and split-test the two.

    G


  7. The best tools for doing this type of analysis are:

    • GTmetrix, because it offers good suggestions that are easy to follow.
    • The Lighthouse extension for Google Chrome: it's Google, and it runs locally in your browser, so it's more reliable than cloud tests.
    • web.dev, to tick all the boxes (preconnect, etc.).

    There is also a GTM hack that will pull LCP values from Core Web Vitals into Google Analytics for you: https://www.simoahava.com/custom-templates/core-web-vitals/.
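    If you'd rather wire it up by hand than use the template, the usual pattern uses the web-vitals library and a gtag event. A sketch under the assumption that web-vitals and gtag/GTM are already loaded on the page:

```javascript
// Shape an LCP metric into Google Analytics event parameters.
function lcpToEventParams(metric) {
  return {
    event_category: 'Web Vitals',
    event_label: metric.id,           // unique per page load
    value: Math.round(metric.value),  // GA event values must be integers
    non_interaction: true,            // don't affect bounce rate
  };
}

// On the page you would wire it up roughly like:
//   import { onLCP } from 'web-vitals';
//   onLCP((metric) => gtag('event', 'LCP', lcpToEventParams(metric)));

const params = lcpToEventParams({ id: 'v3-123', value: 2471.7 });
```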

    The JS in Squarespace is heavy, and some templates don't need all the assets, but I think they are working on it. I would be really surprised if Squarespace didn't offer some support on this; the rollout is between mid-June and August, if memory serves me correctly. Also, I recommend removing any third-party code from your website when doing performance checks; at least that way you can see whether the problem is a poor third-party CDN.

     

    Hope this helps

    G.

  8. This might have to do with the way these links were formatted. By using the PAGE option you can be sure that the right types of links are being added. For example, http and https links from an ads campaign can null the source/medium information if they are not consistent. A crawler like the Screaming Frog SEO Spider will help you identify anything that stands out. You can also force HTTPS on all sessions using the option in Squarespace, to make sure links go through https. I'd enable that before running the spider, so you can identify links that might have been hardcoded the way you mentioned, and would therefore probably not be rewritten server-side by the force-HTTPS option 🙂
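    Normalising the protocol on hardcoded links is trivial; a sketch of what fixing them at the source looks like (the URL is a made-up example):

```javascript
// Normalise hardcoded http:// links to https so the protocol
// redirect doesn't strip referrer/source information in analytics.
function forceHttps(url) {
  return url.replace(/^http:\/\//i, 'https://');
}

forceHttps('http://www.example.com/landing?utm_source=ads');
```

    Squarespace's force-HTTPS option handles its own pages server-side, but links hardcoded into content still need fixing where they were typed.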

     

  9. You are not going to see any analytics results for that, as the redirect happens server-side, so the analytics code probably won't fire. I'm not sure if your analytics is Google (in which case you definitely won't see it) or Squarespace's built-in analytics (I've not tested this, but support could tell you).

    Google's URL shortener has been replaced by Firebase Dynamic Links, https://console.firebase.google.com/u/0/project/_/durablelinks?pli=1, which I have not looked at.

    If you simply want to log hits, the easiest free way I can think of is to create a bit.ly account, which gives you some stats. There may be reporting tools in there now that let you send reports to whomever you like.

    Another way, if you wanted to get a log in Google Analytics, would be to create a page on Squarespace that acts as a throughput: it carries a JavaScript or one-second meta-refresh redirect to your destination URL. Assuming your analytics code fires in time to capture the visitors passing through, this should get you what you want, although there are some breakpoints, such as non-JS-enabled browsers. I'd also set that throughput page to noindex.
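    A sketch of what that throughput page boils down to (the destination URL is a made-up example; the analytics snippet placeholder is where your tracking code would go):

```javascript
// Build the throughput page: a 1-second meta refresh plus noindex,
// giving the analytics snippet a chance to fire before redirecting.
function redirectPageHtml(destination) {
  return [
    '<!doctype html>',
    '<html><head>',
    '<meta name="robots" content="noindex">',
    `<meta http-equiv="refresh" content="1;url=${destination}">`,
    '<!-- analytics snippet goes here -->',
    '</head><body>',
    `<p>Redirecting… <a href="${destination}">continue</a></p>`,
    '</body></html>',
  ].join('\n');
}

const page = redirectPageHtml('https://example.com/target');
```

    The visible link in the body is a fallback for browsers that don't honour the meta refresh.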

    G.

  10. There is nothing in the technical documentation that Apple has issued to suggest that CAPI is a way to circumvent the measures that Apple has put in place with their privacy update. There are lots of companies claiming this to be the case, but it's simply not true.

    However, by implementing CAPI you will get better reporting, and your reported ROAS will improve, because campaigns have already been losing conversion data to ad-blockers and the default anti-tracking measures adopted by the industry, notably browser manufacturers. Also, for people who opt in to being tracked, an API-delivered conversion event provides a second degree of reliability that browser-based pixels simply can't match.

    Good luck.

  11. A self-referral, which is what you are describing, generally happens when the previous step in a session is recorded as your own domain. So if you are seeing your own domain as the referrer, the only thing that comes to mind is some kind of redirection happening before your main website loads.

    My suggestion is to download the Screaming Frog SEO Spider, do some URL analysis, and check that your domain is mapped properly in DNS. I'd also check with support, as this is an analytics issue that might need checking by them.

    G.

  12. GTM uses a different API, so it can report conversions up to 48 hours after they have happened. Often the delay is shorter, but you should factor it in.

    @DCaamano, the fix you describe makes no sense from a tracking point of view, because when someone clicks a Google ad it is automatically tagged with a GCLID that tells Google Analytics to record the traffic as a Google campaign. That said, we have seen GCLID behave differently under some very unusual circumstances. A UTM hardcoded into the ad campaign can be stronger in terms of ad reporting, but an ad click coming in under organic is frankly something you would want to explore.

    My advice would be as follows.

    1. Revert the tracking to how it was before.

    2. Clear your browser cookies.

    3. Log in to Google Analytics.

    4. Open the Real-Time view.

    5. Go to Google and click on your ad.

    6. Watch the session appear in the Real-Time view.

    7. Click on the location of the session and isolate your visit.

    8. Click on Traffic Sources (in Real-Time).

    9. Perform the conversion action.

    10. See if there is a change in Traffic Sources.

    This test, while not perfect, will give you a clearer understanding of what is happening to your source/medium attribution.
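    While checking, it is also worth looking at the landing URL itself: auto-tagging appends a gclid parameter, manual tagging appends utm_* parameters, and a paid click arriving with neither will get bucketed under organic/direct. A quick sketch (the URLs are made-up examples):

```javascript
// Classify how a landing URL is tagged. Auto-tagging from Google Ads
// adds ?gclid=...; manual tagging adds utm_* parameters.
function classifyLanding(url) {
  const params = new URL(url).searchParams;
  if (params.has('gclid')) return 'auto-tagged (gclid)';
  if (params.has('utm_source')) return `manual UTM (${params.get('utm_source')})`;
  return 'untagged';
}

classifyLanding('https://example.com/?gclid=abc123');
```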

    🙂


  13. Backlinks are important, and you can define your strategy around them. Historically, links equalled ranking. They still do, but the top of search is now full of PPC, so your best-case ranking scenario may not be worth the investment or traffic. Links between websites are superb, bringing a much better quality of visitor. I would look at the percentage share of your goals delivered via organic traffic as a measure of how important it is to your website.

     

  14. There is always a risk of scope creep, but if we take the Facebook tracking pixel as an example, it has fundamentally not changed for years. Essentially, all that happens with CAPI is that a payload is sent to Facebook via the server as well as via the tracking pixel, with both sharing the same event ID.

    Where you do see disconnects is where functionality is being updated, for example a third-party tool that offers the ability to manage Facebook campaigns outside Facebook Ads Manager. In those areas you are putting yourself at risk.

    Eventually this will all go away, because Facebook will simply provide a closed-shop environment, so the only way to get sales for your business will be to open a shop on their channel. Take a peek at the prohibited commerce categories for IG/FB as a clue to what they want to get their hands on 🙂

     

    G

     
