Category: Tools

Dynamic Content Generation with Google Sheets


If you haven’t been following along, I’ve been documenting my journey to a 1,000+ page programmatic site in WordPress using Google Sheets.

The build is heavily reliant on Google Sheets to generate dynamic content based on templates.

The content there is a little more advanced, as it is leveraging rules to determine which content template is used.

I thought I’d take it back to basics, and create a dynamic content generator that was a bit simpler.

A straight-up data-to-content text replacement.

 

Dynamically generating text in Google Sheets

 

The data set

The generator starts with your data set.

Each column represents a different piece of data.

You don’t need to use every single piece of data you put in the spreadsheet; however, you do need to ensure any data point you want to use is there.

Each column header is a variable name, which the generator will find in the template and replace with that row’s value.

 

The dynamic text template

On the second sheet is a cell where you write out your text template.

You just write out the text you’d like to include, in the format you want it, and include the variables wherever you’d like them.

Next to the text template is a list of all your variables.

This is simply a list of every header name from the first sheet, and it’s a handy reminder of what you have to work with, so you don’t need to flick back and forth to your data set.

Since the list is formula-driven, you can’t copy/paste the variables directly from it. However, if you’re not adding more columns to the data set, you could paste it in as values so the variables can be copied straight into the template.
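If you’re rebuilding the sheet yourself, the variable list can be produced with a formula along these lines (the ‘Data’ sheet name and the C1:I1 header range are assumptions here, not necessarily what the template uses):

=TRANSPOSE(ARRAYFORMULA("<" & Data!C1:I1 & ">"))

This spills each header down a column, already wrapped in the angle brackets the template expects.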

 

The content generation formula

You’ll find the actual formula that does the replacement on the main generator sheet.

The formula might look a little daunting, but it’s just a large nested substitute.

Each heading has “<” and “>” added on either side to turn it into a variable you can use in the text template.

The formula will then take these variables, and substitute them for the value in that column.

It will then repeat the process.

To add a new variable in;

1. Insert a column before the contentOutput and fill in your data.

2. Add an additional SUBSTITUTE( at the front of the list

3. Copy the data after the 2nd to last bracket, and paste it after the last bracket

4. Modify the cell references to instead reference your new column

So if you inserted one into the current template, the formula would go from;

=SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(Template!$A$2,"<"&$C$1&">",$C2),"<"&$D$1&">",$D2),"<"&$E$1&">",$E2),"<"&$F$1&">",$F2),"<"&$G$1&">",$G2),"<"&$H$1&">",$H2),"<"&$I$1&">",$I2)

To become;

=SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(Template!$A$2,"<"&$C$1&">",$C2),"<"&$D$1&">",$D2),"<"&$E$1&">",$E2),"<"&$F$1&">",$F2),"<"&$G$1&">",$G2),"<"&$H$1&">",$H2),"<"&$I$1&">",$I2),"<"&$J$1&">",$J2)

Note the new modifications: the extra SUBSTITUTE( at the start, and the additional replacement arguments at the end.

Since the new column of data was J, we copied the end and then changed the I to a J.

 

Content template preview

A preview column has also been added to the text template tab.

Once your generation formula is set up with all your variables, you will be able to see a live generation example as you edit your template.

This will allow you to see exactly what the text template looks like when you substitute some of your real data into it.

The preview will randomly select one of your data rows every time you update the cell.

It’s a great way to ensure you’re accounting for different outputs in your template as you craft it.
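If you’re rebuilding the preview yourself, a formula roughly like this drives that kind of random selection (the ‘Generator’ sheet name and the K column holding the generated output are assumptions):

=INDEX(Generator!$K$2:$K, RANDBETWEEN(1, COUNTA(Generator!$K$2:$K)))

RANDBETWEEN is volatile, so it recalculates whenever the sheet changes, which is what makes the preview rotate as you edit the template.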

 

Access the Google Sheet

You can access the dynamic content generation Google Sheet at the below link.

Let me know if you have any questions!

Feel free to leave a comment, I’d be happy to help out.

 

Using Data to Determine What Filters Should be Targeted


Bedrooms, and bathrooms, and price ranges, oh my!

There are so many filters that could be used in the pretty URL, but what should be used?

What should earn the “pretty” status, and what should be relegated to query parameter status?

Some may say everything goes in pretty.

Some may just pick a few, and leave it at that.

Well, let’s take a look at how you could use data to inform your decision.

I’m not going to go into why you shouldn’t be creating pretty URLs for everything, that’s a separate post.

What we will run through are ways to set up keyword data, for you to gain insights about search trends of the filters you’ve got, so that you know what you should be optimising for.

The data you’ll need

To run this analysis, you’ll ideally just need an extremely large keyword research piece specifically for your niche.

It will need to be rather large to ensure you can get solid data, and you’ll also need to have it pretty clean, or at least understand the caveats of your data.

If you’ve also got a tonne of GSC data to mix in, then that would be great. That will help extend the data to cover a good portion of the 0-volume keywords that might not show up in keyword research tools.

For my examples below, I just spent 20 minutes pulling together some real estate keywords for 3 Australian cities, and ~15 suburbs from a “Top Suburbs in Melbourne” type list.

I wanted to limit it to location-specific keywords, as without a giant list of seed locations it can be hard to compare location vs non-location keywords without significant time spent. It just makes things messy.

 

Setting up your data

Whether you’re using Excel or Google Sheets, you’ll need to create a data table to power the analysis.

The data table should contain all the keywords, their volumes, and associated categorisation.

 

Create categories for each filter

You’ll need to create categories for each of the filters that you are trying to analyse and work out whether they should be optimised for.

Go through and create columns, and set up their categorisation with rules for each possible value so that you can capture as many as possible.

For my example, I am using real estate keywords. A column has been created for each filter I’d like to analyse, along with categorisation for each of them.

Each filter has its seed values assigned, along with what the value is for that variable.

If a keyword contains either ‘buy’ or ‘sale’, it gets flagged as “Buy”.

If a keyword contains ‘1 bed’ or ‘1br’ it gets flagged as 1 Bedroom.

You can read more about how this works here.
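As a rough sketch of how those rules can be written in Google Sheets (assuming the keyword sits in column A; your layout and seed lists will differ):

=IF(REGEXMATCH(LOWER(A2), "buy|sale"), "Buy", IF(REGEXMATCH(LOWER(A2), "rent|rental"), "Rent", ""))

=IFS(REGEXMATCH(LOWER(A2), "1 bed|1br"), "1 Bedroom", REGEXMATCH(LOWER(A2), "2 bed|2br"), "2 Bedroom", TRUE, "")

If you’re in Excel instead, ISNUMBER(SEARCH(...)) does the same job as REGEXMATCH.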

You’ll want to be as thorough here as possible, and include as many variations as possible.

A couple of missed ones could really sway a decision.

Try to also create a catch-all category at the end for filters with only a value or two.

I created one for ‘features’ based on what I was seeing in the keyword data.

 

Prefiltering / cleansing the data

Depending on how clean your keyword data is, it might be better to just look at a portion of it.

A portion you know is 90% cleaner than the rest of the data.

For my real estate keywords, I know that if the keyword includes a channel, so ‘sale’, ‘buy’, ‘rent’, or ‘rental’, there is a higher chance of it being a keyword of enough quality for the study.

To include keywords that don’t include a channel (like ‘real estate’ or ‘properties’), I also include keywords including a bedroom or bathroom filter.

This is done via a YES/NO column that simply flags a keyword as YES if any of its filter cells have something in them.
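That flag can be as simple as the following (assuming the filter columns sit in C through H; adjust the range to wherever yours live):

=IF(COUNTA(C2:H2) > 0, "YES", "NO")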

All my data analysis will have this filter applied, and it brings the keywords down from 9,000 to just 2,000.

I know those 2,000 are trustworthy to tell a story.

 

Creating your pivot tables

You’ll now need to create pivot tables for each filter so that you have a way to read the data.

Go through and create a pivot table for each of your filters with the below data:

  • Filter as the row
  • Search volume SUM as value
  • Search volume COUNT as value
  • Search volume COUNT shown as % of grand total as value

The SUM should be obvious: it’s the total search volume for each of the filter values.

The COUNT will be how many times that filter value is used among the keyword set.

The COUNT & % of Total will show us the actual % of keywords that use this filter value. A little quicker to analyse than the overall count alone.
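If you’d rather stay formula-driven than build pivot tables, a QUERY along these lines gives a similar view in Google Sheets (the ‘Keywords’ sheet name and column letters are assumptions; here column B holds the search volume and column D the filter being analysed):

=QUERY(Keywords!A:D, "select D, sum(B), count(B) where D is not null group by D order by sum(B) desc", 1)

The % of grand total isn’t built into QUERY, so divide the count column by your overall keyword count separately.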

 

Analysing and selecting your filters

Now we’ll get to read our data and see what we can make of it.

Let’s take a look at my property-type keywords.

We can see that of the 2,000 keywords included, 85% mention a property type. So only 15% are more generic keywords like ‘properties’ or ‘real estate’.

Even if you consider ‘homes’ as generic, that’s still less than a quarter of the keywords without a property type.

So yes, property type 100% needs to be optimised for.

 

Looking at the features keywords.

Only 2 keywords include pets, 2 with pools, and then 1 mentioning granny flat. If these were the only filter values available, I would not be optimising for them.

Similar story with the bathrooms keywords.

Only 2 keywords contain a bathroom-related phrase. Probably wouldn’t recommend targeting that in bulk.

Now onto the 2 that are a bit contentious when it comes to real estate sites.

The first one being bedrooms.

Bedrooms is one I personally recommend against optimising for directly under normal circumstances. At least at first anyway.

I feel it creates too many variations of URLs, with not enough reward/value in return for doing so. Can be worth targeting once all indexation/crawling boxes are ticked, especially with some rules in place, but maybe not directly out the gate.

In saying that, looking at the data 10% of the keywords (7% of total volume) include a bedroom value.

Is that enough to warrant targeting of it? Maybe.

But if we break that data down a bit further, and split out the city (Melbourne) from the ~15 suburbs, we see something a bit different.

16% of the city keywords (14% of volume) contain a bedroom term, versus only 5% (1% of volume) of the suburb keywords.

So that’s 1 location having a significantly larger number of keywords including it than the 15 other locations combined.

So if you create pages equally amongst cities & suburbs, you’re going to be creating significant volumes of pages when only a small portion of them will be useful.

Yeah, long-tail value this and that. I’m not saying definitely don’t, I’m just advising against it without restrictions in place.

A similar situation is with the prices.

Pretty low volume for the majority of the keywords that include a price (normally ‘under xxx’ type keywords).

And if we break it into city vs suburb, we get;

None of the suburb keywords in this data include a price. It’s only at the city level.

 

Why some filters may not be worth targeting

I’m a big believer in limiting crawlable URLs where possible.

Minimising re-use of listing content, and avoiding the possibility of confusing Google too much.

Keeping the site as small as possible, whilst still targeting as much as possible.

So why would I recommend avoiding creating bedrooms or pricing optimised URLs in bulk?

Well, it comes down to page count.

Crawlable page count to be specific.

Let’s say you have a location set of 5,000 locations.

10 property types.

and 2 channels.

You’ve already got 100,000 crawlable URLs right there.

If you then have 7 bedroom options, you’re looking at an additional 700,000 URLs on top of the 100,000 that already exist, all of which Googlebot will have to constantly trawl through.

Is it worth enlarging your site by 700% to target an extra 7% in search volume?

If you think so, then go for it.

That’s also if you do it cleanly. If you have other filters with crawlable links on the site, that overall crawlable URL count will only increase.

So if you’re creating significant page volumes off of smaller % filters like this bedrooms count, you must ensure you have your crawling well in check before you launch.

That way you can avoid exacerbating any existing issues.

There are other ways of efficiently targeting these types of keywords though.

In particular, I recommend a strategy here for targeting these filters that may have value at key locations, and not others, by having a couple of tiers of locations.

 

Picking your filter values

To keep some filters in check, you can also set up the system so that only certain values of a filter get optimised.

Using the bedrooms as an example, you might choose to just create pretty URLs for Studios, 1 bed, 2 bed, and 3 bedroom apartments. 4+ bedrooms would then be relegated to the query parameter, and not receive the internal links pointing into it.

 

Let the data guide your optimisation

By leveraging this keyword data you can really gain an insight into what filters, and values, you should be optimising for.

Plenty of caveats, particularly around longer tail keywords that tools won’t give you, but there should be more than enough data to at least guide an initial decision.

It’s also easier to expose a filter later on than to clean up the over-indexation caused by one that needs to be reverted.

There’s also another question here: is it even worth putting in the work to have separate ‘pretty’ and ‘parametered’ filters?

I’ll leave it to you to decide.

Bulk Download Images from URLs – Convert & Rename


Batch downloading images, and renaming them, isn’t something you’ll use often, but when you do, doing it in bulk can really save you quite a bit of time.

Why would you want to do it?

Well, plenty of reasons: migrating images, wanting to bulk optimise their file names, or downloading imagery from brands for products you sell.

Using the attached Excel file, you can insert all the new image names, along with their current source URLs, and the macro will download all the images, give them the new names, and also save them as .jpg files no matter their source extension.

How to batch download & rename images with Excel

The following is the process for downloading images from URLs in Excel and renaming them:

1. Download and open the Excel image URL downloader

2. Click on Developer > Macros and then hit edit on the selected one in the file

 

3. Edit the folder path that is highlighted; it needs to point to a folder that already exists, as this is where the newly renamed images will be downloaded to. If you point it to a folder that doesn’t exist, the folder won’t visibly be created. However, if you create the folder after the fact, all the files will be there. So, yeah.

 

4. Insert all the new image names in, ensuring the new name includes dashes, and doesn’t include a file extension

 

5. Click on ‘macros’ and then click run on the selected macro

 

You’ll now get ‘File Downloaded Successfully’ on all the files that have downloaded from the websites okay.

 

You’ll see an error if it doesn’t work, with most errors I have seen attributed to being blocked by the source you’re scraping. Depending on the use, you can ask for your IP to be unblocked, but sometimes it’s just quicker to throw on a VPN.

Another issue I have seen is that if you include an extension in the name, you will end up with double image extensions like .jpg.jpg. You can modify the script to drop the .jpg extension it appends if you really want, but it’s just easier to leave the extension out of the names.
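If you’re curious what’s going on under the hood, the core of such a macro boils down to something like this. It’s a sketch only; the column layout (new name in A, URL in B, status in C) and the folder path are assumptions, and the attached file remains the working version:

Private Declare PtrSafe Function URLDownloadToFile Lib "urlmon" _
    Alias "URLDownloadToFileA" (ByVal pCaller As LongPtr, ByVal szURL As String, _
    ByVal szFileName As String, ByVal dwReserved As Long, ByVal lpfnCB As LongPtr) As Long

Sub DownloadImages()
    ' Assumed layout: new file name in column A, source URL in column B, status written to column C
    Dim folderPath As String, r As Long, result As Long
    folderPath = "C:\images\" ' must already exist
    For r = 2 To Cells(Rows.Count, 1).End(xlUp).Row
        ' Save whatever the source serves under the new name, with a .jpg extension
        result = URLDownloadToFile(0, Cells(r, 2).Value, folderPath & Cells(r, 1).Value & ".jpg", 0, 0)
        Cells(r, 3).Value = IIf(result = 0, "File Downloaded Successfully", "Error downloading")
    Next r
End Sub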

Download the Excel Image Downloader

Successful bulk image download from URL & Rename

This URL image downloader should now have successfully downloaded and renamed all the images you need, extremely fast.

Bulk Redirecting URLs in Cloudflare


For years, Cloudflare has had the Page Rules section where you can set up redirect rules. But you’re only allowed 3 page rules per domain on a free account, which severely limits the redirect options, so many don’t use it.

However, at the end of 2021 Cloudflare released its new bulk redirect setup.

These extend the Cloudflare redirect functionality, and offer more flexibility with one-to-one redirects, rather than just a handful of rules.

 

What is the Cloudflare bulk redirect feature?

The bulk setup allows for the creation of redirect lists, where you can have significant quantities of one-to-one redirects set up.

These aren’t redirect rules, they’re literal one URL, to another URL redirects.

They’ve mentioned the following in their launch post however;

There is still a requirement for supporting more complex, rich URL redirects, e.g., device-specific URL redirects, country-specific URL redirects, URL redirects that allow regular expressions in their target URL, and so forth. We aspire to offer a full range of functionality to support as many use cases as possible.

So hopefully that means either this functionality, or additional, will be launched to support more bulk managed redirect rules.

 

How to bulk redirect URLs inside Cloudflare?

This is how you create a set of bulk redirects in Cloudflare.

1. Log into Cloudflare, and on the overall account management screen click on ‘Bulk Redirects’ in the left menu

2. Create a new redirect list first, so click the text link and then click ‘create new list’ on the new page that loads

3. Enter a name & description, and then make sure you click on ‘redirect’ content type and not the default IP selection.

4. You’ll come to a config page, where you can click ‘add items’

5. Enter all your one-to-one redirects manually by entering the source and the target URL. You can also just upload a single CSV that has these two columns and import it directly into Cloudflare (see the example format just after these steps). Keep in mind that HTTP and HTTPS URL versions will be treated separately, so either add both variations or have a separate page rule handle the HTTP > HTTPS portion.

6. Once you’ve created your list, go back to the Bulk Redirects page, and click the ‘create bulk redirects’ button

7. Add a name, and select the new redirect list you just created at the bottom of the page, and click ‘save and deploy’.
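As a reference for the CSV option in step 5, a plain two-column file along these lines is all it takes (the URLs below are placeholders; check Cloudflare’s import screen for the exact header row it expects):

https://www.old-site.com/old-page,https://www.new-site.com/new-page
https://www.old-site.com/category/old-page-two,https://www.new-site.com/new-page-two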

 

Once that magical save and deploy button is hit, your redirects should pretty much be live. Go and test them at httpstatus.io or a similar tool to ensure you’re not cached, and you should see the redirects live within a minute or two.

 

How many bulk redirects can be set up?

Cloudflare’s completely free accounts get just 20 bulk redirects; however, if you have even a single paying property ($20/month) you get access to 500 across all your sites, as these are set at the account level.

Definitely worth having a single account upgraded for these alone, considering how many you can have, and how easy they are to set up.

 

Should you be using this new Cloudflare feature?

It really depends on what you’re trying to achieve, and whether the quantities allowed fit into what you need.

If you have a dev team, and a full server setup, then yeah this won’t be what you want.

If you’re just moving URLs around on an existing site, then probably not either.

But if you’re deprecating one site, and merging it into another, where you don’t want to have any hosting maintained for the old domain, this is exactly the product you need.

Caching HTML in Cloudflare for Improved Page Load Speeds


Caching HTML can help you reduce overall page load time, as the CDN can serve your HTML rather than having to request it from the server each time.

There are pros and cons to doing this, but if your site is static and doesn’t change too often, caching your HTML could be the perfect solution to improving page load speeds a bit.

 

Does Cloudflare cache HTML?

Yes, however not by default. You need to enable it by creating a page rule with the steps below.

 

How to cache HTML with Cloudflare

Caching HTML in Cloudflare is easy, so just follow these steps.

1. Open Cloudflare, and click ‘Page Rules’ in the sidebar

2. Click ‘create page rule’ to the right of this page

3. Insert your site’s domain with a wildcard on the end, just like the Cloudflare example, and then select ‘Cache Level’ and ‘Cache Everything’ from the dropdowns.
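As a rough example of what that rule ends up looking like (example.com stands in for your own domain):

If the URL matches: *example.com/*
Then the settings are: Cache Level: Cache Everything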

And you’re done!

You should be able to run a test now, and compare to before HTML caching.

For me, here is my test.

Before HTML caching

After HTML caching

Yes, the site is already optimised, but a few key metrics were improved even further.

FCP dropped from 305ms to 112ms.

TTI dropped from 305ms to 112ms.

LCP dropped from 500ms to 444ms.

Not too bad at all!

 

Is it good or bad to cache HTML with Cloudflare?

It really depends. How static is your content?

If you have an extremely dynamic site, or even a news homepage style site, caching HTML might not be good for you.

Rather than a consumer constantly getting fresh content, they will get the HTML cache version until the cache is refreshed. This means they’ll only see the content available at the time of cache, and not anything that has been added or edited since.

Not just consumers, but Google too. We want Google to have the freshest content, so you might be holding yourself back by a day or two, along with annoying users, if you enable HTML caching.

However, if your site is extremely static, HTML caching can really help you knock a few hundred milliseconds off some key Core Web Vitals numbers.

 

Use HTML caching wisely, and it can help you improve your page load speeds… provided you’re a good fit for it.

GTMetrix Review: Top SEO Insights You’ll Get


One tool I will use every single time that I audit a website is GTMetrix.

I’ve seen comments about it, and that other speed testers like webpagetest.org provide “better data” and “more insights”, but GTMetrix does everything I want and helps me solve my problems.

As with every tool, you take its automated insights with a grain of salt. You leverage them to guide further insight gathering, or to back up a specific decision.

GTMetrix gives me the data to pass on to dev teams, and help get issues patched.

 

Running a Speed Test with GTMetrix

It’s pretty simple to run a speed test in GTMetrix.

  1. Enter the URL you’d like to test.
  2. Change the location you’d like the speed test run from. Highly recommend you get this as close to your audience as possible.
  3. (OPTIONAL) Change the browser you’d like to test from. This is where you can also select a mobile browser if you’d like to run a mobile speed test.
  4. (OPTIONAL) Select the speed if you’d like to throttle the speed test. Throttling can help show more “true to life” bottlenecks, like with a poor mobile connection, but it will also help smaller issues show up more easily as everything gets exacerbated.
  5. (PRO REQUIRED) Ensure you have the video test flicked on, if you’re a pro user. You’ll get a bit more useful insight.
  6. Click Analyze, and run your website speed test.

How to run a speed test with GTMetrix

 

How to interpret GTMetrix’s waterfall chart

The waterfall chart breaks down the exact points at which different resources are requested, connected to, and downloaded.

Each resource is ordered based on when its loading starts.

All you have to ever really worry about here is when a specific resource is connected to, and when it finishes being downloaded. There are very few use cases you’ll run into as an SEO where anything in between is required.

GTMetrix waterfall chart example

Look through what items are being loaded and when, and then run through the standard checks from here to optimise.

Large files being loaded? Are specific requests taking too long? Too many files? External requests you didn’t know about?

Plenty of things to analyse here, but they’re very specific to each audit.

Each significant request stage is broken down by a coloured line. The following is what stage each of the coloured lines in the waterfall chart represents:

GTMetrix waterfall chart legend

You can find some more info on the waterfall chart from GTMetrix directly, here.

 

Page Load Video by GTMetrix

One of my favourite features, particularly for helping to better identify CLS issues.

Unfortunately, it’s a pro-only feature.

In saying that, it’s worth it to help out with these audits.

When running your original test, you can tick on video audit.

Or, when viewing an audit you can click ‘enable video and re-test’ and GTMetrix will re-run the test, including the video test this time.

Video of pagespeed test

Here’s an example video output, from the test I ran above of SammySEO.com

You can play/pause the video, or run it at 1x, 1/2x, or 1/4x speed.

 

Testing Core Web Vitals with GTMetrix

When you run a GTMetrix test, you get a basic overview of your core vitals.

Core web vitals testing on GTMetrix

You can extract a little more information from the waterfall chart, and a few other places in GTMetrix, but this overview can help you delve into each specific CWV separately.

 

GTMetrix Alternatives

So, if this isn’t the tool for you, what other options are there?

The top few that come to mind are;

 

Is GTMetrix pro worth it?

Yes, I believe GTMetrix pro is worth it. Well, for me anyway.

If you have an alternate speed tester you’re using, then it probably wouldn’t be.

Other tools have similar features, I have just used GTMetrix for years now, so have gotten used to it and understand what to look at a bit more than the other tools.

Removing the Collection from a Product URL in Shopify


I was doing a quick audit of a friend’s website recently, and discovered something rather annoying about the site.

On a collection page, Shopify added the slug of the current collection to all the product links on that page.

So every time a product was listed in a different collection, it would have a different link pointing in.

Each of these links correctly pointed to the primary product URL with the canonical tag, but that tag doesn’t work like it used to.

Google is also assigning weight to each of the links coming into the page.

This caused a product URL that contained the collection to compete with the actual product URL.

The URL from a collection page was:

domain.com/collections/<collection>/products/<product-slug>

Yet the actual product URL was;

domain.com/products/<product-slug>

 

Why does Shopify add the collection to a product URL?

There are a couple of potential reasons, with the main one stemming from user experience.

When the collection is added on the URL, breadcrumbs can get updated to point back to that collection.

If a user enters from 2 different collections, they’d see two different breadcrumbs, and then be able to go back to their original ‘search’ via the links.

However, this minor improvement in UX is outweighed by the poor SEO performance it can generate.

 

How to remove the collection from a product URL in Shopify

You can remove the collection from the Shopify product URL by following these steps:

1. Access your collections layout design, presumably located at Snippets > product-card-grid.liquid

2. Remove the collections portion of the product URL by replacing {{ product.url | within: collection }} with {{ product.url }}

Solution/image source: Shopify
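In practice, the change inside that snippet looks roughly like this (the surrounding anchor markup will vary from theme to theme, so treat it as a sketch rather than the exact lines in yours):

<!-- Before: the product link inherits the current collection -->
<a href="{{ product.url | within: collection }}">{{ product.title }}</a>

<!-- After: the product link always points to the canonical product URL -->
<a href="{{ product.url }}">{{ product.title }}</a>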

 

Collections folder removed from URL

The collections folder will now be removed from the product URL giving you a cleaner Shopify site.

The primary URL being linked to is essential for SEO, so this should provide good value once patched up.

Cumulative Layout Shift (CLS) – Causes, Testing & Fixes


CLS (Cumulative Layout Shift) is one of the fun ones inside Core Web Vitals (CWV), and the exact causes can sometimes be a bit annoying to find.

Or even exact locations it happens on, as Google can be a little light on the examples sometimes.

What is Cumulative Layout Shift (CLS)?

Cumulative Layout Shift is a score assigned to a page based upon how much the page changes/moves around between the initial render and the final load.

It’s about assets loading in, and then whilst the page continues loading, others load in and push the original content around.

Long story short, it’s when a website annoyingly shifts just as you’re about to click a button, and then you end up clicking the wrong thing because everything moved.

 

Does CLS affect SEO?

Yes, CLS very much affects SEO.

It is one of the main elements that make up the Core Web Vitals, which Google is now taking into account as a ranking factor.

Yeah, it’s one of hundreds of different ranking factors, but when you’re talking about an apples-to-apples comparison with a competitor, I would very much rather know that I have ticked as many boxes as possible to help me rank a site.

 

Identifying that you have CLS issues

Your first port of call to check for CLS issues, or where you might have spotted them initially, would be Google Search Console.

On the “Core Web Vitals” tab in GSC, you’ll see some pretty charts that show how you’re doing across your URLs, for both desktop and mobile.

If you then click through to one of the reports, you’ll get a list of the issues that make up the Yellow or Red lines.

If one of them looks like the one below, you’ve got CLS issues;

Clicking on this, Google might give you a couple of URL examples. Chances are though, it’ll just be a single one, even for hundreds or thousands of URLs.

Google might be saying they’re “Similar” pages, but sometimes they will group completely separate page types in here so don’t fall for their trap.

Now that you’ve identified you’ve got an issue, you need to actually find the root causes of this.

 

Isolating specific CLS issues by testing CLS

There are a couple of ways of isolating the CLS issues so that you can make a specific request with developers for a patch.

If you just go to them saying “fix CLS” they’ll either go in circles or call you crazy because “everything works fine”.

 

Testing CLS with GTMetrix

The first method I use is with GTMetrix. A super quick test, and it’s normally something I am running anyway, so can give a good initial overview.

Run your test, and then you’ll get a web vitals score like the below:

CLS is flagged on the right. For this one it’s green, which is fine, but it’s enough to use for this example.

This score will probably be different to what Google is flagging, but it’s not about the actual score. It’s about what’s causing that score, so that you can isolate and patch.

If you go to the ‘structure’ tab, you can then expand the ‘avoid large layout shifts’ section, and GTMetrix will break down the main causes for your CLS score.

GTMetrix flags the main offender here, which contributes 99% of the CLS issue.

Funnily enough, this test was run on a webpage talking about CLS here as I was looking for an example site. Definitely a case of, “do what I say and not what I do”. The post is still worth a read though.

In saying that, we can break down this CLS further by just loading the page.

Click that page above, and see if anything loads and then shifts around.

If your internet is fast, you might not notice it.

I use the GTMetrix video reports so that I can show devs step-by-step what is happening in the load, and help them troubleshoot.

The page loads in the entire content, pauses for 0.2 of a second, and then loads the image. This image load pushes all the content down.

Google is seeing this massive shift, and would be assigning it a rather high CLS score because of it.

Super easy to fix though!

 

Testing CLS with the CLS Checker Chrome extension

Firstly, just download the CLS Checker Chrome extension from here.

Using the same site as the GTMetrix test, you just need to load the page, then click the extension and click ‘check the page’.

It flags 2, with the first one possibly being related to the bigger one anyway;

 

If you click on toggle, it will make everything white, red and green.

White hasn’t moved, red is the original location, and green is the new location.

Sometimes a few different CLS issues will get grouped together here, so just be careful that a score the tool flags isn’t an aggregate view of about 3-4 different issues.

So this chrome extension is flagging that something has happened in that big red zone, which has pushed all the content down.

Safe to assume what has caused this based on the screenshot, but combine this with the GTMetrix video and you can really drill into what’s going on with CLS.

 

Testing CLS with Webvitals.Dev

This one is a tool I discovered recently, and whilst it’s useful to add to the mix, it’s not really anything that GTMetrix or the Chrome extension don’t cover.

It does bits of both of them in a single tool though, so it might be useful if you’re after a more consolidated view that you can send directly to the development team.

They do include a cool little GIF of what’s moving around on mobile though! Worth checking out, as it might be what you’re after.

 

Testing CLS with WebPageTest

If your preference is WebPageTest, then you can also check CLS issues in there.

Once you’ve run the test, click on the “view” drop-down and navigate to ‘web vitals’.

 

You can then scroll down and you will see the CLS issues.

Here you can also view a filmstrip, but more importantly, also view the video that’s now included.

In the same way that GTMetrix helps with the video, WebPageTest can now help you with your CLS issues via its video report too.

 

How to fix cumulative layout shift

You normally need a dev team to fix CLS issues, so if that’s not the answer you’re expecting – this is awkward…

In saying that, CLS issues are pretty easy to get fixed, once you’ve identified them.

All the developers need to do is make sure that elements don’t move, by ensuring they load in their final position.

This means they need to set fixed heights for elements that have delayed loading, like images.

If an image is going to be 300px high, then make sure there’s a blank space of 300px (plus padding) to fit that image when it loads.

This ensures nothing will move when the image finally loads in.
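As a minimal sketch of what that looks like in the markup (the file name and dimensions are examples only):

<!-- Reserve the space up front so the late-loading image can't push content down -->
<img src="hero.jpg" width="600" height="300" alt="Hero image">

<style>
  /* Or reserve the box in CSS; the browser derives the height from the ratio */
  .hero-image { width: 100%; aspect-ratio: 2 / 1; height: auto; }
</style>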

Steps to fixing CLS issues

  1. Break down every individual item that you think is contributing to the CLS score, based on the CLS testing above
  2. Show the issues to developers, along with some pretty pictures, so they better understand
  3. Politely ask them to fix it
  4. Deliver cake in the hope it speeds up delivery

 

It really is that simple.

Your CLS issues should now be gone!

Injecting HTML Code “Server-Side” with Cloudflare


I’ve got an older site with a good amount of link value, and I want to add some links to it pointing at some newer sites.

Problem is, the site is on 4-year-old tech, and hasn’t been updated in 3 years.

Dev is long gone. If I try to run a build, there’s a 90% chance I’ll break it completely.

I’ve since rebuilt the tech, but can’t move the old site over because…. bloat.

Running Cloudflare, I quickly looked into options and might have found a solution!

 

Injecting HTML with the Add HTML Cloudflare App

This was the initial solution I found, and I was super excited. Took 2 minutes, and I had a link on the old site!

However, the darn thing was client-side.

If that suits you though, this is how you can inject HTML with the ‘Add HTML‘ Cloudflare app.

 

1. Load up a Cloudflare website, and click on ‘Apps’ in the sidebar menu

2. Search for ‘add html’ and click the app

 

3. Click ‘preview on your site’ to load a preview

 

4. Select ‘pick a location’ and a little selection editor will load. If you want to inject into the head, you can just enter head.

 

5. Click on your website preview where you’d like to inject the code. I’m selecting the first post in the list.

 

6. Select from the dropdown where specifically you’d like the code injected. I want to inject the code before this first post, so I am going to select the before option.

7. (OPTIONAL) Select the location you’d like to use this, if not the homepage. Manually add the URL, then reload the list. You might need to do this a few times until it shows, but eventually you can then find the URL and press the little tick on the left.

 

8. Enter the HTML code you’d like injected, and it will display in the preview on the right.

9. Click install and boom, you’ve got some HTML magically injected.

 

As mentioned before though, it’s unfortunately client-side code, and won’t load with JS disabled or show anything in the HTML source of the page :(

 

Injecting HTML code with Cloudflare Workers

This took me a bit longer than it should have to work out, because my coding skills are lacking a bit.

However, from the limited examples of direct use of this, I managed to piece together what I needed.

A Cloudflare Worker is basically just a little script that runs on Cloudflare, between your server and your end user. Cloudflare grabs the page from the server, executes your script, then sends the result off to the user. Super powerful, when you know how it works.

Hopefully, this simple breakdown of actually using a worker to inject HTML can help you out too!

 

1. Go to workers in the Cloudflare dashboard, and click ‘create a service’

 

2. Insert a useful name, click ‘HTTP Handler’ and click ‘create service’

 

 

3. Click on ‘quick edit’ in the top right corner

 

4. Navigate to the page you’d like to edit, and then paste the following code;

 

class ElementHandler {
  element(element) {
    // Append the HTML payload inside the matched element ({ html: true } treats it as HTML, not plain text)
    element.append(`*HTML TO INJECT*`, { html: true });
    console.log("injected");
  }
}

async function handleRequest(req) {
  // Fetch the original page from the origin, then rewrite it on the way through to the visitor
  const res = await fetch(req);
  return new HTMLRewriter().on("*CSS SELECTOR*", new ElementHandler()).transform(res);
}

addEventListener("fetch", (event) => {
  event.respondWith(
    handleRequest(event.request).catch(
      (err) => new Response(err.stack, { status: 500 })
    )
  );
});

* If the quotes get mangled by WordPress formatting when you copy this, just replace them with straight quotes. Sorry!

 

5. Edit the *HTML TO INJECT* variable with what you’d like to inject. This could be plain text, or any full HTML.

 

6. Find your selector, and modify the *CSS SELECTOR* placeholder. To get your selector, right-click anywhere on your page and click ‘inspect element’. Then right-click the element, hover over ‘Copy’, and click ‘Copy selector’. You just paste this in place of *CSS SELECTOR* and you’re good to go.

 

7. The preview should now update to inject your HTML into the spot specified. There could be some issues with this, but provided you’ve followed the steps it should put it exactly where you specified. This is a bit newer to me, so comment with any issues and I can try to help out!

8. Click ‘save and deploy’ to get the worker saved

 

 

9. Go back to the cloudflare website you’d like this added to, and navigate to workers again, then click on ‘add route’.

 

10. In the modal that pops up, edit the location you’d like the worker to run at, select the worker, choose the production environment, and click save. I only want this on the homepage, so I’ll modify the screenshot example to remove the wildcards (*).

 

11. The route should now be loaded in the account, with the worker selected.

 

 

12. Go test the site! It should be live on the pages you selected, and it should load server-side!

 

You can confirm it’s server-side by viewing the page source, and then just searching for the code you added.

Perfect. Just what I wanted!

Just make sure you’re putting the worker in the right spot, if it’s not a unique piece of code you’re adding.

If a div is available across the site, and you set your path in the route to be the whole site, then that piece of code will show up everywhere.

Great if that’s what you want, not so great if it’s not what you want!

 

How does this magical HTML insertion with Cloudflare workers work?

This all uses the Cloudflare HTMLRewriter class, with a heap of documentation here.

There’s essentially an audible whoosh as this all flies over my head.

 

The options for the HTMLRewriter class

Whilst I don’t understand most of this yet, I can tell you the following important options.

Inside the code you have;

element.append(`*HTML TO INJECT*`, {html: true});

Instead of ‘append’, you have a heap of different other options here that you can use.

removeAttribute(name) – Removes the attribute.

before(content, contentOptions) – Inserts content before the element.

after(content, contentOptions) – Inserts content right after the element.

prepend(content, contentOptions) – Inserts content right after the start tag of the element.

append(content, contentOptions) – Inserts content right before the end tag of the element.

replace(content, contentOptions) – Removes the element and inserts content in place of it.

setInnerContent(content, contentOptions) – Replaces content of the element.

remove() – Removes the element with all its content.

So just tweak this to how you want to use it.

Cheers Cloudflare!

 

Adding Links with Cloudflare

Using the same code, we can drop a link wherever you want it!

For this code;

element.append(`*HTML TO INJECT*`, {html: true});

You would want to modify it to be something like;

element.append(`<a href="https://www.sammyseo.com/">Best SEO Blog Ever</a>`, {html: true});

And then just find the best location selector you can that will suit. ‘before’, ‘after’, ‘prepend’ or ‘append’ would probably be your best options to get the link inserted.

 

Great way to insert HTML into somewhere you can’t get access to!

Would love to hear if you use it, and if so, how!

Cloudflare Crawler Hints & IndexNow


Using Crawler Hints in Cloudflare, you can add an additional way to tell search engines about your new URLs and content changes.

Yeah, Google doesn’t support it for now. But you never know! That could change.

 

What is Crawler Hints in Cloudflare?

Crawler hints is an inbuilt function in Cloudflare, launched in 2021, that offers a new ability to ping search engines about your new URLs.

Crawler Hints provide high quality data to search engine crawlers on when content has been changed on sites using Cloudflare, allowing them to precisely time their crawling, avoid wasteful crawls, and generally reduce resource consumption of customer origins, crawler infrastructure, and Cloudflare infrastructure in the process

Rather than just constantly pinging about URLs, and wasting server resources, Cloudflare can monitor for actual changes and only tell them when the content is worth re-indexing after a change.

Shortly after launching, Cloudflare enabled IndexNow inside Crawler Hints.

 

What is IndexNow?

IndexNow is a new way of pinging search engines about your content. Much like the old WordPress ping system, or even similar to submitting a sitemap, IndexNow pings your URLs to supported search engines to help keep them updated.

In its simplest form, IndexNow is a simple ping so that search engines know that a URL and its content has been added, updated, or deleted, allowing search engines to quickly reflect this change in their search results.

It’s nothing super-advanced, but every bit helps when it comes to pinging search engines about your new URLs.

 

How to use IndexNow on Cloudflare

  1. Go to Caching > Configuration in the sidebar;

cloudflare configuration menu

2. Enable the tickbox under the Crawler Hints section

enabling crawler hints

It’s that easy.

 

Is IndexNow Worth Using?

Well, it takes 2 minutes to turn on with Cloudflare and it’s free. So yes, IndexNow is worth using.

Does IndexNow actually make a difference?

That’s what we’re yet to find out. But even if it’s only Bing and some other search engines using it, it’s free, so it’s worth just throwing on. You never know.

Does Google Support IndexNow?

No, Google does not currently support the IndexNow protocols. However, Google has shown interest so they may support it in the future.

 

So, for 2 minutes of work you can hopefully improve your site’s indexation in tier-2 search engines.