
The Top Programmatic SEO Tools


Programmatic SEO has started to take off, with more and more people jumping on the bandwagon.

This is leading to more and more tools being created specifically for programmatic SEO.

So what are the top tools at the moment? What should you use for your new project?


What I look for in a tool

Analysing the top tools, one of the main things I’ll be looking at is what advanced features each tool offers, versus just using a Google Sheet to hot-swap in different dynamic variables.

Each to their own for this, and there are many that may disagree, but a good programmatic SEO tool will allow some sort of rules that go above and beyond a basic substitution formula in Google Sheets.

Yes, mass page creators have, and still will, make many people good money.

But a true programmatic SEO tool should allow you to go above and beyond, offering some sort of rule system that takes the data points into account and outputs different text based on their values.
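To make the distinction concrete, here is a minimal JavaScript sketch of the two approaches. The function names, template syntax, and data shape are my own illustration, not any particular tool’s API:

```javascript
// Plain substitution: swap {variables} straight into the template, 1:1.
function plainRender(template, row) {
  return template.replace(/\{(\w+)\}/g, (_, key) => row[key] ?? '');
}

// Rule-based rendering: inspect the data values and output different
// text depending on what they contain, instead of a straight swap.
function ruleRender(row) {
  const parts = [`Properties in ${row.suburb}.`];
  if (Number(row.listings) === 0) {
    parts.push('There are currently no listings here, but check back soon.');
  } else if (Number(row.listings) === 1) {
    parts.push(`There is 1 property currently listed in ${row.suburb}.`);
  } else {
    parts.push(`There are ${row.listings} properties currently listed in ${row.suburb}.`);
  }
  return parts.join(' ');
}
```

The rule-based version is what stops awkward outputs like “There are 1 properties” or sentences referencing data points that are empty for that row.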

I’ll also dig into what options are available for data upload & formatting.



PageFactory

Launched in 2022 by Allison Seboldt, PageFactory is a programmatic SEO-specific tool that allows for bulk post creation from your data set.

Let’s dive in and take a look at PageFactory, and see whether this is the pSEO tool for you.



The overall process is super simple to run through. Just a few settings, your data upload, and then your content template and you’re ready to export.



You’ll upload a CSV file of all the data points you’d like included, and will then need to manually enter all the column titles of the data set so that they’re usable.

This can obviously get annoying after a while for larger data sets, with auto-column extraction & assignment being an easy win here.


Content Templating

The PageFactory template system runs on a per-row basis, outputting one piece of content per row of data. You’ll write out the content template you’d like to use, and then just insert the variables with a little selector from a list you set up in the settings.
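The one-row-one-page behaviour can be sketched as follows. This is purely illustrative (PageFactory’s internals aren’t public); it just shows the same template applied to every data row:

```javascript
// One content piece per row: apply the same template to every data row.
function renderAll(template, rows) {
  return rows.map(row =>
    template.replace(/\{(\w+)\}/g, (_, key) => row[key] ?? '')
  );
}

const rows = [
  { city: 'Melbourne', channel: 'buy' },
  { city: 'Sydney', channel: 'rent' },
];
const pages = renderAll('Best suburbs to {channel} in {city}', rows);
// pages[0] → 'Best suburbs to buy in Melbourne'
```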



Integrations

PageFactory currently integrates with both WordPress and Webflow, so it covers a couple of the top blogging platforms people are using at the moment. It does appear she’s working on a Shopify integration, however she may be facing some logistical issues with how Shopify integrations work, so you’ll have to stay tuned for that one.

There is also an alternative of downloading the content, rather than leveraging an integration. This would allow you to bulk import the content into whatever CMS it is you’re using.



Overall, PageFactory is great for a person just jumping into programmatic SEO who wants to play around with creating some content.

This would be perfect for the smaller pSEO projects, or projects where every row has every data point available.

It’s missing some more advanced rules for content creation, but Allison is still working on the tool so no doubt more advanced options will be added in the future.



Typemat

Typemat offers a basic programmatic page creator, where you paste in your data set and then create templates from that.



Typemat gives an easy process of adding data, creating templates, and then posting.



You just select all your data, and then literally paste it into the Typemat interface, rather than upload anything.

This is a nice quick option for importing data, but I certainly wouldn’t see it working well with larger datasets.


Content Templating

The templating system is a bit basic here, with a no-rule style addition of the data points.



Integrations

Typemat will only work with WordPress.

Whilst the majority of builds will work with this anyway, it probably won’t be the solution if you’re looking at building with anything else.



The Typemat videos recommend setting up your content sections in Google Sheets, possibly with AI if you want, and then uploading that data. So it’s more of a CMS that handles the bulk posting and the piecing together of the content, rather than a ‘raw data to final product’ solution.

It’s easy to kick a project off with, but you may prefer alternatives that give you a bit more control from within the software.



SEOmatic

SEOmatic was launched in October 2022 by Minh Pham, and offers both its own CMS you can connect a custom domain to (great option!) and integrations with many of the top CMSs.

Is SEOmatic going to be the programmatic SEO tool of choice for you? Let’s take a look!



The page creation process is a little more in-depth, and may take a little bit longer as there are a few more settings than the other tools.

Considering the integration options that are available though, it’s a pretty simple process to get some pages live via the tool, both directly and via an integration.



Importing your CSV dataset will auto-import all the columns, and then allow you to select whether you’re adding it to the data or whether you’ll ignore the data point.


Content Templating

You can create both an excerpt and full-page content, allowing you to specify the summary separately.

There are some advanced IF and IF/ELSE rules available. I’ll have to dig in a bit further to try these out, so will update once I have had the chance to do so.

Spintax is accepted, however it uses square brackets instead of the usual curly braces, {}, that spintax is normally written with. So you’ll need to reformat your text if you’ve already got it in standard spintax.
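If your spintax is already written with curly braces, a quick conversion saves reformatting it by hand. This is a rough sketch of my own, assuming your templates don’t nest braces and that plain template variables like {city} contain no pipe:

```javascript
// Convert standard curly-brace spintax, e.g. {great|awesome|fantastic},
// into the square-bracket form: [great|awesome|fantastic].
// Segments without a pipe (plain variables like {city}) are left alone.
function toSquareSpintax(text) {
  return text.replace(/\{([^{}]*\|[^{}]*)\}/g, '[$1]');
}

toSquareSpintax('a {great|fine} tool'); // → 'a [great|fine] tool'
```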

SEOmatic is also currently testing AI prompting from within the content templating. This will let you use your variables for the prompts, so once you’ve nailed the prompts you can scale generation from within the tool! Great feature.



Integrations

Many. They basically offer integration with anything you’re likely to use, with the current selection including: WordPress, Webflow, Shopify, Notion, Bubble, Prestashop, Framer, Wix, Typedream…

The integration options feature auto-publishing to the platforms, so basically it’s just – create content, publish pages.



If you’re looking for a few more advanced programmatic SEO page creation features, SEOmatic may be the tool you’re after.


Google Sheets

Google Sheets will always be a go-to for me, as I completely control the input, transformation, and output.

I put together a template, and some basic instructions, on how to dynamically generate content in Google Sheets here.

This will let you do the standard dynamic replacement in a dataset by just typing out the content template, inserting your variables where required, and then copying the outputs wherever you need them.


The best programmatic SEO tool

There are a few options, but the best tool is going to come down to what will work best for you.

Do you just need a few dynamic variables switched out, and then the bulk content exported?

Do you need something with more options?

Do you want auto-posting to WordPress or a similar CMS?


Stay tuned, as this list is still a work in progress. I’ll also be getting more hands-on with some of the tools, building some sample sites out to see what they’re capable of.

Subdomain to Subfolder: The Simple Cloudflare Reverse Proxy


For years, there has been a subdomain versus subfolder debate.

One side says this, another side says that.

People have run their own tests and shown that moving a subdomain to a subfolder can improve a blog’s rankings, yet so many have been in denial because Google has said the opposite.

Well, Aleyda Solis came out with direct test results, and a lot more people have finally jumped on the subfolder bandwagon.

Most of the time though, running a blog off a subfolder instead of a subdomain isn’t technically feasible.

It’s a pain to set up.

It was a shocking process the last time I tried to do it with a larger site, 4 or 5 years ago.

However, lucky for us things have changed over the last couple of years.

If you use Cloudflare, you can now have a blog installed on a subdomain, yet force it to load, and make it look like it exists, under a subfolder.


Aleyda’s test

Kicking it off here, this is the tweet that stirred things up again.

Clearly highlighting the tech challenges she went through to get it going, Aleyda shows there’s definite growth following the migration to a subfolder.

An almost instant improvement.

There are plenty of caveats that could exist here, as it’s SEO after all… but once the migration was complete you can see a nice upwards trend.

In some projects like this, I have seen what I dub a ‘new shiny’ effect: new URLs getting a little boost when discovered/migrated to, and then sometimes a bit of a drop-off afterwards.

I reached out for an update to see if there was such a dip, and Aleyda was kind enough to provide a new graph;

No post-launch dip! Nice.


Reverse proxies

Before we get into the different methods of setting this up, you need to understand reverse proxies.

Well, the super basics of them anyway…. which is where my knowledge of them stops anyway lol.

A reverse proxy allows you to essentially mask your website’s true file location, and load it somewhere else.

Cloudflare, in general, acts as a reverse proxy by being a CDN. It masks your server’s true location by forwarding requests via a Cloudflare ‘middle-man’ to it.

But this runs as URL in, URL out: the URL is the same before and after the request, and only the IP serving the content is modified.

We can tweak and override this a bit, so that the URL of the request, differs from the URL of the content location.

So you could have a blog installed on the subdomain, and leave it there, but make it act and look like it’s actually installed on a subfolder.

A request will come through to the subfolder URL, the reverse proxy will grab the content available at the subdomain, and then load it as if it actually exists at the subfolder.
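As a rough sketch, with example.com standing in for your real domains, the URL mapping a reverse proxy performs looks like:

```javascript
// Map a public subfolder URL to the subdomain where the content really lives.
// The example.com URLs are placeholders for your own setup.
const SUBFOLDER = 'https://www.example.com/blog';
const SUBDOMAIN = 'https://blog.example.com';

function originUrl(requestUrl) {
  // The request arrives at the subfolder; content is fetched from the subdomain.
  return requestUrl.startsWith(SUBFOLDER)
    ? SUBDOMAIN + requestUrl.slice(SUBFOLDER.length)
    : requestUrl;
}

originUrl('https://www.example.com/blog/my-post/');
// → 'https://blog.example.com/my-post/'
```

The visitor only ever sees the subfolder URL; the subdomain fetch happens behind the scenes.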

It allows you to bypass the biggest tech issue of running a blog on a subfolder: managing multiple different technologies installed in the same location.

Well, something along those lines anyway.


Subdomain to Subfolder with Nginx

Two previous blog moves from subdomain to subfolder that I have helped with involved Nginx reverse proxying.

Nginx is a server technology that does server things, but one thing it does is route traffic. You can give it filters or rules, and tell it to send specific requests one way, or another.

It’s like a little middleman that can move your site’s traffic around.

Using this, you tell it to reverse proxy your subfolder requests to your subdomain, and have it look like everything loads under the subfolder.

Nginx has mostly been a more enterprise-level setup, so there’s a good chance you might not be using it.

If you have Nginx installed, here is a detailed guide on using it as a reverse proxy.


Subdomain to subfolder with Apache/.htaccess

Similar to Nginx, Apache is another server tech that does similar things.

In particular, its .htaccess file allows you to set this sort of thing up.

If you’re on a typical web hosting setup, this is the most likely setup you’ve got going.

To jump into a subfolder migration with the .htaccess, you can find a detailed guide here.


Subdomain to subfolder with Cloudflare

Now, this is my new favourite.


Why? Because I can do it without any tech involvement, and it is independent of any other server tech.

And in under 2 minutes! Pretty sweet.

No messing around, it’s magical.

Many sites will also run Cloudflare before Nginx/Apache is hit, so it will work across both and be a bit more flexible.

Today, I will show you how you can do it too.


How to set up a reverse proxy with Cloudflare

It first started with a guide I found here.

There were a few others floating around, but this was the best one I could find to match the specific requirement of reverse proxying content from a subdomain to a subfolder leveraging Cloudflare.

However, it unfortunately didn’t work for me. It was close, but the WordPress side of things kept failing.

It took a few goes to work out whether it was the server (Siteground has some fun caching :/) or whether it was the Cloudflare setup.

I tried modifying the DB directly, some WordPress config scripts, and many other changes: missing CSS files, bad redirects, and constant server errors. Every time I patched something, something else would break.

I gave up, and called in some dev support.

A dev named Dat came through, and sorted me out.


Steps to setup the reverse proxy

The following instructions will help you get your reverse proxy set up in both Cloudflare and your WordPress install.

1. Create the Cloudflare workers
  • Log into your Cloudflare account, but don’t load up any of the sites, and you’ll see the ‘Workers & Pages’ setting option;

  • Jump in here, and click ‘create application’

  • Then find ‘create worker’

  • Name the first one;


Modify the sitename to be your actual site name. The name itself can be anything, but including the sitename helps in case you want to do this multiple times, as each worker could be loaded under any domain.

  • Click deploy, and it will load in a default script, which we will replace.
  • Click on ‘edit code’;

  • Delete the default code, and then paste in the first set of code from below, for the ‘reverse proxy worker’.
  • Modify any mention of the subdomain or subfolder URLs to be the settings you require.

Be careful not to modify any existing, or add any new, trailing slashes or https mentions as they will break everything.

  • Click on ‘Save and deploy’ in the top right corner, and then ‘Save and deploy’ again on the little popup modal

  • Repeat the above steps for the redirect worker, named “sitename-redirect-worker”, and get that one deployed too.


2. Setup the Cloudflare routes
  • Open the website you wish to add the routes for, and then find ‘Workers Routes’

  • Click on ‘add route’


  • Create routes for the following two URLs (modifying them to match what you need), by selecting the redirect service worker you created, and ‘production’ environment



  • Create routes for the following two URLs (modifying them to match what you need), by selecting the proxy service worker you created, and ‘production’ environment



  • After both sets of 2 routes have been created, you will see something similar to this on your Workers Routes page;


3. Modify the WordPress Site URL

The easiest step of them all.

Load up the WordPress admin area, and jump into Settings > General.

Modify the Site Address, and not the WordPress address, as per the below settings;


4. Add a trailing slash redirect for the sub-folder

After all this, we unfortunately couldn’t get one final issue solved. The blog homepage was available with both the trailing slash and no trailing slash. Just the homepage; everything else works beautifully.

  • To get this patched up, we load up Cloudflare and head to the Rules > Redirect Rules

  • Click ‘Create rule’ under the single redirects section

  • Create a rule that uses the non-trailing slash blog URL version as the incoming requests rule, and the same URL but with a trailing slash as the URL to redirect these requests to

5. Implement a redirect strategy

If this is an existing build you’re modifying, make sure you implement a full 301 redirect strategy! It should just be a simple 301 rule that forwards from the sub-domain to a sub-folder, but triple check it all.

There’s no point moving to a sub-folder if you break everything along the way.


The scripts


Reverse proxy worker

// Replace the example.com URLs throughout with your own subfolder and subdomain.

addEventListener('fetch', event => {
  // Skip the proxy for WordPress preview posts
  if (event.request.url.includes('&preview=true')) { return; }
  event.respondWith(handleRequest(event.request));
});

// Rewrites link/src attributes in the returned HTML so that
// subdomain URLs are swapped for their subfolder equivalents
class AttributeRewriter {
  constructor(rewriteParams) {
    this.attributeName = rewriteParams.attributeName;
    this.old_url = rewriteParams.old_url;
    this.new_url = rewriteParams.new_url;
  }
  element(element) {
    const attribute = element.getAttribute(this.attributeName);
    if (attribute && attribute.startsWith(this.old_url)) {
      element.setAttribute(this.attributeName, attribute.replace(this.old_url, this.new_url));
    }
  }
}

// Map of public subfolder URLs to the subdomains that actually serve them
const rules = [
  {
    from: 'https://www.example.com/blog', // the public subfolder
    to: 'https://blog.example.com'        // where the blog is installed
  }
  // more rules here
];

const handleRequest = async req => {
  // Redirect WordPress login requests back to the subdomain
  let baseUrl = req.url;
  if (baseUrl.includes('/wp-login.php')) {
    return new Response('', {
      status: 302,
      headers: { 'Location': baseUrl.replace('https://www.example.com/blog', 'https://blog.example.com') }
    });
  }

  const url = new URL(req.url);
  const fullurl = url.origin + url.pathname;
  let newurl = req.url;
  let active_rule = { from: '', to: '' };

  // Find the matching rule and swap the subfolder URL for the subdomain one
  rules.forEach(rule => {
    if (fullurl.startsWith(rule.from)) {
      newurl = req.url.replace(rule.from, rule.to);
      active_rule = rule;
    }
  });

  // Fetch the real content from the subdomain
  const newRequest = new Request(newurl, new Request(req));
  const res = await fetch(newRequest);

  // Rewrite any subdomain URLs in the HTML back to their subfolder versions
  const rewriter = new HTMLRewriter()
    .on('a', new AttributeRewriter({ attributeName: 'href', old_url: active_rule.to, new_url: active_rule.from }))
    .on('img', new AttributeRewriter({ attributeName: 'src', old_url: active_rule.to, new_url: active_rule.from }))
    .on('link', new AttributeRewriter({ attributeName: 'href', old_url: active_rule.to, new_url: active_rule.from }))
    .on('script', new AttributeRewriter({ attributeName: 'src', old_url: active_rule.to, new_url: active_rule.from }));

  // Don't run the HTML rewriter over JS or XML responses
  if (newurl.indexOf('.js') !== -1 || newurl.indexOf('.xml') !== -1) {
    return res;
  } else {
    return rewriter.transform(res);
  }
};


Redirect worker

// This worker runs on the subdomain and 301s everything (except the
// WordPress admin paths) to the subfolder. Set base to your subfolder URL.
const base = "https://www.example.com/blog"
const statusCode = 301

async function handleRequest(request) {
  const excludedPaths = ['/wp-login.php', '/wp-admin', '/wp-admin/']
  const url = new URL(request.url)
  const { pathname, search, hash } = url
  const destinationURL = base + pathname + search + hash

  // Let admin/login requests pass through to the subdomain untouched
  if (excludedPaths.some(path => pathname.startsWith(path))) {
    return fetch(request)
  } else {
    return Response.redirect(destinationURL, statusCode)
  }
}

addEventListener("fetch", event => {
  event.respondWith(handleRequest(event.request))
})


Your subdomain to subfolder setup should be live

Following the steps above, your blog should now be publicly accessible under the sub-folder, and the admin panel will be accessible under the original sub-domain.

You should be able to directly load the new subfolder, and it’ll work as if you’re accessing the subdomain where it is actually installed.

I’d love to hear how it goes if you went ahead with the change! Both whether the instructions above completely worked for you, and how performance was if it was an existing build you modified.

Dynamic Content Generation with Google Sheets


If you haven’t been following along, I’ve been documenting my journey to a 1,000+ page programmatic site in WordPress using Google Sheets.

The build is heavily reliant on Google Sheets to generate dynamic content based on templates.

The content there is a little more advanced, as it is leveraging rules to determine which content template is used.

I thought I’d take it back to basics, and create a dynamic content generator that was a bit simpler.

A straight-up data-to-content text replacement.


Dynamically generating text in Google Sheets


The data set

The generator starts with your data set.

Each column represents a different piece of data.

You wouldn’t need to use every single piece of data you put in the spreadsheet; however, you will need to ensure any data point you do want to use is there.

Each of the columns is a variable name that will be taken by the generator and replaced out.


The dynamic text template

On the second sheet is a cell where you write out your text template.

You just write out the text you’d like to include, in the format you want it, and include the variables wherever you’d like them.

Next to the text template is a list of all your variables.

This is just a list of every header name from the first sheet, and it’s a great reference to help you remember what you have to work with, rather than needing to flick back and forth to your data set.

Since it’s formula-driven, you can’t copy/paste the variables directly. However, if you’re not adding more columns to the dataset, you could paste the list as raw data so that you can just copy them in.


The content generation formula

You’ll find the actual formula that does the replacement on the main generator sheet.

The formula might look a little daunting, but it’s just a large nested substitute.

Each heading has “<” and “>” added on either side to convert it into something usable in the text templates.

The formula will then take these variables, and substitute them for the value in that column.

It will then repeat the process.
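In code terms, the nested SUBSTITUTE chain is just a loop over the column headers. A JavaScript sketch of what the formula does (my own illustration, not part of the sheet itself):

```javascript
// Replicates the nested SUBSTITUTE chain: for each column header,
// replace every occurrence of <header> in the template with the row's value.
function generate(template, headers, row) {
  let out = template;
  headers.forEach((header, i) => {
    out = out.split('<' + header + '>').join(row[i]);
  });
  return out;
}

const headers = ['suburb', 'medianPrice'];
const row = ['Fitzroy', '$1.2m'];
generate('The median price in <suburb> is <medianPrice>.', headers, row);
// → 'The median price in Fitzroy is $1.2m.'
```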

To add a new variable in;

1. Insert a column before the contentOutput and fill in your data.

2. Add an additional SUBSTITUTE( at the front of the formula

3. Copy the data after the 2nd to last bracket, and paste it after the last bracket

4. Modify the cell references to instead reference your new column

So if you inserted one into the current template, the formula would go from;

=SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(Template!$A$2,"<"&$C$1&">",$C2),"<"&$D$1&">",$D2),"<"&$E$1&">",$E2),"<"&$F$1&">",$F2),"<"&$G$1&">",$G2),"<"&$H$1&">",$H2),"<"&$I$1&">",$I2)

To become;

=SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(Template!$A$2,"<"&$C$1&">",$C2),"<"&$D$1&">",$D2),"<"&$E$1&">",$E2),"<"&$F$1&">",$F2),"<"&$G$1&">",$G2),"<"&$H$1&">",$H2),"<"&$I$1&">",$I2),"<"&$J$1&">",$J2)

Note the new modifications: the extra SUBSTITUTE( at the front, and the new reference appended at the end.

Since the new column of data was J, we copied the end and then changed the I to a J.


Content template preview

A preview column has also been added to the text template tab.

Once your generation formula is set up with all your variables, you will be able to see a live generation example as you edit your template.

This will allow you to see exactly what the text template looks like when you substitute some of your real data into it.

The preview will randomly select one of your data rows every time you update the cell.

It’s a great way to ensure you’re accounting for different outputs in your template as you craft it.


Access the Google Sheet

You can access the dynamic content generation Google Sheet at the below link.

Let me know if you have any questions!

Feel free to leave a comment, I’d be happy to help out.


Using Data to Determine What Filters Should be Targeted


Bedrooms, and bathrooms, and price ranges, oh my!

There are so many filters that could be used in the pretty URL, but what should be used?

What should earn the “pretty” status, and what should be relegated to query parameter status?

Some may say everything goes in pretty.

Some may just pick a few, and leave it at that.

Well, let’s take a look at how you could use data to inform your decision.

I’m not going to go into why you shouldn’t be creating pretty URLs for everything, that’s a separate post.

What we will run through are ways to set up keyword data, for you to gain insights about search trends of the filters you’ve got, so that you know what you should be optimising for.

The data you’ll need

To run this analysis, you’ll ideally just need an extremely large keyword research piece specifically for your niche.

It will need to be rather large to ensure you can get solid data, and you’ll also need it to be pretty clean, or at least understand the caveats of your data.

If you’ve also got a tonne of GSC data to mix in, then that would be great. That will help extend the data to cover a good portion of the 0-volume keywords that might not show up in keyword research tools.

For my examples below, I just spent 20 minutes pulling together some real estate keywords for 3 Australian cities, and ~15 suburbs from a “Top Suburbs in Melbourne” type list.

I wanted to limit it to location-specific keywords, as without a giant list of seed locations it can be hard to compare location versus non-location keywords without significant time spent. It makes things messy.


Setting up your data

Whether you’re using Excel or Google Sheets, you’ll need to create a data table to power the analysis.

The data table should contain all the keywords, their volumes, and associated categorisation.


Create categories for each filter

You’ll need to create categories for each of the filters that you are trying to analyse and work out whether they should be optimised for.

Go through and create columns, and set up their categorisation with rules for each possible value so that you can capture as many as possible.

For my example, I am using real estate keywords. A column has been created for each filter I’d like to analyse, along with categorisation for each of them.

Each filter has its seed values assigned, along with what the value is for that variable.

If a keyword contains either ‘buy’ or ‘sale’, it gets flagged as “Buy”.

If a keyword contains ‘1 bed’ or ‘1br’ it gets flagged as 1 Bedroom.
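The containment rules above can be sketched like this; the seed lists are illustrative, and you’d build them out much further in practice:

```javascript
// Flag a keyword with a category value when it contains any seed phrase.
function categorise(keyword, seedMap) {
  const kw = keyword.toLowerCase();
  for (const [value, seeds] of Object.entries(seedMap)) {
    if (seeds.some(seed => kw.includes(seed))) return value;
  }
  return ''; // no match: the filter cell stays empty
}

const channelSeeds = { 'Buy': ['buy', 'sale'], 'Rent': ['rent', 'rental'] };
const bedroomSeeds = { '1 Bedroom': ['1 bed', '1br'], '2 Bedroom': ['2 bed', '2br'] };

categorise('1br apartments for sale melbourne', channelSeeds); // → 'Buy'
categorise('1br apartments for sale melbourne', bedroomSeeds); // → '1 Bedroom'
```

One keyword can pick up a value in several filter columns at once, which is exactly what the pivot tables later rely on.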

You can read more about how this works here.

You’ll want to be as thorough here as possible, and include as many variations as possible.

A couple of missed ones could really sway a decision.

Try and also create a catch-all category at the end for filters with only a variable or two.

I created one for ‘features’ based on what I was seeing in the keyword data.


Prefiltering / cleansing the data

Depending on how clean your keyword data is, it might be better to just look at a portion of it.

A portion you know is 90% cleaner than the rest of the data.

For my real estate keywords, I know that if the keyword includes a channel, so ‘sale’, ‘buy’, ‘rent’, or ‘rental’, there is a higher chance of it being a keyword of enough quality for the study.

To include keywords that don’t include a channel (like ‘real estate’ or ‘properties’), I also include keywords including a bedroom or bathroom filter.

This is done via a YES/NO filter that just flags a keyword as YES if any of its filter cells have something inside them.
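That YES/NO flag is simply “does any filter column have a value”, which can be sketched as:

```javascript
// A row qualifies for the cleaned set if any of its filter cells is non-empty.
function includeRow(filterCells) {
  return filterCells.some(cell => cell !== '') ? 'YES' : 'NO';
}

includeRow(['Buy', '', '']); // → 'YES'
includeRow(['', '', '']);    // → 'NO'
```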

All my data analysis will have this filter applied, and it brings the keywords down from 9,000 to just 2,000.

I know those 2,000 are trustworthy to tell a story.


Creating your pivot tables

You’ll now need to create pivot tables so that you have a way to read the data.

Go through and create a pivot table for each of your filters with the below data;

  • Filter as the row
  • Search volume SUM as value
  • Search volume COUNT as value
  • Search volume COUNT shown as % of grand total as value

The SUM should be obvious, being that it will be the total amount of search volume for each of the filter values.

The COUNT will be how many times that filter value is used among the keyword set.

The COUNT as a % of total shows the actual percentage of keywords that use each filter value. A little quicker to analyse than the overall count alone.
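The three pivot values for one filter can be computed like so. This is a sketch of my own over a simple keyword/volume list, not a reproduction of any spreadsheet internals:

```javascript
// For one filter column, compute SUM of volume, COUNT of keywords,
// and COUNT as a % of the grand total, per filter value.
function pivot(rows) {
  const total = rows.length;
  const out = {};
  for (const { value, volume } of rows) {
    if (!value) continue; // skip rows where the filter is empty
    out[value] = out[value] || { sum: 0, count: 0 };
    out[value].sum += volume;
    out[value].count += 1;
  }
  for (const v of Object.values(out)) {
    v.pctOfTotal = Math.round((v.count / total) * 100);
  }
  return out;
}

const sample = [
  { value: 'House', volume: 500 },
  { value: 'Apartment', volume: 300 },
  { value: 'House', volume: 200 },
  { value: '', volume: 50 }, // generic keyword, no property type
];
pivot(sample);
// House: sum 700, count 2, 50% of total; Apartment: sum 300, count 1, 25%
```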


Analysing and selecting your filters

Now we’ll get to read our data and see what we can make of it.

Let’s take a look at my property-type keywords.

We can see that of the 2,000 keywords included, 85% mention a property type. So only 15% are more generic keywords like ‘properties’ or ‘real estate’.

Even if you consider ‘homes’ as generic, that’s still less than a quarter of the keywords without a property type.

So yes, property type 100% needs to be optimised for.


Looking at the features keywords.

Only 2 keywords include pets, 2 with pools, and then 1 mentioning granny flat. If these were the only filter values available, I would not be optimising for them.

Similar story with the bathrooms keywords.

Only 2 keywords contain a bathroom-related phrase. Probably wouldn’t recommend targeting that in bulk.

Now onto the 2 that are a bit contentious when it comes to real estate sites.

The first one being bedrooms.

Bedrooms is one I personally recommend against optimising for directly under normal circumstances. At least at first anyway.

I feel it creates too many variations of URLs, with not enough reward/value in return for doing so. Can be worth targeting once all indexation/crawling boxes are ticked, especially with some rules in place, but maybe not directly out the gate.

In saying that, looking at the data 10% of the keywords (7% of total volume) include a bedroom value.

Is that enough to warrant targeting of it? Maybe.

But if we break that data down a bit further, and split out the city (Melbourne) from the ~15 suburbs, we see something a bit different.

16% of the city keywords (14% of volume) contain a bedroom term, versus only 5% (1% of volume) of the suburb keywords.

So that’s 1 location having a significantly larger share of keywords including it than the 15 other locations combined.

So if you create pages equally amongst cities & suburbs, you’re going to be creating significant volumes of pages when only a small portion of them will be useful.

Yeah, long-tail value and all that. I’m not saying definitely don’t; I’m just advising against it without restrictions in place.

A similar situation is with the prices.

Pretty low volume for the majority of the keywords that include a price (normally ‘under xxx’ type keywords).

And if we break it into city vs suburb, we get;

None of the suburb keywords in this data include a price. It’s only at the city level.


Why some filters may not be worth targeting

I’m a big believer in limiting crawlable URLs where possible.

Minimising re-use of listing content, and avoiding the possibility of confusing Google too much.

Keeping the site as small as possible, whilst still targeting as much as possible.

So why would I recommend avoiding creating bedrooms or pricing optimised URLs in bulk?

Well, it comes down to page count.

Crawlable page count to be specific.

Let’s say you have a location set of 5,000 locations.

10 property types.

and 2 channels.

You’ve already got 100,000 crawlable URLs right there.

If you then have 7 bedroom options, you’re looking at 700,000 URLs in addition to the 100,000 that already exist, all of which Googlebot will have to constantly trawl through.
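The page-count arithmetic can be sanity-checked in a few lines:

```javascript
// Crawlable URL counts from the example: 5,000 locations x 10 property
// types x 2 channels, then multiplied again by 7 bedroom options.
const locations = 5000, propertyTypes = 10, channels = 2, bedroomOptions = 7;

const baseUrls = locations * propertyTypes * channels;   // 100,000 URLs
const bedroomUrls = baseUrls * bedroomOptions;           // 700,000 extra URLs
const growthPct = (bedroomUrls / baseUrls) * 100;        // site grows by 700%

console.log(baseUrls, bedroomUrls, growthPct); // 100000 700000 700
```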

Is it worth enlarging your site by 700% to target an extra 7% in search volume?

If you think so, then go for it.

That’s also assuming you do it cleanly. If you have other filters with crawlable links on the site, that overall crawlable URL count will only increase.

So if you’re creating significant page volumes off of smaller % filters like this bedrooms count, you must ensure you have your crawling well in check before you launch.

That way you can avoid exacerbating any existing issues.

There are other ways of efficiently targeting these types of keywords though.

In particular, I cover a targeting strategy here for these filters that may have value at key locations and not others, by having a couple of tiers of locations.


Picking your filter values

To try and keep some filters in check too, you can also optimise the system so that only certain values of a filter get optimised.

Using the bedrooms as an example, you might choose to just create pretty URLs for studio, 1 bedroom, 2 bedroom, and 3 bedroom apartments. 4+ bedrooms would then be relegated to the query parameter, and not receive the internal links pointing into it.
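That whitelisting decision can be sketched like this; the path structure and value names are illustrative only:

```javascript
// Only whitelisted filter values get a crawlable 'pretty' path;
// everything else is relegated to a query parameter.
const prettyBedrooms = ['studio', '1-bedroom', '2-bedroom', '3-bedroom'];

function bedroomsUrl(location, bedrooms) {
  return prettyBedrooms.includes(bedrooms)
    ? `/apartments/${location}/${bedrooms}/`
    : `/apartments/${location}/?bedrooms=${encodeURIComponent(bedrooms)}`;
}

bedroomsUrl('melbourne', '2-bedroom'); // → '/apartments/melbourne/2-bedroom/'
bedroomsUrl('melbourne', '4-bedroom'); // → '/apartments/melbourne/?bedrooms=4-bedroom'
```

Internal links would then only ever point at the pretty versions, keeping the parameterised URLs out of the crawl path.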


Let the data guide your optimisation

By leveraging this keyword data you can really gain an insight into what filters, and values, you should be optimising for.

Plenty of caveats, particularly around longer tail keywords that tools won’t give you, but there should be more than enough data to at least guide an initial decision.

It’s also easier to expose a filter later on than to clean up the over-indexation caused by one if it needs to be reverted.

There’s also the other question here: is it even worth putting in the work to have separate ‘pretty’ and ‘parametered’ filters?

I’ll leave it to you to decide.

Bulk Download Images from URLs – Convert & Rename


Batch downloading images and renaming them isn’t something you’ll use often, but when you do, doing it in bulk can really save you quite a bit of time.

Why would you want to do it?

Well, plenty of reasons. From migrating images, wanting to bulk optimise their file names, or downloading imagery from brands for products you sell.

Using the attached Excel file, you can insert all the new image names, along with their current source URLs, and the macro will download all the images, give them the new names, and also save them as .jpg files no matter their source extension.

How to batch download & rename images with Excel

The following is the process for downloading images from URLs in Excel and renaming them:

1. Download and open the Excel image URL downloader

2. Click on Developer > Macros and then hit edit on the selected one in the file


3. Edit the highlighted folder path so that it points to a folder that already exists. This is where the newly renamed images will be downloaded to. If you point it to a folder that doesn’t exist, the macro won’t visibly create it. However, if you create the folder after the fact, all the files will be there.


4. Insert all the new image names in, ensuring the new name includes dashes, and doesn’t include a file extension


5. Click on ‘macros’ and then click run on the selected macro


You’ll now get ‘File Downloaded Successfully’ on all the files that have downloaded from the websites okay.


You’ll see an error if it doesn’t work, with most errors I have seen attributed to being blocked by the source you’re scraping. Depending on the use, you can ask for your IP to be unlocked, but sometimes it’s just quicker to throw on a VPN.

Another issue I have seen is that if you include an extension in the name, you will end up with double image extensions like .jpg.jpg. You can modify the script to drop the appended .jpg extension if you really want, but it’s just easier to leave the extension out of the name.
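If you’d rather script this outside of Excel, here’s a rough Python equivalent of the macro. The rows and folder name are placeholders, and note that, like the macro, it simply names the files .jpg rather than truly converting the image format.

```python
# Sketch: download images from URLs and save them under new names as .jpg.
# A Python alternative to the Excel macro; rows and folder are placeholders.
import os
import urllib.request

def jpg_filename(new_name: str) -> str:
    """Append .jpg, guarding against the double-extension issue above."""
    for ext in (".jpg", ".jpeg", ".png", ".webp", ".gif"):
        if new_name.lower().endswith(ext):
            new_name = new_name[: -len(ext)]
            break
    return new_name + ".jpg"

def download_images(rows, folder):
    # Unlike the macro, create the destination folder if it's missing.
    os.makedirs(folder, exist_ok=True)
    for new_name, url in rows:
        target = os.path.join(folder, jpg_filename(new_name))
        urllib.request.urlretrieve(url, target)
        print(f"File Downloaded Successfully: {target}")

# Example usage (placeholder URL):
# download_images([("red-widget", "https://example.com/img/IMG_1234.png")], "images")
```

If a source blocks you, you’ll hit the same errors described above; the same VPN workaround applies.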

Download the Excel Image Downloader

Successful bulk image download from URL & Rename

This URL image downloader should have now successfully downloaded and renamed all the images you will need, extremely fast.

Bulk Redirecting URLs in Cloudflare


For years, Cloudflare has had the page rules section where you can set up redirect rules. But you’re only allowed 3 page rules per domain on a free account, which severely limits the redirect options, so many don’t use it.

However, at the end of 2021 Cloudflare released its new bulk redirect setup.

These extend the Cloudflare redirect functionality, and offer more flexibility with one-to-one redirects, rather than just a handful of rules.


What is the Cloudflare bulk redirect feature?

The bulk setup allows for the creation of redirect lists, where you can have significant quantities of one-to-one redirects set up.

These aren’t redirect rules, they’re literal one URL, to another URL redirects.

They’ve mentioned the following in their launch post however;

There is still a requirement for supporting more complex, rich URL redirects, e.g., device-specific URL redirects, country-specific URL redirects, URL redirects that allow regular expressions in their target URL, and so forth. We aspire to offer a full range of functionality to support as many use cases as possible.

So hopefully that means either this functionality, or additional, will be launched to support more bulk managed redirect rules.


How to bulk redirect URLs inside Cloudflare?

This is how you create a set of bulk redirects in Cloudflare.

1. Log into Cloudflare, and on the overall account management screen click on ‘Bulk Redirects’ in the left menu

2. Create a new redirect list first, so click the text link and then click ‘create new list’ on the new page that loads

3. Enter a name & description, and then make sure you click on ‘redirect’ content type and not the default IP selection.

4. You’ll come to a config page, where you can click ‘add items’

5. Enter all your one-to-one redirects manually, by entering the source and the target URL. You can also just upload a single CSV that has these two columns, and import that directly into Cloudflare. Keep in mind that HTTP and HTTPS URL versions will be treated separately, so either add both variations or have a separate page rule handle the HTTP > HTTPS portion.

6. Once you’ve created your list, go back to the bulk redirects page, and click the ‘create bulk redirects’ button

7. Add a name, and select the new redirect list you just created at the bottom of the page, and click ‘save and deploy’.
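For step 5, a short script can generate the two-column CSV, including both HTTP and HTTPS source variants. The redirects listed are placeholders, and you should check Cloudflare’s import dialog for the exact column layout it expects.

```python
# Sketch: build the two-column CSV of one-to-one redirects for import,
# writing both http and https source variants, per the note in step 5.
# Placeholder URLs; verify the expected columns in Cloudflare's import dialog.
import csv

redirects = [  # (source host + path, target URL)
    ("example.com/old-page", "https://example.com/new-page"),
    ("example.com/old-blog", "https://example.com/blog"),
]

with open("bulk-redirects.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for source, target in redirects:
        writer.writerow([f"https://{source}", target])
        writer.writerow([f"http://{source}", target])
```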


Once that magical save and deploy button is hit, your redirects should pretty much be live. Go and test them with a redirect checker or similar tool to ensure you’re not seeing a cached response, and you should see the redirects live within a minute or two.


How many bulk redirects can be set up?

Cloudflare’s completely free accounts get just 20 bulk redirects. However, if you have even a single paying property ($20/month), you get access to 500 across all your sites, as these are set at the account level.

Definitely worth having a single account upgraded for these alone, considering how many you get and how easily they’re set up.


Should you be using this new Cloudflare feature?

It really depends on what you’re trying to achieve, and whether the quantities allowed fit into what you need.

If you have a dev team, and a full server setup, then yeah this won’t be what you want.

If you’re just moving URLs around on an existing site, then probably not either.

But if you’re deprecating one site, and merging it into another, where you don’t want to have any hosting maintained for the old domain, this is exactly the product you need.

Caching HTML in Cloudflare for Improved Page Load Speeds


Caching HTML can help you reduce your overall page load time, as the CDN can serve your HTML rather than having to request it from the server each time.

There are pros and cons to doing this, but if you’re a static site that doesn’t change too often, caching your HTML could be the perfect solution to improving page load speeds a bit.


Does Cloudflare cache HTML?

Yes, however not by default. You need to enable it with the below steps to create a page rule.


How to cache HTML with Cloudflare

Caching HTML in Cloudflare is easy, so just follow these steps.

1. Open Cloudflare, and click ‘Page Rules’ in the sidebar

2. Click ‘create page rule’ to the right of this page

3. Insert your site’s domain with a wildcard on the end, just like the Cloudflare example, and then select ‘Cache Level’ and ‘Cache Everything’ from the dropdowns.

And you’re done!

You should be able to run a test now, and compare to before HTML caching.
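One way to confirm the rule is working is to look at Cloudflare’s cf-cache-status response header: it reads HIT when the HTML was served from cache, and DYNAMIC or MISS when it wasn’t. A minimal sketch (the URL is a placeholder):

```python
# Sketch: check Cloudflare's cf-cache-status response header to confirm
# HTML caching is active. HIT = served from cache; DYNAMIC/MISS = it wasn't.
import urllib.request

def cache_status(headers) -> str:
    """Return cf-cache-status from a response-header mapping (case-insensitive)."""
    return {k.lower(): v for k, v in headers.items()}.get("cf-cache-status", "absent")

# resp = urllib.request.urlopen("https://www.example.com/")  # your URL here
# print(cache_status(dict(resp.headers.items())))

print(cache_status({"CF-Cache-Status": "HIT"}))  # HIT
```

Run it a couple of times: the first request after enabling the rule may still be a MISS while the cache fills.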

For me, here is my test.

Before HTML caching

After HTML caching

Yes, the site is already optimised, but a few key metrics were improved even further.

FCP dropped from 305ms to 112ms.

TTI dropped from 305ms to 112ms.

LCP dropped from 500ms to 444ms.

Not too bad at all!


Is it good or bad to cache HTML with Cloudflare?

It really depends. How static is your content?

If you have an extremely dynamic site, or even a news homepage style site, caching HTML might not be good for you.

Rather than a consumer constantly getting fresh content, they will get the HTML cache version until the cache is refreshed. This means they’ll only see the content available at the time of cache, and not anything that has been added or edited since.

Not just consumers, but Google too. We want Google to have the freshest content, so you might be holding yourself back by a day or two, along with annoying users, if you enable HTML caching.

However, if your site is extremely static, HTML caching can really help you knock a few hundred milliseconds off some key Core Web Vitals numbers.


Use HTML caching wisely, and it can help you improve your page load speeds… provided you’re a good fit for it.

GTMetrix Review: Top SEO Insights You’ll Get


One tool I will use every single time that I audit a website is GTMetrix.

I’ve seen comments that other speed testers provide “better data” and “more insights”, but GTMetrix does everything I want and helps me solve my problems.

As with every tool, you take its automated insights with a grain of salt. You leverage them to guide further insight gathering, or to back up a specific decision.

GTMetrix gives me the data to pass on to dev teams, and helps get issues patched.


Running a Speed Test with GTMetrix

It’s pretty simple to run a speed test in GTMetrix.

  1. Enter the URL you’d like to test.
  2. Change the location you’d like the speed test run from. Highly recommend you get this as close to your audience as possible.
  3. (OPTIONAL) Change the browser you’d like to test from. This is where you can also select a mobile browser if you’d like to run a mobile speed test.
  4. (OPTIONAL) Select the speed if you’d like to throttle the speed test. Throttling can help show more “true to life” bottlenecks, like with a poor mobile connection, and will also help smaller issues show up more easily as everything gets exacerbated.
  5. (PRO REQUIRED) Ensure you have the video test flicked on, if you’re a pro user. You’ll get a bit more useful insight.
  6. Click Analyze, and run your website speed test.

How to run a speed test with GTMetrix


How to interpret GTMetrix’s waterfall chart

The waterfall chart breaks down the exact points at when different resources are called, connected to, and downloaded.

Each resource is ordered based on when its loading starts.

All you ever really have to worry about here is when a specific resource is connected to, and when it finishes being downloaded. There are very few use cases you’ll run into as an SEO where anything in between is required.

GTMetrix waterfall chart example

Look through what items are being loaded and when, and then run through the standard checks from here to optimise.

Large files being loaded? Are specific requests taking too long? Too many files? External requests you didn’t know about?

Plenty of things to analyse here, but they’re very specific to each audit.

Each significant request stage is broken down by a coloured line. The following is what stage each of the coloured lines in the waterfall chart represent;

GTMetrix waterfall chart legend

You can find some more info on the waterfall chart from GTMetrix directly, here.


Page Load Video by GTMetrix

One of my favourite features, particularly to help better identify CLS issues, is the page load video.

Unfortunately, it’s a pro-only feature.

In saying that, it’s worth it to help out with these audits.

When running your original test, you can tick on video audit.

Or, when viewing an audit you can click ‘enable video and re-test’ and GTMetrix will re-run the test, including the video test this time.

Video of pagespeed test

Here’s an example video output, from the test I ran above.

You can play/pause the video, or run it at 1x, 1/2x, or 1/4x speed.


Testing Core Web Vitals with GTMetrix

When you run a GTMetrix test, you get a basic overview of your core vitals.

Core web vitals testing on GTMetrix

You can extract a little more information from the waterfall chart, and a few other places in GTMetrix, but this overview can help you delve into each specific CWV separately.


GTMetrix Alternatives

So, if this isn’t the tool for you, what other options are there?

The top few that come to mind are;


Is GTMetrix pro worth it?

Yes, I believe GTMetrix pro is worth it. Well, for me anyway.

If you have an alternate speed tester you’re using, then it probably wouldn’t be.

Other tools have similar features; I have just used GTMetrix for years now, so I’ve gotten used to it and understand what to look at a bit more than in the other tools.

Removing the Collection from a Product URL in Shopify


I was doing a quick audit of a friend’s website recently, and discovered something rather annoying about the site.

On a collection page, Shopify added the slug of the current collection to all the product links on that page.

So every time a product was listed in a different collection, it would have a different link pointing in.

Each of these links correctly pointed to the primary product URL with the canonical tag, but that tag doesn’t work like it used to.

Google is also assigning weight to each of the links coming into the page.

This caused a product URL that contained the collection, to compete with the actual product URL.

The URL from a collection page was: /<collection>/products/<product-slug>

Yet the actual product URL was: /products/<product-slug>


Why does Shopify add the collection to a product URL?

There are a couple of potential reasons, with the main one stemming from user experience.

When the collection is added on the URL, breadcrumbs can get updated to point back to that collection.

If a user enters from 2 different collections, they’d see two different breadcrumbs, and then be able to go back to their original ‘search’ via the links.

However, this minor improvement in UX is outweighed by the poor SEO performance it can generate.


How to remove the collection from a product URL in Shopify

You can remove the collection from the Shopify product URL, by following these steps;

1. Access your collections layout design, presumably located at Snippets > product-card-grid.liquid

2. Remove the collections portion of the product URL, by replacing {{ product.url | within: collection }} with {{ product.url }}

Solution/image source: Shopify


Collections folder removed from URL

The collections folder will now be removed from the product URL giving you a cleaner Shopify site.

The primary URL being linked to is essential for SEO, so this should provide good value once patched up.

Cumulative Layout Shift (CLS) – Causes, Testing & Fixes


CLS (Cumulative Layout Shift) is one of the fun ones inside Core Web Vitals (CWV), and can sometimes be a bit annoying to find exact causes.

Or even exact locations it happens on, as Google can be a little light on the examples sometimes.

What is Cumulative Layout Shift (CLS)?

Cumulative Layout Shift is a score assigned to a page based upon how much the page changes/moves around between the initial render and the final load.

It’s about assets loading in, and then whilst the page continues loading, others load in and push the original content around.

Long story short, it’s when a website annoyingly shifts just as you’re about to click a button, and you end up clicking the wrong thing because everything moved.
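For reference, each individual layout shift is scored as the impact fraction (how much of the viewport the unstable elements cover) multiplied by the distance fraction (how far they move relative to the viewport). A simplified sketch; the real CLS figure is the largest windowed burst of these scores, not a straight sum:

```python
# Sketch of how a single layout shift is scored (per web.dev):
# layout shift score = impact fraction * distance fraction.
# The page's CLS is then the largest "session window" of these scores.

def shift_score(impact_fraction: float, distance_fraction: float) -> float:
    """Score one layout shift from its impact and distance fractions."""
    return impact_fraction * distance_fraction

# An element covering 75% of the viewport that moves by 25% of the
# viewport height scores well over the 0.1 "good" threshold.
print(shift_score(0.75, 0.25))  # 0.1875
```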


Does CLS affect SEO?

Yes, CLS very much affects SEO.

It is one of the main elements that make up the Core Web Vitals, which Google is now taking into account as a ranking factor.

Yeah, it’s one of hundreds of different ranking factors, but when you’re talking about an apples-to-apples comparison with a competitor, I would very much rather know that I have ticked as many boxes as possible to help me rank a site.


Identifying that you have CLS issues

Your first port of call to check for CLS issues, or where you might have spotted them initially, would be Google Search Console.

On the “Core Web Vitals” tab in GSC, you’ll see some pretty charts that show how you’re doing across your URLs, for both desktop and mobile.

If you then click through to one of the reports, you’ll get a list of the issues that make up the Yellow or Red lines.

If one of them looks like the one below, you’ve got CLS issues;

Clicking on this, Google might give you a couple of example URLs. Chances are though, it’ll just be a single one, even for hundreds or thousands of URLs.

Google might be saying they’re “Similar” pages, but sometimes they will group completely separate page types in here so don’t fall for their trap.

Now that you’ve identified you’ve got an issue, you need to actually find the root causes of this.


Isolating specific CLS issues by testing CLS

There are a couple of ways of isolating the CLS issues so that you can make a specific request with developers for a patch.

If you just go to them saying “fix CLS” they’ll either go in circles or call you crazy because “everything works fine”.


Testing CLS with GTMetrix

The first method I use is GTMetrix. It’s a super quick test, and it’s normally something I am running anyway, so it can give a good initial overview.

Run your test, and you’ll get a web vitals score like the one below:

CLS will flag on the right. For this one it’s green, which is fine, but it’s enough to use as an example.

This score will probably be different to what Google is flagging, but it’s not about the actual score. It’s about what’s causing that score, so that you can isolate and patch.

If you go to the ‘structure’ tab, you can then expand the ‘avoid large layout shifts’ section, and GTMetrix will break down the main causes for your CLS score.

GTMetrix flags the main offender here, which contributes 99% of the CLS issue.

Funnily enough, this test was run on a webpage talking about CLS here as I was looking for an example site. Definitely a case of, “do what I say and not what I do”. The post is still worth a read though.

In saying that, we can break down this CLS further by just loading the page.

Click that page above, and see if anything loads and then shifts around.

If your internet is fast, you might not notice it.

I use the GTMetrix video reports so that I can show devs step-by-step what is happening in the load, and help them troubleshoot.

The page loads in the entire content, pauses for 0.2 seconds, and then loads the image. This image load pushes all the content down.

Google is seeing this massive shift, and would be assigning it a rather high CLS score because of it.

Super easy to fix though!


Testing CLS with the CLS Checker Chrome extension

Firstly, just download the CLS Checker Chrome extension from here.

Using the same site as the GTMetrix test, you just need to load the page, then click the extension and click ‘check the page’.

It flags two issues, with the first possibly related to the bigger one anyway:


If you click on toggle, it will make everything white, red and green.

White hasn’t moved, red is the original location, and green is the new location.

Sometimes a few different CLS issues will get grouped together here, so just be careful that a score the tool flags isn’t an aggregate view of about 3-4 different issues.

So this chrome extension is flagging that something has happened in that big red zone, which has pushed all the content down.

Safe to assume what has caused this based on the screenshot, but combine this with the GTMetrix video and you can really drill into what’s going on with CLS.


Testing CLS with Webvitals.dev

This one is a tool I discovered recently, and whilst it’s useful to add to the mix, it’s not really anything that GTMetrix or the Chrome extension don’t cover.

It does bits from both of them, in the single tool though, so might be useful if you’re after a more consolidated view that you can send directly to the development team.

They do include a cool little GIF of what’s moving around on mobile though! Worth checking out, as it might be what you’re after.


Testing CLS with WebPageTest

If your preference is WebPageTest, then you can also check CLS issues in there.

Once you’ve run the test, click on the “view” drop-down and navigate to ‘web vitals’.


You can then scroll down and you will see the CLS issues.

Here you can also view a filmstrip, but more importantly, also view the video that’s now included.

The same as how GTMetrix helps with the video, WebPageTest can now help you with your CLS issues via its video report too.


How to fix cumulative layout shift

You normally need a dev team to fix CLS issues, so if that’s not the answer you’re expecting – this is awkward…

In saying that, CLS issues are pretty easy to get fixed, once you’ve identified them.

All the developers need to do is make sure that elements don’t move: each element should load in its final position, rather than pushing others around once it appears.

This means they need to set fixed heights for elements that have delayed loading, like images.

If an image is going to be 300px high, then make sure there’s a blank space of 300px (plus padding) to fit that image when it loads.

This ensures nothing will move when the image finally loads in.
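In practice you’d let the browser reserve the space by setting width/height attributes or a CSS aspect-ratio, but the arithmetic behind the reserved height is simply:

```python
# Sketch: work out how much vertical space to reserve for an image,
# from its intrinsic dimensions and the width it will render at.

def reserved_height(intrinsic_w: int, intrinsic_h: int, display_w: float) -> float:
    """Height to reserve so the image loads into pre-allocated space."""
    return display_w * intrinsic_h / intrinsic_w

# A 1200x800 image rendered at 300px wide needs 200px reserved.
print(reserved_height(1200, 800, 300))  # 200.0
```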

Steps to fixing CLS issues

  1. Break down every individual item that you think is attributing to the CLS score based on the CLS testing above
  2. Show the issues to developers, along with some pretty pictures, so they better understand
  3. Politely ask them to fix it
  4. Deliver cake in the hope it speeds up delivery


It really is that simple.

Your CLS issues should now be gone!