Bedrooms, and bathrooms, and price ranges, oh my!
There are so many filters that could be used in the pretty URL, but what should be used?
What should earn the “pretty” status, and what should be relegated to query parameter status?
Some may say everything goes in pretty.
Some may just pick a few, and leave it at that.
Well, let’s take a look at how you could use data to inform your decision.
I’m not going to go into why you shouldn’t be creating pretty URLs for everything, that’s a separate post.
What we will run through are ways to set up keyword data, for you to gain insights about search trends of the filters you’ve got, so that you know what you should be optimising for.
The data you’ll need
To run this analysis, you’ll ideally just need an extremely large keyword research piece specifically for your niche.
It will need to be rather large to ensure you can get solid data, and you’ll also need to have it pretty clean, or at least understand the caveats of your data.
If you’ve also got a tonne of GSC data to mix in, then that would be great. That will help extend the data to cover a good portion of the 0-volume keywords that might not show up in keyword research tools.
For my examples below, I just spent 20 minutes pulling together some real estate keywords for 3 Australian cities, and ~15 suburbs from a “Top Suburbs in Melbourne” type list.
I wanted to limit it to location-specific keywords, as without a giant list of seed locations it's hard to compare location versus non-location keywords without significant time spent. It makes things messy.
Setting up your data
Whether you’re using Excel or Google sheets, you’ll need to create a data table to power the analysis.
The data table should contain all the keywords, their volumes, and associated categorisation.
Create categories for each filter
You’ll need to create categories for each of the filters that you are trying to analyse and work out whether they should be optimised for.
Go through and create columns, then set up their categorisation with rules for each possible value so that you capture as many keywords as possible.
For my example, I am using real estate keywords. A column has been created for each filter I’d like to analyse, along with categorisation for each of them.
Each filter has its seed values assigned, along with what the value is for that variable.
If a keyword contains either ‘buy’ or ‘sale’, it gets flagged as “Buy”.
If a keyword contains ‘1 bed’ or ‘1br’ it gets flagged as 1 Bedroom.
You can read more about how this works here.
You’ll want to be as thorough as possible here, including every variation you can think of.
A couple of missed ones could really sway a decision.
Also try to create a catch-all category at the end for filters with only a variable or two.
I created one for ‘features’ based on what I was seeing in the keyword data.
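The rules-based categorisation above could be sketched like this in Python. The seed values and labels here are hypothetical examples, not a complete rule set:

```python
# Hypothetical seed-value rules: each filter column maps substrings to a label.
FILTER_RULES = {
    "channel": {"buy": "Buy", "sale": "Buy", "rent": "Rent", "rental": "Rent"},
    "bedrooms": {"1 bed": "1 Bedroom", "1br": "1 Bedroom",
                 "2 bed": "2 Bedroom", "2br": "2 Bedroom"},
}

def categorise(keyword: str) -> dict:
    """Return the first matched label per filter column (None if no seed hits)."""
    keyword = keyword.lower()
    row = {}
    for column, rules in FILTER_RULES.items():
        row[column] = next(
            (label for seed, label in rules.items() if seed in keyword), None
        )
    return row
```

In a spreadsheet you'd do the same thing with nested IF/SEARCH or REGEXMATCH formulas; the logic is identical.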
Prefiltering / cleansing the data
Depending on how clean your keyword data is, it might be better to just look at a portion of it.
A portion you know is 90% cleaner than the rest of the data.
For my real estate keywords, I know that if the keyword includes a channel, so ‘sale’, ‘buy’, ‘rent’, or ‘rental’, there is a higher chance of it being a keyword of enough quality for the study.
To include keywords that don’t include a channel (like ‘real estate’ or ‘properties’), I also include keywords including a bedroom or bathroom filter.
This is done via a YES/NO filter, that just flags it as YES if any of the filter cells have something inside them.
All my data analysis will have this filter applied, and it brings the keywords down from 9,000 to just 2,000.
I know those 2,000 are trustworthy to tell a story.
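The YES/NO prefilter is simple: a row survives if any of its filter cells are populated. A minimal sketch, using made-up rows standing in for the real keyword sheet:

```python
# Toy rows mirroring the sheet: keyword plus its filter columns (hypothetical data).
rows = [
    {"keyword": "houses for sale melbourne", "channel": "Buy", "bedrooms": None},
    {"keyword": "melbourne", "channel": None, "bedrooms": None},
    {"keyword": "2 bed apartment richmond", "channel": None, "bedrooms": "2 Bedroom"},
]

def include_flag(row):
    """YES if any filter cell (anything but the keyword itself) is populated."""
    return "YES" if any(v for k, v in row.items() if k != "keyword") else "NO"

clean = [r for r in rows if include_flag(r) == "YES"]
```

Here 2 of the 3 toy rows survive; on the real data this is the step that cut 9,000 keywords down to 2,000.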
Creating your pivot tables
You’ll now need to create pivot tables for each filter so that you have a way to read the data.
Go through and create a pivot table for each of your filters with the below data:
- Filter as the row
- Search volume SUM as value
- Search volume COUNT as value
- Search volume COUNT shown as % of grand total as value
The SUM should be obvious, being that it will be the total amount of search volume for each of the filter values.
The COUNT will be how many times that filter value is used among the keyword set.
The COUNT & % of Total will show us the actual % of keywords that use this filter value. A little quicker to analyse than the overall count alone.
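If you'd rather do this outside a spreadsheet, the same three metrics can be computed directly. This is a sketch over hypothetical (filter value, volume) pairs, where None means no property type matched:

```python
from collections import defaultdict

# Hypothetical (filter value, search volume) pairs.
keywords = [
    ("house", 1000), ("house", 500), ("apartment", 300), (None, 200),
]

def pivot(rows):
    """SUM and COUNT of volume per filter value, plus COUNT as % of grand total."""
    sums = defaultdict(int)
    counts = defaultdict(int)
    for value, volume in rows:
        sums[value] += volume
        counts[value] += 1
    total_count = sum(counts.values())
    return {
        value: {
            "sum": sums[value],
            "count": counts[value],
            "pct_of_total": counts[value] / total_count,
        }
        for value in sums
    }

table = pivot(keywords)
```

With the toy data above, ‘house’ ends up with a SUM of 1,500, a COUNT of 2, and 50% of the grand total.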
Analysing and selecting your filters
Now we’ll get to read our data and see what we can make of it.
Let’s take a look at my property-type keywords.
We can see that of the 2,000 keywords included, 85% mention a property type. So only 15% are more generic keywords like ‘properties’ or ‘real estate’.
Even if you consider ‘homes’ as generic, that’s still less than a quarter of the keywords without a property type.
So yes, property type 100% needs to be optimised for.
Looking at the features keywords.
Only 2 keywords include pets, 2 with pools, and then 1 mentioning granny flat. If these were the only filter values available, I would not be optimising for them.
Similar story with the bathrooms keywords.
Only 2 keywords contain a bathroom-related phrase. Probably wouldn’t recommend targeting that in bulk.
Now onto the 2 that are a bit contentious when it comes to real estate sites.
The first one being bedrooms.
Bedrooms is one I personally recommend against optimising for directly under normal circumstances. At least at first anyway.
I feel it creates too many variations of URLs, with not enough reward/value in return for doing so. Can be worth targeting once all indexation/crawling boxes are ticked, especially with some rules in place, but maybe not directly out the gate.
In saying that, looking at the data 10% of the keywords (7% of total volume) include a bedroom value.
Is that enough to warrant targeting of it? Maybe.
But if we break that data down a bit further, and split out the city (Melbourne) from the ~15 suburbs, we see something a bit different.
16% of the city keywords (14% of volume) contain a bedroom term, versus only 5% (1% of volume) of the suburb keywords.
So that’s one location having a significantly larger number of keywords including it than the 15 other locations combined.
So if you create pages equally amongst cities & suburbs, you’re going to be creating significant volumes of pages when only a small portion of them will be useful.
Yeah, long-tail value this and that. I’m not saying definitely don’t, I’m just advising against it without restrictions in place.
A similar situation is with the prices.
Pretty low volume for the majority of the keywords that include a price (normally ‘under xxx’ type keywords).
And if we break it into city vs suburb, we get:
None of the suburb keywords in this data include a price. It’s only at the city level.
Why some filters may not be worth targeting
I’m a big believer in limiting crawlable URLs where possible.
Minimising re-use of listing content, and avoiding the possibility of confusing Google too much.
Keeping the site as small as possible, whilst still targeting as much as possible.
So why would I recommend avoiding creating bedrooms or pricing optimised URLs in bulk?
Well, it comes down to page count.
Crawlable page count to be specific.
Let’s say you have a location set of 5,000 locations.
10 property types.
and 2 channels.
You’ve already got 100,000 crawlable URLs right there.
If you then have 7 bedroom options, you’re looking at an additional 700,000 URLs on top of the 100,000 that already exist, that Googlebot will have to constantly trawl through.
Is it worth enlarging your site by 700% to target an extra 7% in search volume?
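The arithmetic above, written out:

```python
# Crawlable URL count from combining filters (figures from the example above).
locations = 5_000
property_types = 10
channels = 2

base_urls = locations * property_types * channels   # location x type x channel
bedroom_options = 7
extra_urls = base_urls * bedroom_options            # every base URL x each bedroom count

print(base_urls)    # 100000
print(extra_urls)   # 700000
print(f"{extra_urls / base_urls:.0%}")              # the site grows by 700%
```

Each extra filter you make crawlable multiplies the whole existing URL set, which is why the count blows out so fast.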
If you think so, then go for it.
That’s also if you do it cleanly. If you have other filters with crawlable links on the site, that overall crawlable URL count will only increase.
So if you’re creating significant page volumes off of smaller % filters like this bedrooms count, you must ensure you have your crawling well in check before you launch.
That way you can avoid exacerbating any existing issues.
There are other ways of efficiently targeting these types of keywords though.
In particular, I recommend a targeting strategy here on how to target these filters that may have value at key locations, and not others, by having a couple of tiers of locations.
Picking your filter values
To keep some filters in check, you can also set up the system so that only certain values of a filter get optimised.
Using the bedrooms as an example, you might choose to just create pretty URLs for Studios, 1 bed, 2 bed, and 3 bedroom apartments. 4+ bedrooms would then be relegated to the query parameter, and not receive the internal links pointing into it.
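That whitelist logic could look something like this. The URL patterns and value set here are hypothetical, just to show the routing decision:

```python
# Hypothetical rule: only whitelisted bedroom values earn a pretty, internally
# linked URL; everything else stays behind a query parameter.
PRETTY_BEDROOMS = {"studio", "1", "2", "3"}

def bedroom_url(location: str, bedrooms: str) -> str:
    if bedrooms.lower() in PRETTY_BEDROOMS:
        return f"/{location}/{bedrooms}-bedroom/"   # pretty URL, gets internal links
    return f"/{location}/?bedrooms={bedrooms}"      # parameter only, no internal links
```

So `/melbourne/2-bedroom/` gets built and linked, while a 5-bedroom search stays at `/melbourne/?bedrooms=5`.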
Let the data guide your optimisation
By leveraging this keyword data you can really gain an insight into what filters, and values, you should be optimising for.
Plenty of caveats, particularly around longer tail keywords that tools won’t give you, but there should be more than enough data to at least guide an initial decision.
It’s also easier to expose a filter later on, than to clean up the over-indexation caused by one if it needs to be reverted.
There’s also the other question here: is it even worth putting in the work to have separate ‘pretty’ and ‘parametered’ filters?
I’ll leave it to you to decide.