Ever wanted to scrape just a handful of Google SERPs inside a Google Sheet?
Or wanted to group a bunch of keywords, to better de-duplicate them and isolate unique topics?
Well, have I got a Google Sheet for you!
My SERP Scraper & Keyword Grouping Google Sheet will do all of that for you.
Check out the video below:
In just a couple of clicks, you will get full SERP data for thousands of keywords, and will even get them grouped up by SERP similarity.
Nothing I have seen gives you access to raw SERP data so easily and efficiently – let alone processes it into groupings for you!
SERP data is now available and accessible en masse.
No learning how to run a Python script.
No overpriced, under-delivering SaaS.
Just paste keywords, GET SERPs.
This is what you need to do.
How to scrape Google SERPs inside a Google Sheet
1. Sign up for Serper.dev and get an API key
No affiliation with them, they just have a low-cost API with an awesome response time.
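Under the hood, the sheet is just POSTing each keyword to Serper's search endpoint with your key in a header. Here's a minimal sketch of what one of those requests looks like – the endpoint and header names come from Serper's public docs, but the function name and exact parameter set are my own illustration, not the sheet's actual code:

```javascript
// Hypothetical sketch of a Serper.dev request (not the sheet's actual code).
const SERPER_ENDPOINT = "https://google.serper.dev/search";

// Build the request for one keyword, using the location/language
// values from the settings tab.
function buildSerperRequest(keyword, apiKey, location = "United States", language = "en") {
  return {
    url: SERPER_ENDPOINT,
    options: {
      method: "POST",
      headers: {
        "X-API-KEY": apiKey,
        "Content-Type": "application/json",
      },
      // "location" and "hl" are assumptions based on Serper's documented
      // parameters; the sheet may map its settings slightly differently.
      body: JSON.stringify({ q: keyword, location: location, hl: language }),
    },
  };
}
```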
2. Duplicate the Google Sheet, and add your API key into the settings
The API key goes into the API key cell, on the settings sheet of the Google Sheet.
This is where you can also edit the Google Location, or the Language, of the SERPs you’ll scrape. Tweak them as needed, or leave as is for a default US search.
Other settings on this tab include disabling the PAA & Related Keyword extraction, which may speed up processing and allow you to process more keywords without overloading Google Sheets. Just enter FALSE in either of those cells.
You can also increase the batch size, which is set at 30 to stay within Serper.dev's 50-queries-per-second limit. Since the script isn't firing a full batch every second, you should be able to raise this above 50 safely, but 30 worked fine and is still fast enough.
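The batching itself is nothing exotic – the keyword list just gets sliced into fixed-size chunks and each chunk is fired off together. A sketch of that idea (my own function name, assuming a plain array of keywords):

```javascript
// Split a keyword list into batches of a given size. The sheet defaults
// to 30, keeping comfortably under Serper.dev's 50-queries-per-second limit.
function toBatches(keywords, batchSize = 30) {
  const batches = [];
  for (let i = 0; i < keywords.length; i += batchSize) {
    batches.push(keywords.slice(i, i + batchSize));
  }
  return batches;
}
```

So 65 keywords at the default setting would go out as two full batches of 30 plus a final batch of 5.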
3. Add keywords with their search volume
Add the keywords you’d like to scrape the SERPs for.
You only need the keywords to scrape the SERPs, but you will need search volume to group them – grouping won't work without it.
4. Click ‘Extract SERPs’
Once the keywords are loaded, just click 'Extract SERPs'.
This will then process keywords in the batch size mentioned on the settings page. Serper.dev allows 50 connections a second, so I set it to 30 to keep within the limits. Increase and test if you need it faster!
Each batch will add “SCRAPED” in the status column for the keyword.
You may get a script warning. Just click Advanced, and then allow/proceed. It should then start running – if you don't see anything in the 'Status' column, it isn't running.
The warning will look like this:
Click Advanced, and then click “Go to SerperScraper”
It will show your email, but it doesn't give any external access to the sheet. The script just sends your keyword data externally for the requests, and you can view the entire scraper script yourself.
Note: If the scraping times out, just click extract again. Google Apps Script only allows around 6 minutes per execution, so if the scraping takes longer than that it'll freeze up. Pressing extract will just continue from where it left off though. 2,500 keywords should process within those 6 minutes, so this should only be an issue with larger keyword sets.
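That resume-on-re-click behaviour falls out of the status column: anything already marked “SCRAPED” gets skipped on the next run. A sketch of the idea (the row shape and function name are illustrative, not the sheet's exact layout):

```javascript
// Resume support: only keywords whose status cell isn't "SCRAPED" get sent
// again, so re-clicking 'Extract SERPs' picks up where the last run stopped.
// rows = [[keyword, status], ...] – an assumed shape for illustration.
function pendingKeywords(rows) {
  return rows
    .filter(([, status]) => status !== "SCRAPED")
    .map(([keyword]) => keyword);
}
```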
How to run keyword grouping based on SERP Similarity
1. Press ‘Group Keywords’ button
Click the button to run the script.
It will then process, and load up the groupings and parents in their columns.
That’s it. You’re done.
OPTIONAL: Adjust SERP Similarity setting
The optional step here is to tighten, or loosen, the SERP similarity setting.
This setting controls how many of the top 10 URLs two keywords need to share before they're grouped together.
0.1 is 1, 0.2 is 2, 0.3 is 3, etc.
A good starting point will be 0.4 or 0.5, but depending on your keyword set, along with what you’d like to get out of the grouping, you could adjust this to 0.2/0.3 to be broader matching, or 0.6/0.7 for even tighter groups.
Broader matching suits higher-level categorisation, or maybe content-hub-type pages.
Tighter matching is best for keyword mapping, classification, or just general keyword filtering to remove duplicates/close-duplicates.
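Conceptually, the similarity score is just the share of top-10 results two keywords have in common – 4 shared URLs out of 10 gives 0.4. A sketch of that comparison (my own function name, assuming you have the top-10 URL list per keyword; the sheet's internal scoring may differ in detail):

```javascript
// SERP similarity as the fraction of shared top-10 URLs between two keywords.
// A score of 0.4 means 4 of the 10 result URLs appear in both SERPs.
function serpSimilarity(urlsA, urlsB) {
  const setB = new Set(urlsB);
  const shared = urlsA.filter((url) => setB.has(url)).length;
  return shared / 10;
}
```

Two keywords would then land in the same group whenever this score meets or exceeds the threshold you set on the settings tab.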
Plenty of other uses though!
Download the sheet
Ready to get stuck in?
You can get access to the sheet right here.
Let me know if you have any issues at all, as we only have a small user base at the moment!
Uses for the sheet
Plenty of different use-cases for the sheet, and I will build out some additional how-tos in time.
- One-off SEO market analysis like this one I started
- De-duping keywords based on highly similar terms
- Isolating core content topics from large lists
- Determining word associations like fridge = refrigerator
The list goes on! Let me know your use-case.
Any issues, questions or feedback?
Would love to hear any feedback or answer any questions you have.
If you’ve got an issue, just throw in a comment and we can work through it.
I will provide some deeper dives and some more actionable information off the back of this shortly – for now though, enjoy the sheet!