SERP Clustering
SEO Utils clusters your keywords by scraping the search engine results page (SERP). If two keywords share at least four pages in the search results (similar results on the SERP), it groups them into a cluster.
You can adjust this similar-results number from 3 to 7; the larger the number, the more closely related the keywords in a cluster will be.
The advantage of this method is that it leverages Google's ability to discern users' search intent: if Google shows many of the same pages for two keywords, it recognizes those keywords as having the same search intent.
The only drawback is that it can be time-consuming for large keyword lists, since scraping SERPs is a slow and cumbersome task.
There are two main processes when the SERP Clustering tool runs:
Scraping SERP data for all keywords.
Running the clustering algorithm after scraping all SERP data.
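SEO Utils' internal code isn't published, but to make the grouping idea concrete, here is a minimal Python sketch of overlap-based clustering, assuming each keyword's SERP has already been scraped into a set of URLs (all names are illustrative, not the tool's actual implementation):

```python
from itertools import combinations

def cluster_by_serp_overlap(serp_results: dict[str, set[str]], min_shared: int = 4) -> list[set[str]]:
    """Group keywords whose top search results share at least `min_shared` URLs.

    serp_results maps each keyword to the set of URLs on its SERP
    (for example, the top 10 organic results).
    """
    # Union-find style parent map: each keyword starts in its own cluster.
    parent = {kw: kw for kw in serp_results}

    def find(kw):
        while parent[kw] != kw:
            parent[kw] = parent[parent[kw]]  # path compression
            kw = parent[kw]
        return kw

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    # Compare every pair of keywords and merge them when the overlap is large enough.
    for kw_a, kw_b in combinations(serp_results, 2):
        shared = len(serp_results[kw_a] & serp_results[kw_b])
        if shared >= min_shared:
            union(kw_a, kw_b)

    # Collect clusters by root.
    clusters: dict[str, set[str]] = {}
    for kw in serp_results:
        clusters.setdefault(find(kw), set()).add(kw)
    return list(clusters.values())
```

With the default threshold of 4 shared URLs, two keywords only end up in the same cluster when Google returns largely the same pages for both.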
To get started, head to the SERP Clustering in the left sidebar. Then, click the Cluster Keywords button.
A modal will open; it should look like this. We will go over each setting one by one.
The SERP Clustering tool lets you upload multiple files. It automatically combines the keywords and removes any duplicates, so there's no need to merge lists in Excel anymore. This will save you some time.
After uploading your files, you need to map the columns:
Keyword column: This column is required.
Search Volume column: This is optional. You can map your external search volume, so you don't have to re-check the search volume for your keywords.
To map columns, click on the Column dropdown.
SEO Utils also auto-maps columns if your files have guessable column names.
Updated: Since version 1.6.0, you can also map CPC and search intent columns.
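How SEO Utils guesses column names isn't documented in detail; the snippet below is only a hypothetical illustration of the kind of header-matching heuristic that makes a column "guessable" (the alias lists and function name are assumptions):

```python
# Hypothetical header-matching heuristic; SEO Utils' actual rules may differ.
COLUMN_ALIASES = {
    "keyword": {"keyword", "keywords", "query", "search term"},
    "volume": {"volume", "search volume", "sv", "avg. monthly searches"},
    "cpc": {"cpc", "cost per click"},
    "intent": {"intent", "search intent"},
}

def guess_column_mapping(headers: list[str]) -> dict[str, str]:
    """Return a mapping of field name -> file header for headers we can recognize."""
    mapping = {}
    for header in headers:
        normalized = header.strip().lower()
        for field, aliases in COLUMN_ALIASES.items():
            if normalized in aliases and field not in mapping:
                mapping[field] = header
    return mapping

print(guess_column_mapping(["Keyword", "Search Volume", "CPC", "Difficulty"]))
# {'keyword': 'Keyword', 'volume': 'Search Volume', 'cpc': 'CPC'}
```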
Select the location that you want to target and the language of your keyword lists. SEO Utils uses this field for checking keyword search volume in most cases.
This field allows for more precise geotargeting when scraping SERP. You can type a specific location like a city, country, etc.
When using this field, SEO Utils will ignore the location that you selected from the Location / Language field.
SEO Utils will check search results for desktop devices. If you want to check for mobile devices, please disable this setting.
SEO Utils will use DataForSEO to check keyword metrics. This helps determine the primary keyword for a cluster, so it's recommended to enable this setting.
However, if your uploaded files already have search volume data and you mapped the Volume or CPC column, you can turn off this setting to save money.
Tip: If your uploaded files don't have search volume data for all keywords, you can turn on the "Check Search Volume" field. SEO Utils will only check the search volume for keywords that are missing data.
When setting the primary keyword for a cluster, SEO Utils uses the keyword with the highest search volume in that cluster. However, some users have reported that choosing the primary keyword based on Cost Per Click (CPC) gives better results when the keywords are transactional or commercial. You can now set the Cluster Strategy to use CPC instead.
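As a rough illustration of the Cluster Strategy setting described above, a primary-keyword pick could look like the following sketch (the data shapes and function name are mine, not SEO Utils' internals):

```python
def pick_primary_keyword(cluster: list[dict], strategy: str = "volume") -> str:
    """Pick the primary keyword for a cluster.

    Each item looks like {"keyword": str, "volume": int, "cpc": float}.
    strategy is "volume" (default) or "cpc", mirroring the Cluster Strategy setting.
    """
    metric = "cpc" if strategy == "cpc" else "volume"
    return max(cluster, key=lambda kw: kw.get(metric) or 0)["keyword"]

cluster = [
    {"keyword": "buy running shoes", "volume": 5400, "cpc": 1.8},
    {"keyword": "running shoes online", "volume": 8100, "cpc": 1.2},
]
print(pick_primary_keyword(cluster, "volume"))  # running shoes online
print(pick_primary_keyword(cluster, "cpc"))     # buy running shoes
```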
The value range is from 3 to 7. For example, if you set this field to 4, SEO Utils will group two keywords into a cluster if they have at least 4 common URLs on their SERPs. The larger the number, the more closely related the keywords in a cluster will be.
Tip: The recommended value is 4, but you are free to test a different value by using the "Re-cluster keywords" button.
Enter the number of days you wish to use previously saved SERP data. By setting this duration, you can speed up the clustering process and save money as SEO Utils will reuse existing data without the need to re-scrape.
Choose a number that best balances freshness of data with processing speed.
Set '0' to always scrape fresh SERP data.
Tip: This field is a great way to test different Similar Results on SERPs values, since you don't have to re-scrape SERP data for all keywords.
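As an illustration only, reusing saved SERP data for X days boils down to a freshness check like the sketch below (the data structures are assumed for the example):

```python
from datetime import datetime, timedelta, timezone

def needs_scraping(keyword: str, saved_serps: dict[str, datetime], max_age_days: int) -> bool:
    """Return True if the keyword's SERP must be (re)scraped.

    saved_serps maps keywords to the time their SERP data was last scraped.
    max_age_days = 0 means always scrape fresh data.
    """
    if max_age_days == 0:
        return True
    scraped_at = saved_serps.get(keyword)
    if scraped_at is None:
        return True  # no saved data at all
    return datetime.now(timezone.utc) - scraped_at > timedelta(days=max_age_days)
```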
There are 3 options for now:
My IP: SEO Utils uses your IP to scrape SERP data. This is not recommended if you have over 100 keywords.
Proxies: Use proxies to scrape SERP data. See how to set up a proxy here.
SERP API: DataForSEO: Use SERP API from DataForSEO to scrape SERP data.
We will focus on Proxies & SERP API so you can determine which method is the best fit for you.
SERP API Method
DataForSEO provides a reliable API for scraping SERP data with 3 modes. Since the Live mode is expensive, SEO Utils doesn't support it. After selecting the SERP API method, you will be able to select the mode in the Priority field.
I recommend using the normal execution priority (Standard Queue) option. In my test, clustering 2,000 keywords took only about 12 minutes from start to finish. See the video here.
It only costs $0.60 to scrape the SERPs and cluster 1,000 keywords. That's much cheaper than other tools on the market, which usually charge $7-$12 per 1,000 keywords.
Yes, you read that right! At $7 versus $0.60, it's more than 11 times cheaper.
In v1.21.0, I added a cost estimation tool for the SERP API, allowing you to see the actual cost after deducting keywords that use saved SERP data. Please click here to learn more about it.
Important:
To use the SERP API, you must have your own DataForSEO account. Renting API key services isn't viable because DataForSEO restricts certain endpoints that I utilized to implement the Queue mode. If multiple users rely on a rented API key from my account, it will slow down the process for everyone. For the quickest results, using your own DataForSEO account is the best approach.
Proxies Method
If you need to cluster 1 million keywords per month, the SERP API method would be costly: at $0.60 per 1,000 keywords, that's about $600 per month.
This is where the Proxies method becomes valuable. By using your own proxies, you can significantly reduce the costs associated with such large-scale keyword clustering.
A rotating residential proxy costs about $80-$100 per month and lets you cluster millions of keywords.
However, this method has more fields to set up.
Workers
This is the number of concurrent requests SEO Utils uses to scrape SERP data. You can enter a value from 1 to 50. If you have a good proxy provider, you can set it to 50 (my setting), so SEO Utils sends 50 requests per second to scrape SERP data. This speeds up the clustering process a lot.
If you are unsure about your proxy quality, go with 5-10 workers first.
Request Delay
Enter the number of seconds you wish to have between each request to scrape SERP data for keywords. The more delay time you set, the slower the process will be, but it will help you avoid being blocked by Google.
Set '0' to scrape SERP data without any delay.
For example, if you set Workers to 10 and Request Delay to 1 second, SEO Utils will send 10 requests, then sleep for one second before sending the next 10 requests.
Back-off Time
Enter the number of seconds you wish to wait before retrying the failed scraping request. SEO Utils retries the request up to 3 times before skipping a keyword. The more back-off time you set, the slower the process will be, but it will help you avoid being blocked by Google.
For example, if you set Workers to 10, Request Delay to 1 second, and Back-off Time to 2 seconds, SEO Utils will send 10 requests and then sleep for one second before sending another 10 requests. However, if one of those 10 requests fails, it will sleep for 2 seconds (the back-off time) before retrying the failed request.
SEO Utils will retry a maximum of three times before skipping scraping a keyword. That keyword will end up with no SERP data.
If you set the Back-off Time to '0', SEO Utils will use the default back-off times:
1 second for the 1st attempt.
2 seconds for the 2nd attempt.
3 seconds for the 3rd attempt.
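SEO Utils' scraper isn't open source, so the following Python sketch only illustrates how Workers, Request Delay, and Back-off Time interact, following the 10-worker / 1-second / 2-second example above; the `scrape` callable is a placeholder for whatever actually fetches a SERP:

```python
import time
from concurrent.futures import ThreadPoolExecutor

MAX_ATTEMPTS = 3
DEFAULT_BACKOFF = {1: 1, 2: 2, 3: 3}  # seconds per attempt when Back-off Time is 0

def scrape_with_retries(keyword: str, scrape, backoff_seconds: int):
    """Scrape one keyword, retrying up to MAX_ATTEMPTS times before giving up."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        try:
            return scrape(keyword)
        except Exception as exc:  # e.g. "Too Many Requests"
            print(f"Attempt {attempt} failed to scrape SERP for keyword '{keyword}': {exc}")
            if attempt == MAX_ATTEMPTS:
                return None  # this keyword ends up with no SERP data
            time.sleep(backoff_seconds or DEFAULT_BACKOFF[attempt])

def scrape_all(keywords, scrape, workers=10, request_delay=1, backoff_seconds=2):
    """Send `workers` requests at a time, sleeping `request_delay` seconds between batches."""
    results = {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for start in range(0, len(keywords), workers):
            batch = keywords[start:start + workers]
            scraped = pool.map(lambda k: scrape_with_retries(k, scrape, backoff_seconds), batch)
            for kw, serp in zip(batch, scraped):
                results[kw] = serp
            if request_delay:
                time.sleep(request_delay)
    return results
```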
Feel free to experiment with different settings for your proxies and usage. When testing these settings, as long as you don't encounter a "Too Many Requests" error message, it should indicate that your setup is functioning correctly.
Since SEO Utils automatically retries each request up to three times, you might see a message like:
Attempt 1 failed to scrape SERP for keyword 'keyword': failed to visit URL: Too Many Requests.
Don't worry if you see this message. SEO Utils will try two more times, so everything is still fine unless you see a failure message on the third attempt.
| SERP API | Proxies |
| --- | --- |
| Use when clustering thousands of keywords per month | Use when clustering millions of keywords per month |
| Less configuration and testing | More configuration and testing |
After clustering keywords, you can visit the report and re-cluster keywords.
You might need to re-cluster keywords in the following situations:
When testing different values for the Similar Results on SERPs field.
If you're adding more keywords to a report and need to cluster them with existing ones.
To monitor the clusters before and after Google Core Updates to observe any impact.
If some keywords are missing SERP data because some SERP scraping requests failed.
SEO Utils will show this message if some keywords in your report are missing SERP data, so you know to re-run the clustering process.
When re-clustering keywords, make sure you use the "Use Saved SERP Data For X Days" field. This ensures that SEO Utils won't re-scrape SERP data, saving time and resources.
When re-clustering keywords, you'll be able to distribute new and non-clustered keywords into existing clusters, rather than rebuilding all clusters from scratch.
This is particularly useful if you've already used clusters to create articles on your website and you just want to add new keywords to existing articles (clusters).
A "New" badge will highlight these newly added keywords, and a "Run #" label will indicate which clustering run they were added in.
SEO Utils' clustering algorithm is designed to handle large-scale keyword datasets efficiently. You can cluster between 50,000 to 100,000 keywords at onceβa capability not many tools on the market offer. Most other tools typically allow clustering of only 5,000 to 10,000 keywords at a time.
After clustering your keywords, you can analyze the SERPs of the clustered keywords to see how your competitors are performing. This analysis can show you their monthly traffic, the keywords they rank for, and how many backlinks they have.
You can choose the type of metric and specify the number of URLs on the SERP for each keyword you want to run the analysis for.
After analysis, you can click on a keyword to view its SERP data with all metrics, including traffic, backlinks, referring domains, spam score, keywords, etc.
The analyzed data is also included in the exported file under the second tab labeled "SERP Data."
You can add a target domain to a SERP Clustering report. This will show you the ranking data of that domain across all clusters, helping you identify which clusters you are ranking well in and which ones need improvement. This feature allows you to prioritize your efforts to boost your rankings effectively.
When viewing the SERP of a keyword, the target domain will be marked in green color.
Tip: You can also change the target domain to a competitor's domain to see how well they are performing across different clusters. This can help you identify their strengths and areas where you might have an advantage.
SEO Utils also provides some metric cards to show you the cluster's key ranking data at a glance:
View and filter the number of ranking clusters.
View and filter the number of clusters without rankings.
See and filter the rank distribution across clusters.
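To show what ranking data across clusters can mean in practice, here is a small illustrative sketch that finds a target domain's best position in each cluster (the data shapes are assumptions, not SEO Utils' internal format):

```python
def domain_rank_by_cluster(clusters: dict[str, list[str]],
                           serps: dict[str, list[str]],
                           target_domain: str) -> dict:
    """Return the target domain's best position in each cluster (None if it doesn't rank).

    clusters maps a cluster's primary keyword to its member keywords.
    serps maps each keyword to an ordered list of ranking domains (position 1 first).
    """
    best = {}
    for cluster_name, keywords in clusters.items():
        positions = [
            serps[kw].index(target_domain) + 1
            for kw in keywords
            if target_domain in serps.get(kw, [])
        ]
        best[cluster_name] = min(positions) if positions else None
    return best

ranks = domain_rank_by_cluster(
    {"running shoes": ["best running shoes", "top running shoes"]},
    {"best running shoes": ["example.com", "other.com"],
     "top running shoes": ["other.com", "example.com"]},
    "example.com",
)
print(ranks)  # {'running shoes': 1}
```

Clusters with a position are the "ranking clusters"; clusters that come back as None are the ones without rankings.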
You can watch the video below to see the "Ranking Data" feature in action.
When using the SERP API to scrape SERP data, you can click the "Estimate Cost" button to view the actual cost. SEO Utils will read all keywords in your uploaded file, remove duplicates, and exclude keywords that already have saved SERP data, giving you an accurate final cost.
In the screenshot below, I uploaded over 18,000 keywords and reused SERP data saved within the past 7 days. The cost came to just $0.0390, saving me $11.2470! SEO Utils only sent 65 keywords to the SERP API for new data, since most of the keywords already had SERP data scraped within the last 7 days.
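Based on the $0.60 per 1,000 keywords rate quoted earlier, the estimate essentially comes down to arithmetic like this sketch (the keyword normalization step is an assumption; SEO Utils may deduplicate differently):

```python
COST_PER_KEYWORD = 0.6 / 1000  # $0.60 per 1,000 keywords at the Standard Queue rate quoted above

def estimate_serp_api_cost(uploaded_keywords: list[str], keywords_with_saved_serp: set[str]) -> float:
    """Estimate the SERP API cost: dedupe the upload, then skip keywords with fresh saved SERP data."""
    unique_keywords = {kw.strip().lower() for kw in uploaded_keywords}
    to_scrape = unique_keywords - keywords_with_saved_serp
    return len(to_scrape) * COST_PER_KEYWORD

# e.g. 65 keywords left to scrape -> 65 * 0.0006 = $0.039, matching the screenshot above.
```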