URL parameter handling for SEO

What issue is this page dealing with?
URL parameters are an often-overlooked aspect of SEO: they can bloat a site's URL count and eat into crawl budget. Learn how to identify and fix URL parameter issues through SEO audits and how to handle them effectively using robots.txt rules.

Want a free audit?

Send me an email with your URL and I’ll spin up a top-level SEO audit, free of charge. If you want a slightly more advanced one, just add me to your Google Search Console (email is the same).

What I cover here

When it comes to URL parameters, in my experience it often takes an experienced SEO to carry out effective diagnostics. Parameters by nature imply advanced site functionality, such as session IDs, filtering or routing logged-in users. As URL parameters tend to go unchecked by the average business owner, I often find a lot of opportunities with a robust audit, especially if the CMS is not a mainstream platform or if you have set up WordPress without SEO features (see my WordPress SEO consultant services).

URL parameters can also cause secondary issues, some of them quite serious, but the headline problem is URL bloating, which eats into crawl budget. I’ve worked on sites with over 500,000 parameter URLs that had never had any remedial work, causing untold damage to their organic rankings and visibility. Some platforms have URL parameter handling built in; in WordPress, for example, switching the permalink setting from the default “Plain” structure (URLs like ?p=123) to “Post name” gives you clean, parameter-free URLs.

How do I identify URL parameter issues?

There are a few options here, but given the nature of these issues it really can pay to commission an SEO expert to carry out an audit. The first step I’d recommend is an SEO crawl using software such as Screaming Frog, SEMrush or Sitebulb. The tool should highlight excessive URLs if parameters are rife on the site – for example, if you have a small site but the crawler has picked up, say, 10,000+ URLs, most of them carrying parameters. You can usually filter or export the crawled URLs and identify the ones containing ? or &.
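If you’re comfortable with a little scripting, a short sketch like the one below can speed up that filtering step. It assumes a crawl export saved as crawl_export.csv with an Address column (the column name Screaming Frog uses); adjust the filename and column to match your tool.

```python
# A minimal sketch: filter a crawl export down to parameter URLs and tally
# which parameter names dominate. Assumes a CSV with an "Address" column.
import csv
from collections import Counter
from urllib.parse import urlparse, parse_qs

param_counts = Counter()

with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row.get("Address", "")
        query = urlparse(url).query
        if query:  # the URL carries at least one ?key=value parameter
            print(url)
            param_counts.update(parse_qs(query).keys())

# Which parameter names account for the bloat?
for name, count in param_counts.most_common(10):
    print(f"{name}: {count} URLs")
```

The tally at the end is often the most useful part: in my experience one or two parameter names (a session ID, a sort order) usually account for the bulk of the bloat.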

The second method is to look at the Coverage report in Google Search Console, where you can see if Google is having a hard time processing a lot of URLs – you may spot a lot of parameter URLs under “Crawled – currently not indexed” or “Discovered – currently not indexed”. You can do something similar with Bing Webmaster Tools. A final step is to go into Settings > Crawl stats within Search Console, which shows how many URLs are being crawled per day; if there are spikes completely outside the norm, parameter handling could be the cause.

How do you fix URL parameter issues for SEO?

The hardest part of fixing URL parameter issues is often diagnosing what the parameter is actually doing. If it’s something like a session ID, or a mechanism for routing logged-in users, it can be difficult to unravel the functionality. For example, a triage URL for logged-in users that appears on every page can generate a substantial number of errors in a crawl. The fixes for these types of issues vary from site to site; if a fix at the functionality level is not available, then look at adding some bespoke rules to the robots.txt file.
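To make that concrete, here’s a minimal robots.txt sketch. The sessionid parameter name is hypothetical – substitute whatever parameters your own diagnostics surface, and test the rules before relying on them:

```
User-agent: *
# Block crawling of any URL carrying the (hypothetical) session ID parameter,
# whether it appears first (?sessionid=) or later (&sessionid=) in the query string
Disallow: /*?sessionid=
Disallow: /*&sessionid=
```

Bear in mind that robots.txt stops crawling, not indexing – parameter URLs that are already indexed or well linked may linger, so this works best as the site-wide fallback rather than the only measure.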

How do you prevent URL parameter errors?

The first port of call is to assess whether the page holds relatively unique and valuable content. For example, on an eCommerce brokerage page you may have a product range with a brand filter that appends ?brands=2 to the URL. It may be a beneficial fix to serve that page at /site/brandname/ rather than via a parameter, which can hold better SEO value as a keyword-rich path rather than an ID. So always focus on serving URLs that have search value as static URLs rather than as parameterised, dynamically changing content.
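If your site runs on Apache, one way to handle the migration is a 301 redirect in .htaccess from the old parameter URL to the new static path. The sketch below is illustrative only: the /products path, the brands parameter and the acme slug are all assumptions, and a real site would generate one rule per brand (or handle the mapping in application code):

```
RewriteEngine On
# 301 the old filtered URL (/products?brands=2) to its new static home.
# The "brands=2 -> acme" mapping is hypothetical; generate rules from your own brand data.
RewriteCond %{QUERY_STRING} (^|&)brands=2($|&)
RewriteRule ^products/?$ /site/acme/? [R=301,L]
```

The trailing ? in the substitution drops the old query string, so the redirect target stays clean.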

Additional notes on URL parameters and their impact on SEO

  1. I typically avoid using nofollow on parameter links, and the meta robots HTML tag, and instead try to fix these through the robots.txt file. This ensures there is a site-wide fallback, whereas it’s much easier to forget to add an HTML tag to every template.
  2. Take the necessary steps to unravel bad functionality, always. Excessive URL parameters can be symptomatic of a badly structured and coded website. 
  3. Parameters can have great value for user experience – filtering, for example (where the filtered option holds no search volume) – so try to find the balance between SEO and UX.

Need CMS-specific SEO support? I’ve got extensive experience working with the CMS platforms below – view each service page to learn more. I’ve also included some additional links which may be useful for each particular CMS.

Would you like a free SEO audit? 

Enter your details in the form here and I’ll run a top-level audit of your website; it shouldn’t take long and will give you some basic information on your site’s health.

In addition to getting a free audit, if you are stuck on a particular problem, do feel free to reach out to me – I’m always happy to help fellow SEOs with their questions.