When it comes to URL parameters, in my experience it often takes an experienced SEO to carry out effective diagnostics. Parameters, by their nature, usually point to more advanced functionality on a site, such as session IDs, filtering or triage. Because URL parameters often go unchecked by the average business owner, I frequently find plenty of opportunities with a robust audit, especially if the CMS is not a mainstream platform or if WordPress has been set up without SEO features (see my WordPress SEO consultant services).
URL parameters can also cause secondary issues, some of them quite serious, but the main one is URL bloat, which wastes crawler bandwidth. Some sites I've worked on have had over 500,000 URLs with parameters and no remedial work done on them, causing untold damage to their organic rankings and visibility. Some platforms have URL parameter handling built in; in WordPress, for example, switching the permalink settings from the default "Plain" structure (which serves pages on parameters such as ?p=123) to a pretty permalink structure can help clean up the URL structure.
How do I identify URL parameter issues?
There are a few options here, but given the nature of these issues it can really pay to commission an SEO expert to carry out an audit. The first step I'd recommend is an SEO crawl using software such as Screaming Frog, SEMrush or Sitebulb. The tool should highlight excessive URLs if parameters are rife on the site; for example, if you have a small site but the crawler has picked up, say, 10k+ URLs, almost all with parameters, that's a red flag. You can usually filter or export the crawled URLs and identify the ones containing ? or & (a quick script like the sketch below can help with this).
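To give a rough idea, here's a minimal Python sketch that tallies parameterised URLs from a crawl export. The filename and the "Address" column are assumptions based on a typical Screaming Frog internal export, so adjust both to whatever your crawler actually produces.

```python
import csv
from collections import Counter
from urllib.parse import urlsplit, parse_qsl

# Tally parameterised URLs from a crawl export. The filename and the
# "Address" column name are assumptions based on a typical Screaming Frog
# internal export -- adjust both to match your own tool's output.
param_counts = Counter()
param_urls = []

with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row["Address"]
        query = urlsplit(url).query
        if query:  # anything after ? (including &-separated pairs)
            param_urls.append(url)
            for key, _ in parse_qsl(query, keep_blank_values=True):
                param_counts[key] += 1

print(f"{len(param_urls)} parameterised URLs found")
for key, count in param_counts.most_common(10):
    print(f"  {key}: appears in {count} URLs")
```

Sorting by the most common parameter keys usually reveals quickly whether the bloat comes from session IDs, filters or something else.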
The second method is to look at the Coverage area in Google Search Console, where you can see if Google is having a hard time processing a lot of URLs; you may see plenty of parameter URLs under "Crawled - currently not indexed" or "Discovered - currently not indexed". You can do something similar with Bing Webmaster Tools. A final step is to go to Settings > Crawl stats within Search Console, which shows how many URLs are being crawled per day; if there are spikes well outside the normal range, parameter handling could be the cause.
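If you'd rather check those crawl spikes with a quick script than by eye, here's a minimal sketch. It assumes you've exported daily crawl request counts to a CSV; the "date" and "total_crawl_requests" column names are placeholders of mine, and the two-standard-deviation threshold is just a rough rule of thumb rather than anything Search Console itself defines.

```python
import csv
from statistics import mean, stdev

# Daily crawl request counts exported to CSV. The column names "date" and
# "total_crawl_requests" are placeholders -- rename them to whatever your
# export actually uses.
with open("crawl_stats.csv", newline="", encoding="utf-8") as f:
    rows = [(r["date"], int(r["total_crawl_requests"])) for r in csv.DictReader(f)]

counts = [count for _, count in rows]
baseline, spread = mean(counts), stdev(counts)

# Flag days well above the site's normal crawl volume. The 2-sigma cut-off
# is a rough heuristic, not an official Search Console threshold.
for date, count in rows:
    if count > baseline + 2 * spread:
        print(f"{date}: {count} crawl requests (baseline ~{baseline:.0f}) - worth checking for parameter bloat")
```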
How do you fix URL parameter issues for SEO?

How do you prevent URL parameter errors?
Additional notes on URL parameters and their impact on SEO
- I typically avoid using nofollow on parameters, or page-level HTML tags; instead I try to fix these through the robots.txt file, which gives you a site-wide fallback, whereas it's much easier to forget to add an HTML tag to every template (see the sketch after this list).
- Always take the necessary steps to unravel bad functionality. Excessive URL parameters can be symptomatic of a badly structured and coded website.
- Parameters can add real value to the user experience, filtering being a good example (provided the filtered option doesn't hold search volume in its own right); try to strike a balance between SEO and UX.
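To illustrate the robots.txt approach from the first note above, here's a minimal Python sketch. The blocked parameter names (sessionid, sort) are hypothetical placeholders, and the matcher is only a simplified approximation of how Google interprets * wildcard rules, so treat it as a sanity check on draft rules rather than a definitive implementation.

```python
import re

# Draft robots.txt rules for blocking crawl of parameterised URLs. The
# parameter names (sessionid, sort) are hypothetical -- swap in the
# parameters that actually cause bloat on your site, and leave alone any
# parameters that serve pages holding genuine search demand.
ROBOTS_TXT = """\
User-agent: *
Disallow: /*?sessionid=
Disallow: /*&sessionid=
Disallow: /*?sort=
Disallow: /*&sort=
"""

def disallow_rules(robots_txt):
    """Pull the Disallow paths out of the draft robots.txt."""
    return [line.split(":", 1)[1].strip()
            for line in robots_txt.splitlines()
            if line.lower().startswith("disallow:")]

def rule_matches(rule, url_path):
    """Simplified approximation of Google's wildcard matching:
    '*' matches any run of characters, '$' anchors the end."""
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(pattern, url_path) is not None

def is_blocked(url_path):
    return any(rule_matches(rule, url_path) for rule in disallow_rules(ROBOTS_TXT))

# Sanity-check the draft rules against a few representative URLs
for path in ["/shoes?sort=price", "/shoes?colour=red", "/cart?sessionid=abc123"]:
    print(path, "->", "blocked" if is_blocked(path) else "allowed")
```

Running draft rules against your exported crawl list before deploying them is a cheap way to make sure you're only blocking the parameters you intend to.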