Duplicate content is one of the red flags most business owners and webmasters have heard of, likely thanks to SEOs. Here’s the issue with it: Google doesn’t penalise websites for duplicate content directly, but it can have an indirect impact on your organic visibility. In my experience, the real problem is competing pages, also known as cannibalisation, where multiple pages are too similar in nature and compete for the same terms. Other symptoms include duplicate page titles or duplicate meta descriptions, caused by a lack of uniqueness across pages.
The most comprehensive tool for identifying cannibalisation, or duplicate content, is SiteBulb in my opinion, as it has a dedicated feature for it. You can also look at Performance in Google Search Console: filter by keyword (often flagship terms), then toggle the URL tab and see if that keyword is generating impressions for more than one page. I then check whether the pages receiving impressions are very similar in nature. On larger sites I do this in bulk, but for smaller sites with a handful of flagship keywords, I’ll do it manually. You can also look in Google Search Console > Coverage > Excluded; pages listed under Discovered – Currently Not Indexed often indicate that low-quality content and duplication could be a culprit.
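For the bulk approach on larger sites, the Search Console check above can be scripted. The sketch below assumes a performance export with `query`, `page` and `impressions` columns (the column names and the inline sample data are my own assumptions, so match them to your actual export) and flags any query where impressions are split across more than one URL:

```python
# Hedged sketch: flag queries whose impressions are split across
# multiple URLs in a Google Search Console performance export.
# The column names and sample rows below are assumptions for
# illustration -- adjust them to match your real CSV export.
import csv
import io
from collections import defaultdict

sample_export = """query,page,impressions
seo durham,/seo-durham/,120
seo durham,/blog/seo-tips-durham/,95
link building,/link-building/,300
"""

# Collect the set of pages that received impressions for each query.
pages_per_query = defaultdict(set)
for row in csv.DictReader(io.StringIO(sample_export)):
    pages_per_query[row["query"]].add(row["page"])

# Any query with more than one receiving page is a cannibalisation candidate.
cannibalised = {q: sorted(p) for q, p in pages_per_query.items() if len(p) > 1}
for query, pages in cannibalised.items():
    print(f"{query}: {len(pages)} competing pages -> {', '.join(pages)}")
```

Pages flagged this way still need the manual "are they actually similar?" check described above; impressions on two URLs for one query is a symptom, not proof.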
Once you’ve identified pages that are very similar in nature, you have a couple of options: the first is to consolidate the pages together; the second is to try to break apart competing pages with separate messaging. You should also underpin this by identifying why the duplicate pages came about in the first place; if the cause is allowed to continue, the problem will compound as the site grows.
In my experience, it can actually be the fault of the SEO consultant managing your site. I’ve seen many content writers given a brief from an SEO containing a list of keywords and phrases the piece needs to include; this can often dictate the topic and theme of the blog post, page or guide, and over time, with different writers and multiple SEOs, similar content stacks up. There are also deliberate attempts to rank for local terms, for example pages that reuse 80% of the same content and simply switch out the location. This is also very rife; ironically, in my industry, I see loads of SEO people targeting Durham who are actually based in Somerset or some other distant place.
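You can put a rough number on that "80% of the same content" pattern with a simple similarity check. This is a minimal sketch using Python's standard-library `difflib`; the two strings stand in for the extracted body text of two location pages (in practice you'd scrape and strip the pages first), and the 0.8 threshold is my own assumption, not an official cut-off:

```python
# Hedged sketch: estimate how much two templated location pages overlap.
# The page texts below are hypothetical stand-ins for extracted body copy.
from difflib import SequenceMatcher

page_a = "We provide expert SEO services in Durham for local businesses."
page_b = "We provide expert SEO services in Somerset for local businesses."

# ratio() returns a 0..1 similarity score based on matching character runs.
ratio = SequenceMatcher(None, page_a, page_b).ratio()
print(f"Similarity: {ratio:.0%}")

# An assumed threshold: treat ~80%+ overlap as likely templated duplication.
if ratio > 0.8:
    print("High overlap -- likely templated duplicate content.")
```

For whole sites you'd run this pairwise over crawled pages (or use shingling to scale), but even this crude check makes the location-swap template obvious.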
Additional comments when it comes to duplicate content
- When it comes to duplicate content, it always pays to consider the current site structure when introducing new content: if you have average or below-average length pages, is there scope to merge pages together?
- The risk of duplicate content is much higher on larger sites with thin content. For example, if the target length of a blog post is 300 words (you should work to the topic’s requirements rather than a word count, but I guess this is the real world!), then you are at higher risk. The risk increases as the site grows without diversifying its topics.
- Look at alternative content methods; for example, on this site I’ve started introducing videos on the homepage and I’m currently working on infographics – diversifying your media types provides some protection against duplication.
- Carefully consider which pages target which keywords and which pages support them, and ultimately write for the user first rather than for SEO – although do keep SEO in mind within your user-centric content.