When you are browsing a website you will likely have noticed the secure padlock, or, at the other end of the spectrum, a big warning that a site is not secure. When I first started in SEO, SSL certificates were an expensive addition to a website, often reserved for large organisations. They are now given out freely to new sites, often at a hosting level, which is fantastic – but there are still some smaller pitfalls website owners can fall into.
Most of the issues I see with websites using incorrect or dated security protocols stem from manual error. This includes loading HTTP:// resources on an https:// page, which breaks the secure connection, and pulling in third-party code through tools like Google Tag Manager, which can do the same. There are also secondary issues whereby sub-domain and www/non-www redirection doesn't always take place. I'll walk you through some tips I use to identify, fix and prevent these types of issues. They can impact SEO dramatically: firstly in rankings, because the site is not secure, but also because any traffic you do acquire from other channels will likely be put off by the warnings now built into most major browsers. If you are looking at specific SSL certificate errors, I have written a dedicated SSL error guide.
There are some quick wins here, and most SEO tools will identify these with no problem. I'll give a special mention to Sitebulb, which assesses all subdomains and gives a rigorous assessment of SSL. The other tools are fine too – be that SEMrush, Ahrefs or Screaming Frog. These tools do a good job of identifying such issues, but if you don't have a subscription, another way is to find a page with a broken SSL connection and view the source code. A quick find for HTTP:// will often flag up any links that do not adhere to the https policy. You may also need to log into your Google Tag Manager account, if you use it, and check whether any of the third parties are using the HTTP protocol – perhaps there are some tags in there from many moons ago which need updating.
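The "view source and find HTTP://" check above can also be scripted. Here's a minimal sketch in Python – the helper name and regex are my own and only approximate what a proper crawler does – which flags src/href attributes in a page's source that still use the plain http:// scheme:

```python
import re

# Matches src="http://..." or href='http://...' – a rough approximation
# of the manual "find HTTP://" check, not a full HTML parser.
INSECURE_ATTR = re.compile(r"""(?:src|href)\s*=\s*["'](http://[^"']+)["']""",
                           re.IGNORECASE)

def find_insecure_resources(html: str) -> list[str]:
    """Return every http:// URL referenced via src/href in the page source."""
    return INSECURE_ATTR.findall(html)

page = '''
<img src="http://example.com/logo.png">
<link rel="stylesheet" href="https://example.com/style.css">
<script src="HTTP://cdn.example.com/old-tag.js"></script>
'''
print(find_insecure_resources(page))
# ['http://example.com/logo.png', 'HTTP://cdn.example.com/old-tag.js']
```

This is deliberately rough – a regex is no substitute for a real HTML parser – but it mirrors the manual find well enough for a quick triage.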
Typically I just change the source URL to point to HTTPS, which works 99% of the time. Nowadays, if a website doesn't support https then it's probably not a site you want to be working with or tagging from. There is also a secondary issue you may need to resolve here: the root cause of why these tags (or hardcoded/automated links) are being implemented. You may need to establish a policy that people updating the website use the https:// protocol as standard. Typically, if I'm linking to a site, I'll render the page in my browser and then copy and paste the URL – that way you know the SSL version works and there is little room for typos.
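Swapping a source URL from http to https is mechanical enough to script. A small sketch using only the standard library (`to_https` is my own name, not a real tool):

```python
from urllib.parse import urlsplit, urlunsplit

def to_https(url: str) -> str:
    """Rewrite an http:// URL to https://, leaving any other scheme untouched."""
    parts = urlsplit(url)
    if parts.scheme.lower() == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

print(to_https("http://example.com/widget.js"))  # https://example.com/widget.js
```

As above, the swap only helps if the https version actually resolves, so check the rewritten URL in a browser (or with a HEAD request) before committing the change.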
The first assessment to make when adding new tags to the site is whether they use SSL protocols, which I'd say 99% of sites now do. Look for any HTTP:// links in their tag code. You can also pick this up by running an SEO crawl of the site. If you are adding links to your site regularly, ensure the people updating the site render the URL in the browser before copying and pasting; they will likely see any https errors when the page renders, as these warnings are built into most browsers. Lastly, if you run a site-wide SEO crawl, you get the benefit of having older pages assessed too – pages you may not have looked at for years, which may not be subject to the global domain policy or may still contain stray HTTP:// links.
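The site-wide crawl can be sketched too. This is a toy breadth-first crawler, not a replacement for the tools above: `fetch` is assumed to be a callable you supply (e.g. a wrapper around urllib.request.urlopen), and all the names here are my own.

```python
from collections import deque
from urllib.parse import urljoin, urlsplit
import re

HREF = re.compile(r"""href\s*=\s*["']([^"']+)["']""", re.IGNORECASE)

def extract_internal_links(html, base_url):
    """Resolve every href against base_url and keep links on the same host."""
    host = urlsplit(base_url).netloc
    links = set()
    for href in HREF.findall(html):
        absolute = urljoin(base_url, href)
        if urlsplit(absolute).netloc == host:
            links.add(absolute)
    return links

def crawl(start_url, fetch, max_pages=500):
    """Breadth-first crawl; returns pages whose source contains http:// refs."""
    seen, queue, flagged = {start_url}, deque([start_url]), []
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        html = fetch(url)
        if re.search(r"http://", html, re.IGNORECASE):
            flagged.append(url)  # page still references a plain-http resource
        for link in extract_internal_links(html, url):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return flagged

# Simulated site standing in for real fetches over the network.
pages = {
    "https://example.com/": '<a href="/old-page">old</a>',
    "https://example.com/old-page": '<img src="http://example.com/pixel.gif">',
}
flagged = crawl("https://example.com/", lambda u: pages.get(u, ""))
print(flagged)  # ['https://example.com/old-page']
```

The point of the sketch is the shape of the check: every internal page gets visited once, so legacy pages you haven't opened in years are assessed alongside the new ones.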
A few more notes on old security protocol issues for SEO:
- Sometimes a browser will serve warnings not just for old security protocols but for hacked sites too – so be sure to differentiate between the two if they are shown on your site.
- You can see additional feedback on your SSL links within Search Console > Coverage; if any URLs use old security protocols you can often find them in the URL list associated with errors. You can also see any warnings or manual actions taken against the site in the event that security issues have been flagged.
- I've seen a couple of times that a "not secure" warning is triggered on one isolated computer; this does happen, and it's a problem with the computer rather than the website. A good follow-up diagnostic is to check whether the error is being served for multiple sites.
- Most hosts, DNS providers and CMS platforms do a good job of steering you in the right direction. If you log in and see several errors on your CMS, or haven't updated plugins in a very long time, it might be worth biting the bullet and commissioning an update – in my experience, sites with a dated code base are often a ticking time bomb when it comes to getting hacked.
- I'd also encourage getting your site moved over to SSL by a professional developer or someone with experience in this field. I do occasionally see a redirect loop between https and http, which can break your site on the front end completely.
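That looping-link scenario in the last bullet is straightforward to check for. A sketch, with a simulated redirect map standing in for real HEAD requests (`next_hop` and the example URLs are assumptions, not a real API):

```python
def find_redirect_loop(start_url, next_hop, max_hops=10):
    """Follow redirects via next_hop(url) -> Location-or-None; report a loop.

    next_hop is assumed to be a callable you supply, e.g. a wrapper that
    issues a HEAD request and returns the Location header (or None for 200).
    """
    seen = []
    url = start_url
    while url is not None and len(seen) < max_hops:
        if url in seen:
            return seen[seen.index(url):]  # the looping segment
        seen.append(url)
        url = next_hop(url)
    return None  # no loop detected within max_hops

# Simulated misconfiguration: http redirects to https, which bounces back.
hops = {"http://example.com/": "https://example.com/",
        "https://example.com/": "http://example.com/"}
print(find_redirect_loop("http://example.com/", hops.get))
# ['http://example.com/', 'https://example.com/']
```

If this returns a looping segment for your homepage, that's exactly the https/http bounce described above, and it needs fixing at the server or CDN redirect rules rather than in the page source.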