Just in case you haven’t heard yet, Yandex, a Russian search engine (the fourth biggest in the world), had some of its proprietary source code leaked. A large portion of that code was a list of 17,854 ranking factors that Yandex either uses now or has used in the past to decide how to rank different pages. The number was initially reported as 1,922 but was later revised after deeper exploration of the code.
I’m sure I don’t have to explain why this is being described as the biggest event in SEO in years: it’s the closest we’re going to get to understanding how search engines rank our content. You may be thinking it’s not worth the bother because Yandex is a tiny search engine compared to Google, but here’s why there’s so much buzz. Yandex was started as a simplified Google clone, it has its own version of RankBrain called MatrixNet, it uses PageRank, and a lot of ex-Google employees work at Yandex, so it’s not far-fetched that they brought some ideas over from Google and implemented them there.
What Was Revealed In The Leak?
Special thanks to Alex Buraks for his Twitter threads that helped break down the ranking factors into the most relevant ones (note: a lot of these are marked as unused or deprecated, but they’re still a good indication of the sort of signals that search engines may consider). For SEOs focusing their efforts on Google, these are the key takeaways regarding page ranking factors:
- Document age – The older the better, provided the page is updated regularly.
- Last update date – recently updated pages are better.
- User behaviour – High CTR, low bounce rate and other good signals will boost rankings.
- The traffic that your site receives – The more traffic your site receives, the better. The percentage of that traffic that comes from organic search is also considered.
- Average domain positions across all queries – The higher your average position, the better.
- Numbers inside the URL – These have a negative effect on rankings, especially if the numbers are an old date.
- The number of slashes in the URL – The fewer slashes you have in your URL, the better.
- Crawl depth – the closer your page is to your main page, the more important it is in the search engine’s eyes.
- The number of errors you have – The fewer errors, the better, so run regular site audits (see the sketch after this list for a quick status-code check).
- Backlinks from main pages – These carry more weight than backlinks from internal pages.
- Traffic from Wikipedia – Search engines seem to prioritise sites like Wikipedia, so receiving a backlink from them helps rankings more than a normal backlink does.
- Is your URL the last one in a user’s session? If so, it shows that the user found what they needed, which is a positive ranking factor.
- Number of users who bookmarked your URL – The more bookmarks, the better.
- Keywords in URL – Rather obvious one but make sure to include the exact keyword in your URL. If it’s a long-tail keyword, try to use a maximum of 3 or 4 words as the slug.
- Broken embedded videos – These are a negative ranking factor, so try to fix them as soon as possible.
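On the errors point above, here’s a minimal sketch of the kind of status-code check a basic site audit might include. It assumes the Python `requests` library is installed and that `urls.txt` (a file name I’ve made up for illustration) lists one URL per line; dedicated audit tools go far beyond this.

```python
# Minimal status-code check for a list of URLs.
# Assumes `requests` is installed and urls.txt holds one URL per line
# (both are illustrative assumptions, not anything from the leak).
import requests

with open("urls.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        response = requests.get(url, timeout=10, allow_redirects=True)
        if response.status_code >= 400:
            print(f"{response.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"ERROR  {url}  ({exc})")
```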
What Does This Teach Us About SEO Best Practices?
A lot of the above findings are common SEO best practices but there are a few that may cause a raised eyebrow or two, even from the most experienced SEOs. Here’s a list of the best practices that you should be following according to the Yandex ranking factors:
Update the content regularly
It’s well known that Google prefers fresh content; even better if it’s a page that was published a long time ago but is updated on a regular basis. That ticks off two ranking factors: the age of the URL and the last update date.
But how often should you update your content? It depends, but individual studies suggest every four to nine months. When updating, be sure to actually provide more value by correcting errors, adding supporting data, adding more content or reorganising the layout; do not just rephrase what was already written, because you won’t fool Google.
How much should you change? Your aim should be to change at least 10% of the content.
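If you want a rough way to gauge how much of a page actually changed between two versions, a simple diff-based comparison is enough. The sketch below is illustrative only: the file names are my own, and the 10% threshold is just the guideline mentioned above, not a figure from the leak.

```python
# Rough estimate of how much of a page's text changed between versions.
from difflib import SequenceMatcher

def content_change_ratio(old_text: str, new_text: str) -> float:
    """Return the approximate fraction of content that changed."""
    similarity = SequenceMatcher(None, old_text, new_text).ratio()
    return 1 - similarity

old = open("old_version.txt", encoding="utf-8").read()  # previous copy of the page
new = open("new_version.txt", encoding="utf-8").read()  # refreshed copy of the page

change = content_change_ratio(old, new)
print(f"Approximately {change:.0%} of the content changed")
if change < 0.10:
    print("Below the ~10% guideline - consider a more substantial refresh")
```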
Don’t neglect underperforming pages
SEOs have always been taught to double down on what’s working, but we far too often fall into the trap of completely neglecting what isn’t. If Google does consider the average domain position across all queries as a ranking factor, neglecting poorly performing pages is a fatal mistake and likely why you can’t get your top performers to budge. Either delete the poor performers or revamp them.
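To see where your average position sits and which queries are dragging it down, a Search Console performance export is a reasonable starting point. This is a hedged sketch, not an official workflow: the file name `queries.csv` and the column headings (`Top queries`, `Position`) assume a standard Search Console queries export, so adjust them to match your own file.

```python
# Average position across all queries, plus the worst offenders.
import csv

with open("queries.csv", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))  # assumed columns: "Top queries", "Position"

positions = [float(row["Position"]) for row in rows]
average_position = sum(positions) / len(positions)
print(f"Average position across {len(positions)} queries: {average_position:.1f}")

# Queries dragging the average down - candidates to revamp or retire.
laggards = sorted(rows, key=lambda r: float(r["Position"]), reverse=True)[:10]
for row in laggards:
    print(f'{row["Position"]:>6}  {row["Top queries"]}')
```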
Limit number of slashes in URL
This one’s a bit controversial because, at least according to the Yandex source code, the fewer slashes you have in your URL the better, yet we’ve always been taught that setting up good navigation through URLs and breadcrumbs is key, which results in a fair few slashes. As you can see, this one’s a bit of a toss-up.
Personally, I stick to the age-old adage: do what’s best for the user. Following this advice, I would definitely continue using parent pages but would consider whether there are any unnecessary steps in the chain.
The following is an example of a URL that needs some optimisation:
/guides/technical-seo-guides/search-console-errors/404-error
This could be shortened down to:
/guides/technical-seo/404-error
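A quick way to spot URLs that could benefit from this kind of flattening is to count the path segments across a crawl export or sitemap. The snippet below is a rough sketch; the three-segment ceiling is my own illustrative threshold, not a number taken from the leak.

```python
# Flag URLs with deep paths (many slashes) as candidates for flattening.
from urllib.parse import urlparse

def path_depth(url: str) -> int:
    """Count the non-empty path segments in a URL."""
    return len([s for s in urlparse(url).path.split("/") if s])

urls = [
    "https://example.com/guides/technical-seo-guides/search-console-errors/404-error",
    "https://example.com/guides/technical-seo/404-error",
]

for url in urls:
    depth = path_depth(url)
    flag = "  <- consider flattening" if depth > 3 else ""
    print(f"{depth} segments: {urlparse(url).path}{flag}")
```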
Final Thoughts
As stated earlier, a lot of the ranking factors in the leaked Yandex source code are marked as unused or deprecated, but they still give us a good indication of the sort of things search engines consider when ranking pages. These points should give you a reason to re-evaluate your stance on SEO best practices and how you implement your SEO strategy; these findings have certainly led to some changes in mine.
If you’ve just read this article and thought “wow, there’s a lot of work that I need to do on my site”, why not reach out to see how we can help you boost your SEO performance?