Algorithm Update – LinkRisk and LinkValue – “Critchlow”
1st April 2018
Periodically we release major updates to the way we calculate our LinkRisk algorithm. Traditionally these updates get named after someone to make it easier for us to discuss the impact of particular updates over time.
We term this period of update a LinkRisk ‘Dance’, which is a throwback to when Google used to announce ‘Google Dances’ during times when they’d update and rebuild their indexes (gosh, that seems a long, long time ago now).
Today’s update contains two major steps forward for us and lays the groundwork for future updates to our two main scoring metrics, LinkRisk and LinkValue.
And today we are naming this update ‘Critchlow’ after Will Critchlow of Distilled. These updates are usually named after someone in the industry who we feel encapsulates some of the reasoning behind the update, and this one is no exception…
Before we dive in, it’s worth reminding ourselves why we have two metrics:
LinkRisk – This metric runs from 1–1000 (1 being great, 1000 terrible) and aims to give users a guide to the ‘intent to manipulate Google’ behind each link. It’s used to gauge the risk of any link contributing to Google applying a manual action, and it generally forms the framework for anyone trying to insure themselves by creating and maintaining a disavow file.
LinkValue – This metric gives each link a score of 0–5 (0 being no value, 5 being a lot) and is aimed at people who want to understand the relative ‘link equity valuation’ of their inbound links. It takes into account things like how often a linking domain gets disavowed, along with a wealth of historical trust signals for all known domains.
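If it helps to picture the two scales side by side, here is a purely illustrative sketch of a per-link record with range checks. The class and field names are our own invention for this post, not the production schema:

```python
from dataclasses import dataclass

@dataclass
class LinkScores:
    """Illustrative container for the two per-link metrics (names are hypothetical)."""
    link_risk: int     # 1-1000: 1 = great, 1000 = terrible ('intent to manipulate Google')
    link_value: float  # 0-5:    0 = no value, 5 = a lot of link equity

    def __post_init__(self):
        # Reject scores outside the documented ranges.
        if not 1 <= self.link_risk <= 1000:
            raise ValueError("LinkRisk must be between 1 and 1000")
        if not 0 <= self.link_value <= 5:
            raise ValueError("LinkValue must be between 0 and 5")

# A low-risk, high-value link:
good_link = LinkScores(link_risk=12, link_value=4.5)
```

The point of keeping the two numbers separate is exactly what the definitions above describe: one answers “could this link get me penalised?”, the other answers “is this link actually passing me any equity?”.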
Today’s two updates
LinkRisk scoring update: Critchlow
This update implements a further method by which our award-winning LinkRisk algorithm uses machine-learning principles to feed the wisdom of our users back into the core reasoning of the scoring system, producing more accurate results. From today, the training dataset our systems use to teach the LinkRisk algorithm how to score good and low-risk domains has been increased by a factor of three. The backend methodology governing how that training set shapes the scores has also been updated, allowing us to focus the training on the sites that appear most often across our clients’ profiles. This lets us make better classifications of the sites that really matter in your day-to-day audit work.
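The idea of focusing training on the domains seen most often across client profiles can be sketched as a simple frequency-weighted sampling step. This is a toy illustration with hypothetical names, not the production pipeline:

```python
from collections import Counter
import random

def build_training_sample(profiles, k, seed=0):
    """Pick k domains for labelling, weighted by how many client profiles they appear in.

    `profiles` is a list of sets of domains (one set per client profile).
    Domains seen in more profiles are proportionally more likely to be
    selected, so the classifier spends its capacity on the sites that
    show up most in real audit work. (Illustrative only.)
    """
    counts = Counter(domain for profile in profiles for domain in profile)
    domains = list(counts)
    weights = [counts[d] for d in domains]
    rng = random.Random(seed)
    return rng.choices(domains, weights=weights, k=k)

# Three toy client profiles; a.com appears in all of them.
profiles = [{"a.com", "b.com"}, {"a.com", "c.com"}, {"a.com", "b.com"}]
sample = build_training_sample(profiles, k=10)  # sample biased toward a.com
```

The weighting is the whole trick: a domain that turns up in one profile out of thousands barely influences the training, while a domain in everyone’s backlink profile gets classified with much more care.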
LinkValue scoring update
Since the launch of LinkValue we have been working with clients to use it to see what true equity is left in both their own sites and their competitors’. We wanted a way for users to glimpse behind the curtain that Google has drawn across everything and see the true net value of any site, including the impact of any clean-up they might have done.

Today’s update takes that a step further and uses our training dataset to enhance the LinkValue algorithm’s ability to decrease the equity value of a site if it has historically been heavily disavowed, or if it meets the criteria for our ‘Penguin Proxy’ to ignore some of its value.
It also further strengthens the link between the machine elements of both scoring systems.
These two updates represent a significant leap forward in our ability to use the machine training data to further enhance our two metrics and make them even more accurate. There is a lot of complicated maths involved, Will… you would be proud.
To get the updated LinkRisk and LinkValue scores you will need to rescan your profiles. If that causes you any headaches with credits for the month, please feel free to ask us directly and we will always see if we can help with the scan.
As always, we like to end these things by breaking out of the technical detail and poking some fun at the person we named the update after… apologies, Will… we love you really… This is as close to dancing as we could get…