Google is now using CTR for Rankings

June 11, 2012 // Search Engines, seo

With Google’s various animal-related algorithm updates happening alongside its apparent “wake up” to websites buying links, the landscape is always changing. People tend to associate updates with one of two areas:

  1. A variable changes in Google’s algorithm that affects on-site signals (such as internal anchor text or meta data values)
  2. Google changes something related to off-site signals (such as social signals or how anchor text match types are valued)

The recent algorithm changes such as Panda and Penguin illustrate just that: Panda was the much-needed filter to remove low-quality content from the index, and Penguin the antidote to the mass of low-quality links that often appear unnaturally on makeshift blogs. But what about other algorithm behaviours and updates? There are around 500 algorithm changes a year, each targeting different areas such as security, speed, user experience, search results, universal results, Google+, local and paid listings. However, there is one area that doesn’t often get much discussion, and that’s the changes Google makes to rankings based on performance in the SERPs.

As well as the on-site and off-site algorithm factors, I believe there are other performance-based updates happening that also contribute to where your website ranks. While the debate over whether bounce rates impact search results has been ongoing for many years, Matt Cutts recently said that the web spam team does not share that data; whether you believe that is up to you. The area I’m looking at is CTR in the SERPs, and how Google testing your website at various positions may help determine its final ranking.

The Theory:

For the very broad, generic keywords that often have the highest search volumes, you’ll find that most of the time there is actually very little difference between the competing websites: they target the same keywords on landing pages, have similar content strategies (blogs and news), are secure and technically sound, and have great-quality links from reputable sources. So how is Google meant to decide who ranks where? Social signals matter, obviously, but even those can be relatively similar in some verticals. One possible answer is that Google tests how websites perform in high positions and then makes an assessment of where in the top 10 any given website should rank.

The data:

I started noticing that one website I worked on developed a pattern in the rankings over a set period of time. The pattern consisted of the following:

  • Google was keeping the page 1 search results for a top keyword fresh by swapping in 1-2 websites from page 2 to replace 1-2 websites already featured on page 1, roughly every 3 weeks
  • Once on the first page, a website would immediately move to a top 2-4 position for 3-5 days
  • Next it would move to a lower position, such as 5-7, for around 10 days
  • It would then move to the bottom of page 1
  • The websites were then swapped out for new websites from page 2
I’ve been plotting positions for the last few months, and you can see that since May this has happened pretty consistently:

The sections below show two cycles in full:

The graphs above are for one website; if you cross-reference with two other websites that have acted in the same way (for the same keyword), you’ll notice the exact same trend:

Each of these three websites makes the jump from page 2 to page 1 for a similar amount of time and replaces a website already featured on page 1.
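To make the cycle concrete, here’s a small sketch of the pattern I observed, expressed as a position-band lookup. This is purely descriptive of my own plotted data, not anything Google has published; the day ranges are the ones noted in the list above, and the `expected_position_band` function is my own illustrative naming.

```python
# Illustrative sketch of the observed ~3-week rotation cycle for a site
# freshly promoted from page 2. Day ranges match the pattern described above;
# this models my observations, not Google's actual mechanism.

def expected_position_band(days_on_page_one):
    """Return the rough ranking band observed at a given point in the cycle."""
    if days_on_page_one <= 5:        # immediate jump to the top for 3-5 days
        return "2-4"
    elif days_on_page_one <= 15:     # ~10 days mid-page
        return "5-7"
    elif days_on_page_one <= 21:     # bottom of page 1 before the swap
        return "8-10"
    else:                            # rotated back out to page 2
        return "page 2"

print(expected_position_band(3))   # "2-4"
print(expected_position_band(10))  # "5-7"
print(expected_position_band(25))  # "page 2"
```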
So why is this happening?

It’s debatable (as everything always is in SEO), but if I had to guess I’d say Google is either using this method to test CTRs and then deciding where sites should rank in the long term (there are two websites which started this pattern but stayed on page 1 after the first cycle), or this is a method of ensuring that 15-20 websites each get a percentage of exposure on page 1, which would also make sense given how many websites have equal signals.
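The first guess above can be sketched as code: compare a site’s click-through rate during its test window against a baseline CTR curve by position, and keep it on page 1 only if it performs at or above the curve. To be clear, everything here is hypothetical: the baseline figures are made-up placeholders, not published Google data, and `keeps_page_one_slot` is my own illustrative function.

```python
# Hypothetical sketch of the CTR-test theory. Baseline CTR-by-position
# values are invented placeholders for illustration only.

BASELINE_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.06,
                6: 0.05, 7: 0.04, 8: 0.03, 9: 0.025, 10: 0.02}

def keeps_page_one_slot(observed, threshold=1.0):
    """observed: {position: ctr} measured while the site is rotated
    through different page 1 positions. Returns True if, on average,
    the site's CTR met or beat the baseline for those positions."""
    ratios = [ctr / BASELINE_CTR[pos] for pos, ctr in observed.items()]
    return sum(ratios) / len(ratios) >= threshold

# A site that out-performed the baseline during its test window:
print(keeps_page_one_slot({3: 0.12, 6: 0.06}))  # True
# A site that under-performed and would rotate back to page 2:
print(keeps_page_one_slot({3: 0.05, 6: 0.02}))  # False
```

Under this model, a site that consistently beats the expected CTR at each test position would stay on page 1 after its first cycle, which would explain the two websites that stopped rotating.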

I’ve only benchmarked these results for one keyword, but I also know something similar happens in another vertical, so this may be much bigger than first thought. Regardless, I would very much like to hear whether anyone else has observed such activity in Google.

Matt Ridout

About the author

My name is Matt Ridout. I’ve been working in digital marketing for 9 years; I’ve worked for agencies and am currently Head of SEO at a fashion startup called Farfetch. I try to test my own theories.


  1. Hi Matt Ridout,
    Thanks for this great post!
    After the Google Panda and Penguin updates, the CTR data will definitely bring a change to online marketing. It will help in getting good ranks in the SERPs.

    Thanks again

  2. Nothing new. Google said something about that a while back.
    If a website is ranked high by the algorithm, it can be brought down if users pay a visit but come back immediately to the SERPs and pick another URL.
