Google is now using CTR for Rankings
With all of Google’s various animal-related algorithm updates happening alongside its apparent “wake-up” to websites buying links, the landscape is always changing. There are two areas that people commonly associate updates with:
- A variable changes in Google’s algorithm that affects on-site signals (such as internal anchor text or metadata values)
- Alternatively, Google changes something related to external signals (such as social signals or the match type used to value anchor text)
The recent algorithm changes such as Panda and Penguin have illustrated just that: Panda was the much-needed filter to remove low-quality content from the index, and Penguin the antidote to the mass of low-quality links that often appear unnaturally on makeshift blogs. But what of other algorithm behaviours and updates? There are around 500 algorithm changes a year, each targeting different areas such as security, speed, user experience, search results, universal results, Google+, local and paid listings. However, there is one area that doesn’t get much discussion, and that’s the changes Google makes to rankings based on performance in the SERPs.
As well as the on-site and off-site algorithm factors, I believe there are other performance-based updates happening that also contribute to where your website ranks. The debate over whether bounce rates impact search results has been ongoing for many years; Matt Cutts recently said that the web spam team does not share that data, and whether you believe that is up to you. The area I’m looking at is CTRs in the SERPs, and how Google testing your website at various positions may help determine final rankings.
For the very broad, generic keywords that often have the highest search volumes, you’ll find that most of the time there is actually very little difference between competing websites: they target the same keywords on landing pages, have similar content strategies (blogs and news), are secure and technically sound, and have great-quality links from reputable sources. So how is Google meant to decide who ranks where? Social signals obviously play a part, but even those can be relatively similar in some verticals. One possible answer is that Google is testing how websites perform at high positions and then making an assessment of where in the top 10 any given website should rank.
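To make the idea concrete, here is a minimal sketch of how a search engine *could* score a result’s CTR against a baseline curve for its position. The baseline numbers and the `ctr_signal` function are entirely my own invention for illustration; nothing here is a confirmed part of Google’s algorithm.

```python
# Hypothetical sketch: compare a page's observed CTR at a given position
# against an assumed baseline CTR curve for that position.
# The EXPECTED_CTR values below are invented for illustration only.

EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.02}

def ctr_signal(position: int, impressions: int, clicks: int) -> float:
    """Ratio of observed CTR to the expected CTR for this position.
    A value above 1.0 means the result out-performs its slot;
    below 1.0 means it under-performs."""
    observed = clicks / impressions
    return observed / EXPECTED_CTR[position]

# A site tested at position 3 that draws a 14% CTR would look strong:
print(round(ctr_signal(3, impressions=1000, clicks=140), 2))  # → 1.4
```

Under this (assumed) model, Google could promote a site for a few days, measure whether its signal exceeds 1.0 at the test position, and use that to settle ties between otherwise near-identical sites.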
I started noticing that one website I worked on developed a pattern in the rankings over a set period of time. This pattern consisted of the following:
- Google was keeping the page 1 search results for a top keyword fresh by swapping in one or two websites from page 2 to replace one or two websites already featured on page 1, every three weeks
- Once on the first page, a website would immediately move to a position between 2 and 4 for 3-5 days
- Next, the website moved down to a position around 5-7 for roughly 10 days
- The website then moved to the bottom of page 1
- Finally, it was swapped out for two new websites from page 2
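The cycle above can be sketched as a simple schedule. The timings and position bands come from my own observations of this one keyword; the function name and the day boundaries are just illustrative assumptions, not anything Google has documented.

```python
# Illustrative sketch of the observed ~3-week rotation pattern for a
# website newly promoted from page 2. Day boundaries are approximate
# and taken from my own tracking, not from any official source.

def rotation_cycle(day: int) -> str:
    """Map a day within one cycle to the observed position band."""
    if day < 4:      # first 3-5 days after promotion
        return "position 2-4"
    elif day < 14:   # next ~10 days
        return "position 5-7"
    elif day < 21:   # remainder of the cycle
        return "bottom of page 1"
    else:            # cycle ends, site is rotated out
        return "swapped back to page 2"

for day in (0, 7, 16, 21):
    print(f"day {day}: {rotation_cycle(day)}")
```

Tracking daily rankings against a schedule like this is also how you could test whether the same pattern exists in your own vertical.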
Debatable (as everything always is in SEO), but if I had to guess I’d say Google is either using this method to test CTRs and then deciding where a website should rank in the long term (two websites that started this pattern stayed on page 1 after the first cycle), or this is a method of ensuring that 15-20 websites each get a share of exposure on page 1, which would also make sense given how many websites have near-equal signals.
I’ve only benchmarked these results for one keyword, but I also know something similar happens in another vertical, so this may be much bigger than first thought. Regardless, I would very much like to hear whether anyone else has observed this kind of activity in Google.