Google’s latest Penguin update exploded into action yesterday (October 4th), the fifth such update in the last 18 months. According to Matt Cutts, announcing it on Twitter, the update affects around 1% of search queries to a noticeable degree.
Penguin 2.1 launching today. Affects ~1% of searches to a noticeable degree. More info on Penguin: http://t.co/4YSh4sfZQj
— Matt Cutts (@mattcutts) October 4, 2013
The Penguin evolution thus far reads as follows:
1. Update 1 – April 24th 2012 – 3.1% of queries affected
2. Update 2 – May 26th 2012 – less than 0.1% of queries affected
3. Update 3 – October 5th 2012 – 0.3% of queries affected (English-language queries)
4. Update 4 – May 22nd 2013 – 2.3% of queries affected
5. Update 5 – October 4th 2013 – ~1% of queries affected
The Penguin algorithm, if you didn’t already know, is Google’s way of penalising websites that intentionally violate its Webmaster Guidelines through black hat SEO techniques such as spamming, link farming and submissions to low-PR directories. It now looks very closely at all incoming links, especially those coming from unmoderated sites and blogs, and at sites where exact match anchor text appears in large numbers of low-quality backlinks.
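To make that last point concrete, here’s a minimal sketch of what an unnatural anchor-text distribution looks like. This is my own illustration, not any Google tool – the function name, URLs and anchors are all made up – but it shows the kind of profile where one exact match phrase dominates the backlinks:

```python
from collections import Counter

def anchor_text_profile(backlinks):
    """Return the share of each anchor text across a list of
    (source_url, anchor_text) backlink pairs."""
    counts = Counter(anchor.strip().lower() for _, anchor in backlinks)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.items()}

# Three of four links use the exact-match phrase "web programmer" --
# the kind of skewed distribution Penguin is reported to flag.
links = [
    ("http://blog-a.example.com/post", "web programmer"),
    ("http://dir-b.example.com/listing", "web programmer"),
    ("http://forum-c.example.com/thread", "web programmer"),
    ("http://news-d.example.com/story", "John's site"),
]
profile = anchor_text_profile(links)
print(profile["web programmer"])  # 0.75
```

A natural link profile tends to spread across brand names, bare URLs and incidental phrases; a profile where one commercial keyword takes 75% of the anchors is a red flag however you measure it.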
Furthermore – and this is long overdue – it also appears to take into account just how many of those incoming backlinks actually get clicked and followed to your website. After all, they get indexed (and carry some weight), but if no one is clicking those indexed links, why should they carry any weight at all? Google has apparently decided they shouldn’t.
Paid Links – The Proof is in the Pudding
Many SEO experts have claimed that Google isn’t really doing enough to penalise black hat SEO, especially sites built extensively on purchased or paid links, but I wasn’t going to gamble on their being right. Quality articles and quality backlinks are, and always should be, the order of the day, so I refused to take unnecessary chances and kept faith with Google’s promise of rewarding quality over quantity.
This approach rewarded me with a page-one ranking on some extremely competitive key phrases – ‘web programmer’, for example, where I was ranked #3 out of around 25 million pages.
But for some reason, I got curious…
Being a web programmer, I’m constantly looking for answers. I want to know how things work, why they work, and I have an abnormal desire to know the ins and outs of things most people would just leave well alone. For some reason, Google’s latest update resonated with me, and I decided to do something most would consider pretty stupid – deliberately destroy my own search engine rankings.
And before you agree with my stupidity, you should know a few things:
1. I’m doing this as a service to the SEO community. Since the introduction of Penguin (and, to some degree, Panda), I get lots of clients coming to me for SEO whose rankings were built on little more than some quite blatant black hat techniques, asking me to repair the damage. When I question the companies or individuals responsible in an attempt to ascertain just what they did to break things (it’s always worth asking), they’re inevitably very smug – after all, they took someone’s money and offered no promises or guarantees from the outset. These people really need to be stopped. There is a community of REAL SEO EXPERTS, and these people are destroying the market for all of us.
2. I’m confident I can repair my rankings. Maybe I’m naive, but I do believe the damage I do to my own rankings can be fixed. As I mentioned, people often come to me to repair their SEO, and if I can’t manage it for my own site, I’m taking myself out of the game, period.
The Experiment
The moment I heard about the latest Google update, I did something I wouldn’t normally do: I purchased some pretty low-quality backlinks and what appeared, on the surface, to be questionable social bookmarking campaigns, and ordered them to be delivered within one day. Three hours later (yes, these people apparently work that hard to take your money!) they had been delivered.
What I didn’t anticipate was the speed at which Google picked up on these incoming links. Over the following days, here is how my rankings changed for the term ‘web programmer’:
- Starting Position: #4 out of 24.7 million
- After 4 Hours: #5
- After 8 Hours: #35
- After 12 Hours: #55
- After 16 Hours: N/A (I need sleep too, you know)
- After 22 Hours: #79
- After 2 Days: #85
- After 3 days: #91
- After 4 days: #98
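Tracking like this could be automated with whichever rank-checking service you trust; the core step – finding your position in an ordered results list – is trivial. A quick sketch (everything here is hypothetical: the function, domains and results list are mine, and I’m assuming you already have the ordered URLs from your rank checker):

```python
def rank_of(domain, result_urls):
    """Return the 1-based position of the first result whose URL
    contains `domain`, or None if it isn't in the list at all."""
    for position, url in enumerate(result_urls, start=1):
        if domain in url:
            return position
    return None

# Example with a made-up results page:
results = [
    "http://rival-one.example.com/",
    "http://rival-two.example.com/services",
    "http://mysite.example.com/web-programmer",
]
print(rank_of("mysite.example.com", results))  # 3
```

Run something like this on a schedule and you get a log like the table above without losing sleep over the 16-hour check.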
The results speak for themselves, and they’re positive results for those who take SEO seriously and want to do things the right way.
The Malicious Downside
While I’m happy that Google are taking these things seriously, and acting on them quickly, what I don’t particularly like is that the door is still open to those who utilise black hat techniques – only it’s a very different door.
It’s now possible to effectively destroy your competition’s rankings, which in turn will boost your own. See a competitor ranked above you? Hit them with a plethora of cheap, nasty incoming links and I have no doubt you’ll leapfrog them – and cause them a bucketload of unwanted stress and repair costs in the process. People are going to monetise this ‘service’, and that can’t be a good thing.
Hopefully Google will eventually address this too. Who knows? One thing’s for sure – they’re on the right track. Check back later for further updates as to just how my rankings have been affected, and additionally, what I can do to fix them.
Addendum:
A few points that have already been raised:
1. Use the Google disavow tool to discount unwanted backlinks. As Google itself says:
If you believe your site’s ranking is being harmed by low-quality links you do not control, you can ask Google not to take them into account when assessing your site. You should still make every effort to clean up unnatural links pointing to your site. Simply disavowing them isn’t enough.
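For reference, the file you upload to the disavow tool is just plain text: one URL or domain per line, with `#` lines as comments. The domains below are placeholders, not real links:

```text
# Contacted the webmaster on 1/10/2013, no response
http://spam-directory.example.com/my-listing.html
# Disavow every link from this domain
domain:link-farm.example.net
```

The `domain:` prefix disavows everything from that domain, which is usually what you want for the sort of link farms these cheap campaigns use.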
2. The Twitter card I used for Matt Cutts’ tweet isn’t a plugin. Simply paste the individual tweet URL into WordPress (if you use it) and it will auto-parse and embed the card.
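In other words, in the post editor the tweet URL just sits on its own line, not wrapped in a hyperlink, and WordPress’s built-in oEmbed support does the rest. The status ID below is a placeholder, not the real tweet:

```text
Here's Matt Cutts' announcement:

https://twitter.com/mattcutts/status/0000000000
```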
Brilliant! What do you think is the solution Google will come up with to combat a competitor’s ability to take down websites like this? I’m in an industry (computer repair) where many consider themselves hackers, and I think it’s very, very tempting for companies to destroy each other this way. What is the answer?
The question should be: how could Google control that? It’s not possible unless you give Google control of every web server in the world :)