Google Panda and Penguin Updates

With so many websites relying on traffic from Google, the search engine is uniquely positioned to influence where the internet goes next. But as it continues to grow, Google’s job has become increasingly difficult. It’s not just big businesses and tech geeks who are building web pages now; almost anyone with an internet connection can throw up a page relatively easily.

This has meant that not everyone is producing a quality product, and an increasingly large percentage of people are just out to make a fast buck. As a result, the quality of websites has started to deteriorate, with more and more people pushing advertising and spam without offering anything of value.

Over the past 12 months, Google has shifted its focus to tackle those websites that offer little or nothing of interest to visitors, pushing them down the search listings while raising the profile of those websites that provide good quality original content.

Two algorithm changes – Panda and Penguin – have been key to this. Following a number of tweaks and adjustments after the original launches, the changes are now really starting to come together, making 2012 the year we will remember for Google’s efforts to improve the quality of content on the internet.

Google Panda Update

The Panda update came first, initially launched in the US in February 2011. Various additions have been made to Panda since, with the update extended in April 2011 to cover English-language queries globally. It has become the longest-running named Google algorithm update, with adjustments still being announced well into 2012.

Upon its initial launch, Google engineers Amit Singhal and Matt Cutts stated: “In the last day or so we launched a pretty big algorithmic improvement to our ranking – a change that noticeably impacts 11.8 per cent of our queries – and we wanted to let people know what’s going on.

“This update is designed to reduce rankings for low-quality sites – sites which are low-value add for users, copy content from other websites or sites that are just not very helpful.”

Ultimately, Panda was designed to deal with site quality. It aimed to lower the rank of poor quality sites and raise that of websites that put time and effort into their content. Reports at the time showed that news websites and social networking sites saw a surge in their rankings, while pages with a lot of advertising fell.

However, things didn’t run entirely smoothly and some sites complained that they were being unfairly punished in the rankings. There were some faults on Google’s end, but the search engine refused to address individual cases and instead urged developers to work hard on their site quality in line with Google’s quality guidelines.

Amit Singhal, writing on Google’s official blog, stated that tests show the algorithm is “very accurate at detecting site quality”. He added: “If you believe your site is high-quality and has been impacted by this change, we encourage you to evaluate the different aspects of your site extensively … As sites change, our algorithmic rankings will update to reflect that.”

Google’s quality guidelines include the following:

- Make pages for users, not search engines.
- Avoid tricks intended to improve SEO. Ask yourself if you’d still do this if search engines didn’t exist.
- Don’t participate in link schemes that involve anything dodgy.
- Avoid using unauthorised computer programs to submit pages. These violate Google’s Terms of Service because they use extra computing resources, and sites that employ such techniques tend to be punished in the rankings.
- Avoid hidden text and links.
- Avoid cloaking and sneaky redirects.
- Don’t send automated queries to Google.
- Don’t put irrelevant keywords in your pages.
- Don’t use duplicate content.
- Don’t create phishing sites or anything containing malware.
- If you participate in an affiliate programme, ensure that your site adds value – Google suggests using unique and relevant content to give users a reason to visit your site first.
- Submit your site for reconsideration once you’ve modified your website to fit in with these guidelines.

Google Penguin Update

So if Panda is all about quality, then what’s the deal with Penguin? Many of the areas Penguin tackled overlapped with things that Panda looked at. Deliberately duplicated content, for example, is something both algorithm updates are thought to have hit hard, pushing sites down the rankings for using software to rehash content across multiple sites or for straight up stealing content from other sites.
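The kind of duplication described above can be caught even with very simple text-similarity measures. The sketch below is purely illustrative – it is not Google’s actual algorithm, whose details were never published – but it shows the basic idea of comparing two pages by their overlapping word n-grams (“shingles”):

```python
def shingles(text, n=3):
    """Break a text into the set of overlapping word n-grams ('shingles')."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity between the shingle sets of two texts.

    1.0 means the texts share all their shingles (likely copied content);
    values near 0.0 mean there is little or no phrase-level overlap.
    """
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A copied or lightly “spun” page keeps long runs of identical phrases, so its shingle overlap with the original stays high even when individual words are swapped – which is why this crude measure is surprisingly effective at flagging rehashed content.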

But Penguin, launched in April 2012, went one step further and looked at a range of more specific ‘black hat’ SEO tactics. Matt Cutts explained: “In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked.”

These techniques are surprisingly common and include everything from keyword stuffing to dodgy link schemes. Penguin targeted sites that employed these tactics and others, punishing them in the rankings for neglecting to develop content that would actually be of some interest to their users.
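Keyword stuffing in particular is easy to spot with a crude measure. The snippet below is an assumption-laden sketch, not anything Google has published: it simply computes how large a fraction of a page’s words a single keyword accounts for, the intuition being that an abnormally high density reads as spam to users and crawlers alike.

```python
import re

def keyword_density(text, keyword):
    """Fraction of the words in `text` that are exactly `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)
```

For example, a stuffed snippet like “cheap shoes cheap shoes buy cheap shoes now” gives a density of 0.375 for “cheap” – far above what natural writing produces, where even a page’s main topic word rarely exceeds a few per cent.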

The latest update to Penguin came just this month (October), with Matt Cutts explaining that around 0.3 per cent of English-language Google queries were likely to be affected. As ever, the search engine remained vague about the exact changes to the algorithm, but it is understood that web spammers are once again the main target.

Of course, the vast majority of sites avoid dodgy SEO tactics like link spamming and keyword stuffing, and so will see no difference in their rankings following this update. But this hasn’t stopped the same old worries developing among SEO experts and web developers.

Google has been tackling issues with poor quality content and websites almost since it began operating as a search engine. But 2012 has been the year when things have really started to come together, with Panda and Penguin affecting a significant proportion of search queries (nearly 12 per cent upon the launch of Panda and around three per cent for the original Penguin rollout).

This has only been possible as Google has started to hone its skills and gain a greater understanding of the way in which websites operate. Bear in mind the company was only incorporated as a privately held company in 1998 and the business of search is still a comparatively young industry. Things will continue to change and evolve but it feels to us like Google is taking more and more control over the situation, shunting useless sites away from its top pages and rewarding sites that invest in their content.

There is nothing wrong with Google’s goal of supporting websites that provide something genuinely useful and interesting for their users. The search engine has come out in favour of good SEO tactics and praises websites that are arranged in a way that makes them easy to crawl and rank, as well as sites packed with useful information for their users. But SEO can’t come at the expense of a website’s users – it needs to satisfy both the search engines and the visitors, and Google is making it increasingly clear that quality content is the best way of doing this.


