The Google Panda algorithm should need no introduction, at least not to anyone who has been involved in organic SEO during the last year or two.
At SMX recently, Matt Cutts, head of Google’s web spam team, announced that Google would be releasing a refresh of the Google Panda algorithm over the next day or so (some have noted changes to the SERPs already, so it is possible that the Panda refresh has already gone live). Mr. Cutts also announced that, moving forward, the Panda algorithm will be part of the search giant’s ‘normal’ algorithm.
What Are The Implications of Panda in General?
Google Panda is somewhat notorious among webmasters, mainly due to the number of claimed ‘false positives’ that the ‘anti-web spam’ algorithm caused (whether all of those claims can be believed is another matter).
Originally dubbed the ‘farmer update’ by the SEO community (the name ‘Google Panda’ was given some time later by a Google announcement), the update primarily targeted sites with low on-page quality signals, including ‘content farms’ (such as article directories that included lots of scraped or duplicated content).
The algorithm was also noted to target sites with duplicate content in general, sites with too many ‘above the fold’ adverts, and those with other ‘low quality’ signals.
However, there were many claims of false positives, with sites that said they had never used ‘web spam’ techniques seemingly being penalized by the algorithm. One possible cause is sites that have duplicate content without realizing it (for example, internal search result pages or sort-parameter URL variants being indexed). And, as with any automated, algorithm-based ranking system, it is possible that some websites were ranked lower due to genuine errors.
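To illustrate how sort parameters can create accidental duplicates, here is a minimal Python sketch that collapses URL variants down to one canonical form. The parameter names (`sort`, `order`, `sessionid`) are hypothetical examples, not a definitive list; the same idea underlies the rel=canonical tags and Webmaster Tools parameter settings that webmasters typically use to address this.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical examples of parameters that change presentation, not content
IGNORED_PARAMS = {"sort", "order", "sessionid"}

def canonical_url(url):
    """Strip presentation-only query parameters so that duplicate
    variants of the same page map to a single canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

# Both sort variants collapse to the same canonical URL:
print(canonical_url("https://example.com/products?sort=price&page=2"))
print(canonical_url("https://example.com/products?sort=name&page=2"))
# Both print: https://example.com/products?page=2
```

If a crawler indexes every sort variant as a separate page, the site appears to host many copies of identical content, which is exactly the kind of signal Panda was built to detect.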
What Are The Implications of Google Panda Becoming Part of The Search Giant’s ‘Normal/Everyday’ Algorithm?
What this will mean in real terms for webmasters remains to be seen…
Some may say that with the Google Panda algorithm included in the ‘regular’ Google algorithm, there is a greater risk of such false positives, as the ‘anti-spam’ Panda will be crawling the web more often, and will therefore be finding more sites to demote.
However, there are also positives…
Quicker Google Panda Recoveries?
Traditionally, with algorithms that sit outside the ‘regular’ ranking algorithm, such as Panda and Penguin, a site demoted by a pass of one of them would have to first correct the issue, and then wait with bated breath for the next refresh of the algorithm that caused the headache.
With Panda being upgraded to be part of the regular, rolling ranking algorithm, this ‘should’ (in theory at least) mean that if a website gets demoted due to a Panda-related issue, then after tracking down and fixing the elements that caused it, recovery should be much quicker: possibly a matter of days, rather than waiting several weeks (or more) for the next Panda refresh.
What’s Next For Google Penguin?
Just as Google’s Panda algorithm seeks to weed out websites with low quality on-page signals, Google Penguin seeks out sites that have low quality off-page signals (e.g. links). Much like Panda, Penguin has traditionally been separate from the ‘regular’ Google ranking algorithm, with an initial ‘roll out’ followed by periodic ‘refreshes’.
Matt Cutts announced that, as well as the major Panda update and the algorithm’s inclusion in Google’s ‘normal’ ranking systems, there will be a major Penguin update later this year.
Will we soon see both Panda AND Penguin become a part of the ‘regular’ Google ranking algorithm?
Do you feel that this would be a good thing, or does it strike fear into your heart?
We want to hear your thoughts, so let us know in the comments section below!