Understanding and Avoiding Google Penalties
If you’ve ever spent any hands-on time working on or researching digital marketing and SEO (Search Engine Optimization) for your website, you may have learned about how Google can sometimes penalize websites for certain practices.
If this is news to you, then you’ll find this piece even more enlightening! Today we’ll answer the common questions about Google penalties, including specific types of penalties, what they are, how they are applied and more.
As we all know, penalties aren't handed out arbitrarily on the whims of some authority figure. Penalties are incurred for breaking rules. In our case, those rules are the Google Quality Guidelines.
You’d be surprised how few webmasters have actually read these guidelines, and even fewer business owners on the web have. Yet we are all pursuing top positions in Google’s search results. Following the guidelines is the key to a penalty-free website.
Some people break these rules by accident, or because they simply didn’t know about a certain guideline. Others purposely break certain guidelines because they believe they can achieve better results and still avoid a penalty. In any scenario, risking a Google penalty is a rather undesirable strategy.
Web Spam & Manual Penalties
There are actually two different types of penalties. The first type is a manual penalty, which we’ll discuss now. The other type is automated: these penalties come from Google’s algorithms, usually without any human interaction. We’ll discuss those in the next sections.
Headed by renowned Google staffer Matt Cutts, the Web Spam Team exists specifically to seek out spammers, search engine manipulators and rule-breakers. In doing so, Google protects the quality of its search results from those who seek to manufacture top positions through “black-hat” methods.
Manual penalties are often the harshest consequences, and can be severe enough that your website is completely removed from Google’s index. These penalties are applied by Google employees on the Web Spam Team.
If you’re caught in the Web Spam Team’s searchlight, they can apply harsh manual penalties to your website on the spot, followed by an inevitable sharp drop in rankings and traffic. This, of course, is something we never want to encounter. Simply following Google’s Guidelines is the best and easiest way to avoid them.
Now let’s talk about the automated penalties, namely the two most prevalent Google Algorithm Updates: Panda and Penguin. To be clear, Panda and Penguin are not penalties themselves; they are amendments to Google’s algorithm whose function is to apply penalties to websites that don’t meet expected standards.
While Google’s crawlers run through your site, Panda and Penguin are looking at certain aspects of your site, checking for red flags and other indicators that you may be engaging in practices (whether knowingly or not) that are counterproductive to Google’s high standard of quality in their search results.
These algorithms can apply negative weighting to your website, making it harder to achieve top positioning in Google search results. Let’s look at each of these animals a little more in-depth.
Panda is the on-page algorithm update and it focuses heavily on the content of your website.
One of the biggest flags Panda seeks out is duplicate content and web pages. It’s also keenly interested in rooting out sites with very thin content.
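To make the duplicate and thin content problem concrete, here is a minimal sketch of how you might audit your own pages before Panda does. The page data and the 150-word cutoff are purely illustrative assumptions; Google does not publish its actual thresholds.

```python
import hashlib

# Hypothetical page data: URL -> extracted body text (illustrative only).
pages = {
    "/about": "We sell widgets. Contact us for a quote.",
    "/about-copy": "We sell widgets. Contact us for a quote.",
    "/blog/guide": "A long, original article " * 50,
}

THIN_CONTENT_WORDS = 150  # assumed cutoff for "thin" pages, not Google's

def audit(pages):
    """Flag exact-duplicate pages (via a hash of normalized text)
    and pages whose word count falls below the thin-content cutoff."""
    seen = {}
    report = {"duplicate": [], "thin": []}
    for url, text in pages.items():
        normalized = " ".join(text.lower().split())
        digest = hashlib.sha256(normalized.encode()).hexdigest()
        if digest in seen:
            report["duplicate"].append((url, seen[digest]))
        else:
            seen[digest] = url
        if len(normalized.split()) < THIN_CONTENT_WORDS:
            report["thin"].append(url)
    return report

report = audit(pages)
```

A real audit would also catch near-duplicates (e.g. with shingling or similarity hashing), but even this exact-match pass surfaces the most obvious copies.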
Many of the hallmarks of Web Spam are targets for Panda, mostly manifesting in the form of ad-heavy pages with little or no text.
In many cases, spammers create tons of websites, filling pages with thin, keyword-rich content, plagiarized articles and a slew of advertisements.
Easy pickings for Panda, these types of practices can draw harsh penalties to your site’s trust, authority and ranking score, along with negative weighting in the search results. Another factor Panda looks for is a high ratio of ads to content, or large advertisements (usually banners) ‘above the fold’ near the top of the site.
Google wants users to find high quality content, and that’s usually not fulfilled on websites where visitors are met with huge splashy ads upon entry.
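As a rough self-check on the ad-heaviness signal described above, you could estimate how much of the above-the-fold area your ads occupy. The 30% warning threshold and the pixel figures below are assumptions for illustration; Google’s actual weighting is not public.

```python
# Illustrative heuristic only: the real signals Google uses are not public.

AD_RATIO_WARNING = 0.3  # assumed threshold: flag if ads exceed 30% of the fold

def ad_ratio(ad_pixels_above_fold, fold_pixels):
    """Fraction of the above-the-fold area taken up by ads."""
    return ad_pixels_above_fold / fold_pixels

def flag_page(ad_pixels_above_fold, fold_pixels=1000 * 600):
    """True if the page looks ad-heavy under this sketch's threshold.
    fold_pixels defaults to a hypothetical 1000x600 viewport."""
    return ad_ratio(ad_pixels_above_fold, fold_pixels) > AD_RATIO_WARNING
```

For example, a 1000x300 banner filling half of a 1000x600 viewport would trip this check, while a small sidebar unit would not.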
The foremost way to keep Panda at bay is to craft high quality, unique content on your website. Ensure that every page is a valuable resource in its own way and contains original text, tags and images. Avoid copying or even ‘spinning’ articles and keep advertisements to a minimum.
Here we have the other black-and-white beast of Google fame, the Penguin algorithm update. Unlike its counterpart, Penguin is largely focused on your off-site optimization efforts; namely backlinks (inbound links from external sites to yours).
Because webmasters and marketers know Google must rely heavily on ranking websites based on their backlinks, they have become a commodity of aggressive spamming and SEO manipulation. As the practice of spammy ‘link building’ has run rampant, Penguin was deployed as Google’s answer to keep it in check.
The SEO community has recognized that Penguin must still operate intelligently: Google can’t simply apply blanket penalties to every site with a single spammy link, so the algorithm must take many factors into account. This means it’s not just about poor-quality links.
For instance, Penguin may look for signals that you’ve got links from websites with low authority and trust factors, or it could flag certain links that you may have purchased.
Link exchanges, cyclical linking and other types of reciprocal linking are also flagged. Links from “bad neighborhood sites” or sites that are clearly not relevant to yours are equally suspect and could draw Penguin’s wrath.
Another signal may not involve the links themselves, but rather the number of links acquired over a short period of time. If your website typically gets a link or two each month, and it suddenly gathers hundreds of them from all over the place in a span of a few weeks, expect a visit from the flightless Antarctic bird.
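The link-velocity idea above can be sketched as a simple anomaly check: compare this month’s new backlinks against your site’s historical average. The monthly counts and the three-standard-deviation rule here are illustrative assumptions, not a description of Penguin’s real logic.

```python
from statistics import mean, pstdev

# Hypothetical monthly counts of newly acquired backlinks.
history = [2, 1, 3, 2, 1, 2]  # a typical slow-and-steady profile
this_month = 240              # a sudden, suspicious spike

def velocity_spike(history, current, sigma=3.0):
    """Flag the current month if it exceeds the historical mean
    by more than `sigma` standard deviations (illustrative rule)."""
    mu = mean(history)
    sd = pstdev(history) or 1.0  # avoid divide-by-zero on a flat history
    return (current - mu) / sd > sigma

flagged = velocity_spike(history, this_month)
```

A legitimate viral article can also produce a spike, which is one reason an algorithm like Penguin has to weigh many signals together rather than any single one.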
One interesting note about Penguin is that it doesn’t actually penalize your website in the same manner as Panda or manual penalties might. Penguin actually measures the quality and trust of your backlink profile and, if it warrants penalization, it simply devalues your backlinks. This will in turn result in your website having a lowered overall ranking potential in the search results.
No one wants to endure the consequences of Penguin penalization. Innumerable reports exist on the web from regretful webmasters and CEOs lamenting losses attributed solely to Penguin’s fury. Needless to say, we try to avoid it at all costs.
The best Penguin prevention plan involves very careful link acquisition practices. Avoid buying links, participating in link exchanges, manually creating spammy links and any other artificial or automated methods of building links.
Ideally, we want our backlinks to come naturally, as Google’s Guidelines dictate. We can nurture this process by creating great content, properly optimizing our websites and promoting through reputable channels.
While Penguin and Panda are the algorithmic all-stars in this show, the SEO community has picked up on a few other types of penalties, often lesser known with narrower impact scopes.
There are a handful of these outlier penalties, many of which simply don’t have a name. We can expect there are many more still unknown, eluding us all behind Google’s curtain.
Some of these are one-off manual punishments doled out for a specific spammy offense, such as a penalty for cloaking or for using a ‘doorway page’.
Others aren’t aimed at a site’s practices at all but instead tackle an algorithmic problem in the search results, such as the ‘domain crowding’ update, which addressed cases where the same website was taking up multiple spots in the results. Technically not a penalty, but for pages losing spots near the top, it still hurts.
We may occasionally come across an unknown penalty, unannounced by Google yet confirmed through consensus of groups of webmasters. One such example was the “Phantom” update back in May of 2013. The details surrounding this mysterious update are still unknown, leaving only our own studies to shed light on the Phantom.
Staying apprised of Google Algorithm updates and the Quality Guidelines is key. One essential tool we like to use to remain vigilant when Google quietly rolls out new penalties is MozCast, a regularly updated ‘weather report’ of sorts that helps flag occasions when Google may be up to something.
Considering all we’ve discussed, the takeaway is a sobering realization that Google will fiercely protect the quality of its search results. The best we can do is to produce websites and content in which Google is willing to invest, and the traffic will follow.