A Google penalty is when your site fails to meet Google’s webmaster quality guidelines and as a result is punished by ranking lower in the search results or being removed from the results altogether.

There are different types of Google penalty, and there are various things you can do to avoid getting them, as well as to recover if your site has been hit by one. This post will explain all.

Algorithmic Penalties vs Manual Actions

Google penalties can be split into two categories: algorithmic penalties and manual actions. While algorithmic penalties are a byproduct of the algorithms Google uses to select search results, manual actions are carried out by actual people who work at Google.

Algorithmic penalties

Google decides which search results to show to users using a series of complex algorithms. These algorithms take into account a wide range of different factors in order to determine the quality of a site and how relevant it is to a particular search query.

While the exact details of Google’s algorithms and the individual ranking factors haven’t been made public, we do know that the algorithms favour sites that follow the basic principles of Google’s quality guidelines – namely, pages that are informative, helpful and easy for people to use and that don’t employ deceptive and spammy tactics to openly try and manipulate search engines.

Therefore, if your website doesn’t follow all the quality guidelines, it will likely be penalised by Google’s algorithms.

Over the past decade or so there have been a few big algorithm updates that have targeted specific “black hat” (dodgy) SEO trends…

Panda (2011)

Google’s Panda algorithm update was designed to target low quality and thin content on websites. Part of the motivation behind the update was the increasing number of online “content farms” at the time: sites that churned out tonnes of low-quality blog posts and pieces of content, often on a wide range of different subjects, in order to try and rank in Google. Instead, the new algorithm favoured quality, well researched, unique web content from sites that were an authority on a certain subject.

Avoiding/recovering from Panda issues:

  • Remove any thin content from your site that doesn’t add value (a quick word-count audit, like the sketch below, can help you find candidates for review)
  • Make sure the spelling and grammar of your content are tip-top
  • Avoid content that is generic and derivative
  • Fill your site with good-quality content that is interesting and/or useful to searchers
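
If you’re not sure where to start, a rough content audit can help. The Python sketch below is only a starting point: the sitemap URL is a placeholder for your own, the word-count threshold is arbitrary, and the tag-stripping is deliberately crude. It fetches every page listed in your XML sitemap and flags any that look suspiciously short.

```python
import re
from urllib.request import urlopen
from xml.etree import ElementTree

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder: your own sitemap
WORD_THRESHOLD = 300                                  # arbitrary cut-off for flagging pages

def visible_word_count(html: str) -> int:
    """Very rough word count: strip scripts, styles and tags, then count words."""
    html = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return len(re.findall(r"\w+", text))

sitemap = ElementTree.fromstring(urlopen(SITEMAP_URL).read())
urls = [loc.text for loc in sitemap.iter("{http://www.sitemaps.org/schemas/sitemap/0.9}loc")]

for url in urls:
    words = visible_word_count(urlopen(url).read().decode("utf-8", errors="replace"))
    if words < WORD_THRESHOLD:
        print(f"Possibly thin: {url} ({words} words)")
```

Word count on its own doesn’t equal thin content, of course – a short page can still be genuinely useful – so treat the output as a list of pages to review, not to delete automatically.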

HTTPS/SSL Update (2014)

Remember when all websites started changing from http to https? This was all part of a move to make websites safer and more secure for users, and in 2014 Google officially announced that it was going to start using https as a ranking signal. Therefore, since then, sites that do not have an SSL certificate (and are therefore only on http) are less likely to rank in Google.

Avoiding/recovering from HTTPS/SSL issues:

  • Make sure your site has a valid SSL certificate! (A quick way to check is sketched below.)
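
If you want to double-check your own certificate, a minimal sketch like this one (Python standard library only; swap in your own domain for the placeholder) will fail if the certificate doesn’t validate and will tell you when it expires:

```python
import socket
import ssl
from datetime import datetime

def check_certificate(hostname: str, port: int = 443) -> None:
    """Open a TLS connection (which fails if the certificate is invalid) and report its expiry."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    days_left = (expires - datetime.utcnow()).days
    print(f"{hostname}: certificate expires {expires:%Y-%m-%d} ({days_left} days left)")

check_certificate("example.com")  # placeholder: your own domain
```

It’s also worth making sure that every http URL redirects to its https equivalent, so that users and search engines always end up on the secure version of the page.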

Mobilegeddon (2015)

“Mobilegeddon” is the nickname given to Google’s Mobile Update in 2015, which meant that websites optimised for mobile were favoured in the rankings, while sites that weren’t faced penalisation. With Google now moving to the mobile-first index, it remains as important as ever for SEO to make sure that your website is mobile-friendly.

Avoiding/recovering from Mobile issues:

  • Use responsive design so your pages adapt to any screen size
  • Test your pages with Google’s Mobile-Friendly Test and keep an eye on the mobile usability reports in Search Console (a rough programmatic check is sketched below)
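
One very rough programmatic check is whether a page declares a responsive viewport at all. It’s nowhere near a full mobile-friendliness test, but it catches the most obvious cases. A quick sketch (the URL is a placeholder, and the string match is deliberately simple):

```python
from urllib.request import Request, urlopen

def has_viewport_meta(url: str) -> bool:
    """Rough check: does the page declare a viewport meta tag at all?"""
    req = Request(url, headers={"User-Agent": "Mozilla/5.0 (mobile-check sketch)"})
    html = urlopen(req, timeout=10).read().decode("utf-8", errors="replace").lower()
    return 'name="viewport"' in html or "name='viewport'" in html

print(has_viewport_meta("https://www.example.com/"))  # placeholder URL
```

A missing viewport tag almost always means the page isn’t mobile-friendly; a present one is only a starting point, so still test the page properly.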

Penguin 4.0 (2016)

Penguin 4.0 was arguably the most impactful of Google’s Penguin updates to date. Penguin has always targeted spammy link building, but the main difference with Penguin 4.0 is that links are now analysed in real time. Previously, Google would only analyse inbound links to a site from time to time, which often resulted in algorithmic penalties that couldn’t be counteracted for a while; since 4.0, your site can recover much more quickly once bad links are removed. Penguin 4.0 also shifted the focus to devaluing the offending links themselves rather than penalising the whole site, although websites can still incur penalties for large volumes of spammy links.

Avoiding/recovering from Penguin issues:

  • Make sure the links you build to your site always come from high-quality, relevant websites
  • Carry out regular link audits to check the links coming into your site, and submit a disavow file to Google to counteract any spammy links you can’t get removed (the file format is sketched below)
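
For reference, the disavow file is just a plain text file: one domain or URL per line, with optional comments. Here’s a minimal sketch for generating one from the results of a link audit – the domains and URL below are made-up examples, not real sites:

```python
# Turn the output of a link audit into a disavow file in the format Google's tool expects.
spammy_domains = ["spammy-directory.example", "paid-links.example"]   # hypothetical examples
bad_urls = ["http://blog.example/comment-spam-page.html"]             # hypothetical example

lines = ["# Disavow file generated from link audit"]
lines += [f"domain:{d}" for d in spammy_domains]   # disavow every link from these domains
lines += bad_urls                                  # disavow individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

You then upload the file through the disavow links tool in Search Console. Only disavow links you genuinely can’t get removed – disavowing good links can do more harm than good.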

Manual actions

Manual actions are actions taken against a site by a human reviewer at Google, when they have deemed that a page or a whole site is in violation of Google’s webmaster quality guidelines. The page or pages in violation, or indeed the whole website, will be ranked lower or omitted from the search results pages.

Website owners or webmasters will be notified of any manual actions in Google Search Console. Unlike with algorithmic penalties, which can be solved by making changes to your website, to resolve a manual action you need to request a reconsideration review from Google (after you’ve rectified the issues, of course). Below, we’ve listed the different types of manual actions you can incur and how to recover from them.

Unnatural links to or from your site

As we mentioned above, unnatural links can result in algorithmic penalties, but they can also catch the attention of reviewers at Google and result in a manual action. For inbound links, again you should try to get any spammy links removed by contacting the owners of the sites linking to you and, if this fails, using the disavow links tool. But it’s not just about inbound links: if your site is linking out to a suspicious number of irrelevant sites, this can also cause a manual action, because it suggests you may be participating in a link scheme, which violates Google’s webmaster guidelines. You need to either remove these links, add a rel=”nofollow” attribute to them, or redirect them through a page that is blocked by robots.txt.
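
If you want a rough idea of which outbound links on a page pass PageRank, a small sketch like this (standard library only; the URL and domain are placeholders) lists external links that don’t carry a nofollow attribute:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

class OutboundLinkChecker(HTMLParser):
    """Collect external <a> links that are not marked rel="nofollow"."""
    def __init__(self, own_domain: str):
        super().__init__()
        self.own_domain = own_domain
        self.followed_external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "") or ""
        host = urlparse(href).netloc
        rel = (attrs.get("rel") or "").lower()
        if host and self.own_domain not in host and "nofollow" not in rel:
            self.followed_external.append(href)

url = "https://www.example.com/links-page"   # placeholder: a page you want to audit
checker = OutboundLinkChecker("example.com")  # placeholder: your own domain
checker.feed(urlopen(url).read().decode("utf-8", errors="replace"))
print("External links without nofollow:", checker.followed_external)
```

Plenty of followed outbound links are perfectly natural – the point is to spot pages where you’re linking out to dozens of irrelevant sites without realising it.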

Pure spam or user-generated spam

Spam is always going to be a red flag for Google, whether it’s generated by a site owner or by site users. Pure spam can involve scraped content, cloaking (see more on this below), automatically generated gibberish, and other content that violates Google’s quality guidelines. User-generated spam tends to occur on user profiles, guestbook pages and forum pages. It tends to come in the form of adverts, gibberish text and irrelevant links, and often comes from users with fake, commercial-sounding names. If you have a manual action for any kind of spam, you need to identify and remove it, then request a review of the manual action by Google.

Thin/low-value content

Again, having low-quality, thin content on your site can mean you incur algorithmic penalties, but in some cases it can also result in manual actions. Specifically, content that is scraped, automatically generated or pulled in from affiliate programs can be picked up by Google and result in a manual action, as can doorway pages that aim to funnel users to the same place. To recover from this type of manual action, you’ll need to review all the content on your site and make sure that it is all relevant and provides value to users. Usually, the content types mentioned above do not – if this is the case, remove them.

Incorrect use of structured data

Google has a set of guidelines to ensure that structured data is used properly to mark up correct and relevant information on a site. If structured data is misused this can result in a manual action: namely if it is misleading and provides information that differs from that on the actual web page. This could be, for example, a company marked up as a product, a promotion marked up as an event, or incorrect details about a job posting. You can find a full breakdown of the different structured data issues in Google’s manual actions report documentation, under “Structured data issue”. If you receive this kind of manual action, you’ll need to remove or rectify your structured data markup and then request a review from Google.
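
A quick way to see exactly what structured data a page exposes is to pull out its JSON-LD blocks and compare them against the visible content. The sketch below is a rough helper, not a validator: it assumes you’ve saved the page’s HTML to a local file (the filename is a placeholder) and it only handles JSON-LD, not microdata or RDFa.

```python
import json
from html.parser import HTMLParser

class JSONLDExtractor(HTMLParser):
    """Pull out <script type="application/ld+json"> blocks so you can sanity-check them."""
    def __init__(self):
        super().__init__()
        self.in_jsonld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        # Assumes the type attribute is written exactly as "application/ld+json"
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self.in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_jsonld = False

    def handle_data(self, data):
        if self.in_jsonld and data.strip():
            self.blocks.append(json.loads(data))

html = open("page.html").read()   # placeholder: a saved copy of the page
extractor = JSONLDExtractor()
extractor.feed(html)
for block in extractor.blocks:
    if isinstance(block, dict):
        print(block.get("@type"), "-", block.get("name"))
```

Google’s own Rich Results Test will do a far more thorough job on a single URL, but a quick script like this is handy when you need to eyeball the markup across lots of pages at once.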

Cloaking and sneaky redirects

These kinds of manual actions involve deliberate attempts to show different content to search engines than to users. This is in breach of Google’s guidelines because it prevents Google from assessing the page as it appears to users, and therefore manipulates and skews the algorithm. Cloaking means serving content to search engines that is not visible to users. Cloaked images, for example, are images that are obscured by another image or a block of text, or are only served to Google. Similarly, hidden text – text the same colour as the background or set to size 0 so that it cannot be seen by users – can result in a manual action. Sneaky redirects likewise try to serve users different content to what Google sees, by redirecting users to a different page from the one they’re expecting.

Be aware of this if you have a separate mobile version of your site as well. Make sure that there is not too much difference between your desktop pages and the mobile or AMP versions of your site, as the mobile page may otherwise not be as relevant to the user as the Google search result suggests.

If you have a manual action due to one of these issues, you need to go through your site and identify any cloaked images, hidden text or redirects that may be misleading, before requesting that the manual action be reviewed.

Keyword stuffing

A lot of SEO is based around keywords, as websites are always trying to show up in the SERPs (search engine results pages) for particular keywords. As long as you understand how search engines work, so that your keywords are present in your title and h1 tags and you write natural, relevant content about your product or service, you will be fine. However, some people try to manipulate search engines by forcing large volumes of keywords into their page text. This might be serious over-repetition of a certain keyword, or a spammy list of lots of different keywords in the website’s source code or alt text, for example. Either way, Google can easily tell you are trying to manipulate search results and can penalise you with a manual action. To rectify this, simply remove all the spammy keywords and get the action reviewed.
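
There’s no official “safe” keyword density, but a quick frequency check makes serious over-repetition obvious. A small sketch (it assumes you’ve saved the page copy as plain text; the filename is a placeholder):

```python
import re
from collections import Counter

def keyword_density(text: str, top_n: int = 10):
    """Rough check: how much of the copy each frequent word takes up."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words)
    return [(word, count, round(100 * count / total, 1))
            for word, count in counts.most_common(top_n)]

page_text = open("page.txt").read()   # placeholder: a plain-text copy of your page copy
for word, count, pct in keyword_density(page_text):
    print(f"{word}: {count} times ({pct}% of all words)")
```

If one non-trivial keyword dominates the top of that list, the copy probably reads as unnatural to users as it does to Google.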

You can view Google’s official definition of each individual manual action, and the steps you should take to rectify it, in the manual actions report documentation.

Author Biography

Rachel


Rachel has been working at Kumo since the start of 2018 and is a Search Engine Marketer and Head of Content. She makes sure all Kumo's clients always have fresh, engaging content that helps their websites rank and attract links.

She is also fluent in German, which comes in handy for any German websites that need content and SEO work!