Manual penalties: 5 things to help you speed up your recovery

Posted by James Newhouse on 3 Apr, 2014
So, why has Google changed the way it penalises sites it sees as “over-optimised”?

If your site has been hit by a Google penalty recently, the chances are it wasn't a Penguin penalty or even a Panda penalty. Long gone are the days when SEO forums hummed with the chatter of black and white animals.

At Receptional (we're a digital agency) we've noticed increasing numbers of site owners asking us for help with Google-imposed penalties. If you've been hit, you'll know that the effects of a manual penalty can be tough and recovery can be even tougher. Having helped several high-profile sites recover from manual penalties, we've pulled together the key lessons we learnt along the way.

Why so many manual penalties?

It seems likely that manual penalties are a by-product of the Penguin penalties that rolled out in 2012 and 2013. Many thousands of sites were affected by those updates, rankings fell, and to recover successfully, site owners were forced to disavow links. This process of disavowal provided Google with lots of information about link networks and other ways of linking that contravened Google's guidelines.

Once Google has identified a link network, it can manually penalise any site that benefits from the links that network provides but hasn't yet been hit by an algorithmic penalty - which is why we're now seeing so many manual penalties.

We recently helped a site with nine country-specific subdomains, each affected by a separate manual penalty. Google's message stated that the manual action devalued specific links rather than “rankings as a whole”, but the site's traffic levels begged to differ. Their business had been hit hard, despite a number of extremely high-quality editorial links within their backlink profile.

In total, the root domain had about 3 million unique backlinks that needed to be evaluated and, if necessary, removed - which gives you a sense of the scale of our task.

The recovery process

When working on any project, but particularly with sites of this size, we like to have a clear process. These are the steps to recovery that we mapped out at the start of the project:

  1. Create a list of all linking URLs pointing to the site. It is important to cross-check link data from Google's Webmaster Tools (WMT) with more extensive link intelligence databases, such as Majestic SEO.
  2. Manually assess each link to identify which ones to remove. We developed some tools that help us categorize links, so we can speed up the process (more on those later).
  3. Reach out to webmasters to get offending links taken down.
  4. Reach out for a second and third time to all the sites we did not hear back from and re-request removals.
  5. Create a disavow file for all links that were not removed after multiple attempts, or where people requested we pay for removals (the file format is sketched just after this list). Submit the disavow file to Google.
  6. Write a reconsideration request and submit it to Google through WMT.
  7. Wait (sometimes up to six weeks) to see if the reconsideration request has been successful.
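
To make step 5 concrete: the disavow file is a plain-text list in which a line like domain:example.com disavows every link from that domain, a bare URL disavows a single link, and lines starting with # are comments. The sketch below is a minimal illustration of building one from an audit spreadsheet - the link_audit.csv filename and its column names are assumptions for illustration, not our actual tooling.

```python
import csv
from urllib.parse import urlparse

# Hypothetical input: an export of the audit spreadsheet with columns
# "linking_url" and "action" ("disavow_domain", "disavow_url" or "keep").
AUDIT_FILE = "link_audit.csv"
OUTPUT_FILE = "disavow.txt"

entries = []
seen = set()

with open(AUDIT_FILE, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row["linking_url"].strip()
        if row["action"] == "disavow_domain":
            entry = "domain:" + urlparse(url).netloc
        elif row["action"] == "disavow_url":
            entry = url
        else:
            continue  # links we kept, or that were physically removed
        if entry not in seen:
            seen.add(entry)
            entries.append(entry)

with open(OUTPUT_FILE, "w", encoding="utf-8") as f:
    f.write("# Links not removed after three rounds of outreach\n")
    f.write("\n".join(entries) + "\n")

print(f"Wrote {len(entries)} disavow entries to {OUTPUT_FILE}")
```

The finished file is then uploaded through the disavow links tool in WMT.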

As an agency, it's important for us to follow the above process rigorously in order to maintain our success rate.

The only way to really understand the root cause of a penalty is to look at every single link. Really, penalty recovery is a mirror image of link building – you’d never automate your link building, so why would you automate your link purification?

We make sure we address every violation of Google's guidelines, not just the obvious ones. And it's here that experience is invaluable.

So what are the lessons we can learn?

1 - Impress the humans at Google

A manual action is, well, … erm, manual. So, although I'm not privy to the internal work patterns of Matt Cutts and Google's spam-fighting team, I assume there are humans involved.

It's reasonable to assume, too, that 'flags' pop up when Google's system identifies spammy patterns in a site's backlink profile. Once enough of these flags are raised, a member of the spam team will investigate the site. They will decide what form the penalty should take: its duration, its severity and the remedial actions required before it will be lifted.

 

 

It is a real person dealing with the review, so it's likely you'll be able to appeal to their human nature during the process. It's a good idea to show some contrition. Mention that you've realised the error of your ways and commit to keeping to Google's terms and conditions in future. And it will help your case if you can demonstrate a rigorous approach to removing offending links.

Successful re-inclusion requests needn't be as long as a three-volume novel, but they should provide information about the actions taken. In other words, it's important to show your workings: the way you present your case will be crucial to a successful reconsideration request.

2 - Get comprehensive link data

Google can see a lot of links, but they won't share all of these with you in Webmaster Tools.

For a successful recovery, it's helpful to have as much link data as possible. We use three main providers:

  1. Majestic SEO
  2. Moz, and of course...
  3. Google Webmaster Tools

Usually, Majestic contains more links than either WMT or Moz. In fact, in this case, Majestic was only 2% short of the combined unique total from Moz and WMT. When your link profile runs to 3 million unique links, however, that 2% equates to 60,000 links. A hefty number to miss out on, I think you'd agree.
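
Combining those sources is mostly a question of merging the exports and de-duplicating the linking URLs. Here is a minimal sketch of how that overlap can be counted; the filenames and column headers are assumptions, as each tool labels its export differently:

```python
import csv

# Assumed filenames and URL column headers for each export - adjust these to
# match the actual headers in your Majestic, Moz and WMT downloads.
EXPORTS = {
    "majestic.csv": "SourceURL",
    "moz.csv": "URL",
    "wmt.csv": "Links",
}

per_source = {}
for filename, column in EXPORTS.items():
    with open(filename, newline="", encoding="utf-8") as f:
        per_source[filename] = {
            row[column].strip() for row in csv.DictReader(f) if row.get(column)
        }

all_links = set().union(*per_source.values())

for filename, urls in per_source.items():
    others = set().union(*(u for name, u in per_source.items() if name != filename))
    print(f"{filename}: {len(urls)} linking URLs, {len(urls - others)} found nowhere else")

print(f"Combined unique linking URLs: {len(all_links)}")
```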

I also favour using Majestic’s historic index for manual penalties. This is because we don’t know how up-to-date Google’s WMT link map might be. So, it's best to be comprehensive.

Another great source of link data could be any link building reports that were generated by previous SEO agencies (or anyone else who may have contributed to the penalty).

3 - Use the right tools

While I'd stress the importance of manually reviewing every link that points at your site, there are a few tools we use to speed up the process.

First, we built a custom spreadsheet that makes collecting key data really easy.

If you don't already have access, I would suggest using the Majestic API – I use this to pull Citation Flow and Trust Flow metrics, as well as IP and subnet information, into our spreadsheet.
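
As a rough illustration of the kind of call involved, here is a sketch that pulls flow metrics for a batch of URLs. The command name, parameters and response structure reflect Majestic's JSON API as I understand it, so treat them as assumptions and check the current documentation before building on them; the IP and subnet data come from other commands in the same API, which I've left out for brevity.

```python
import requests

API_KEY = "YOUR_MAJESTIC_API_KEY"           # an API key from your Majestic account
API_URL = "https://api.majestic.com/api/json"

def fetch_flow_metrics(urls, datasource="historic"):
    """Pull Trust Flow and Citation Flow for a batch of linking URLs or domains."""
    params = {
        "app_api_key": API_KEY,
        "cmd": "GetIndexItemInfo",   # assumed command name - verify against the docs
        "datasource": datasource,    # "historic" casts the widest net; "fresh" is more recent
        "items": len(urls),
    }
    for i, url in enumerate(urls):
        params[f"item{i}"] = url

    response = requests.get(API_URL, params=params, timeout=30)
    response.raise_for_status()
    data = response.json()

    # Results are assumed to sit under DataTables -> Results -> Data in the JSON.
    return [
        {
            "url": item.get("Item"),
            "trust_flow": item.get("TrustFlow"),
            "citation_flow": item.get("CitationFlow"),
        }
        for item in data["DataTables"]["Results"]["Data"]
    ]

for row in fetch_flow_metrics(["example.com", "example.org"]):
    print(row)
```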

We use Screaming Frog to get server codes, page sizes and word count (and the number of outbound links, and page titles can be helpful too). This means we can quickly gain a good overall picture of each website. We look at the “title” column to pick out obviously spammy sites.


You can try using this Google Doc to check on the indexation status of a domain, subdomain or specific URL. Any domain that returns a 0 has been de-indexed by Google, so we know to disavow it. Simply make your own copy, and you can check up to 50 sites at a time.
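
If you would rather script that check than use the spreadsheet, one alternative (not the method behind the Google Doc) is Google's Custom Search JSON API. The sketch below assumes you have an API key and a Custom Search Engine configured to search the whole web; the free quota is small, so it suits spot checks rather than a three-million-link run.

```python
import requests

API_KEY = "YOUR_GOOGLE_API_KEY"  # assumed: a key with the Custom Search JSON API enabled
CSE_ID = "YOUR_CSE_ID"           # assumed: a Custom Search Engine set to search the whole web

def estimated_indexed_results(target):
    """Return Google's estimated result count for a site: query on the target domain or URL."""
    response = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CSE_ID, "q": f"site:{target}"},
        timeout=30,
    )
    response.raise_for_status()
    return int(response.json()["searchInformation"]["totalResults"])

for domain in ["example.com", "example.net"]:
    count = estimated_indexed_results(domain)
    status = "appears indexed" if count > 0 else "possibly de-indexed - candidate for disavowal"
    print(f"{domain}: {status} ({count} results)")
```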

By collecting all this data in one place, we find we can audit groups of links and make judgements much more quickly - for instance, it is possible to pick out dead social bookmarks, parked domains or websites that are no longer up and running by filtering for rows that return a 404, 410, 302 or 303 server code.

If you see a lot of sites on the same IP or subnet returning the same responses, with the same file size and word count, then you know they're most likely all in a similar state. You might even include the title and description fields from Screaming Frog so that you can quickly pick out those old-school directories that are cloned across hundreds of different domains.
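
Here is a sketch of those two checks in pandas, assuming the Screaming Frog crawl and the link metrics have already been merged into a single CSV. The filename and column names are illustrative rather than the ones our spreadsheet actually uses.

```python
import pandas as pd

# Assumed merged audit sheet (one row per linking URL).
df = pd.read_csv("link_audit.csv")

# Dead social bookmarks, parked domains and abandoned sites tend to surface
# as 404/410 responses or 302/303 redirects.
dead_or_parked = df[df["status_code"].isin([404, 410, 302, 303])]
print(f"{len(dead_or_parked)} linking pages look dead or parked")

# Clusters of sites on the same subnet returning identical responses, file
# sizes and word counts are usually clones from the same network.
clusters = (
    df.groupby(["subnet", "status_code", "file_size", "word_count"])
      .agg(sites=("url", "nunique"))
      .reset_index()
)
suspicious = clusters[clusters["sites"] >= 5]  # the threshold is an arbitrary starting point
print(suspicious.sort_values("sites", ascending=False).head(20))
```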

Having all this information readily available for the person that reviews your reconsideration request demonstrates technical expertise and a rigorous approach, which means your application is more likely to be a success.

4 - Presentation, presentation, presentation

We are frequently approached by clients who have submitted several unsuccessful reconsideration requests.

Often, their reconsideration requests have failed, not because the work wasn’t put in, but because they were presented badly.

As we’ve said before – this is a human process, so if you make a document vague or difficult to decipher, you’re unlikely to be given the benefit of the doubt, and the penalty will remain in place.

Make your document clear and readable – ideally, it should follow a logical order, and demonstrate evidence of your completion of each of the steps we’ve mentioned before.

5 - Look beyond the metrics 

Metrics are helpful, but they aren't everything – use them as a guide to help you decide which links need removing, but don't treat them as gospel; human judgement is needed too.

Simply removing or disavowing all links that don't hit a particular “trust” threshold is not only likely to fail, but also means there's a strong possibility you'll be removing valuable links that are helping your rankings.

Whilst some trust metrics are great for judging a linking URL's authority, they're certainly not a way of determining whether a link violates one or more of Google's guidelines. And you only need to remove links that don't meet Google's terms and conditions.

If you have paid for links on other sites without adding a nofollow attribute, you're contravening Google's terms. Buy enough links and it's likely you'll create a pattern of behaviour that Google, or your competitors, will be able to detect. Your paid links might be the reason you're being penalised, regardless of their PageRank, trust or authority metrics.

Physical removal

Physically removing spammy links is a stronger signal to Google about your repentance than simply disavowing them. It’s often difficult to persuade webmasters to remove links on their site, but we’ve found that co-operation levels are currently at an all-time high (don’t be blackmailed into paying for link removal).

When asking for links to be removed, we usually send three rounds of emails. Recently, we've been getting better response rates from our first and second mailings, but the third try also gets results, and we feel it's best to remove as many unwanted links as possible.

Success!

After reviewing 3 million unique backlinks for this client, all nine manual actions were revoked. Reconsideration requests usually take up to six weeks to process, but the turnaround here happened in just seven days. We have had a 100% success rate in removing manual penalties, and we believe that the quality of the reconsideration requests was a major factor in us getting the 'all clear' so quickly.

 
