
Thoughts on the Recent Google Penguin 3.0 Update

Google has recently updated the Penguin algorithm. The rollout started on October 17th, and according to Google’s John Mueller it is still slowly rolling out weeks later. Many search engine specialists wonder whether Google is testing, refreshing data, or simply delaying the real update. Unfortunately, Google really isn’t transparent about updates, which leads to a lot of speculation among SEO professionals.

If you are new to SEO, you may not realize that the Penguin algorithm penalizes sites that have poor, spammy, or artificial links. If your site gets penalized, you have to analyze it, determine which links are bad, and add them to a disavow list that you submit to Google. Your site won’t get out of the penalty box until Google refreshes its backlink data.

Identifying Bad Backlinks

To remove a Penguin penalty or a manual penalty, you have to clean up your backlink profile. We try to identify the “bad,” “poor,” or “spammy” links, but Google doesn’t publish a precise definition of what it considers bad. So, as analysts, we make an educated guess at what Google would consider a bad link. The major sources of bad links are private blog networks, comment links from poor-quality blogs and forums, and low-quality directory links. Basically, any link that is created quickly and without editorial review is usually a bad link.
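As a rough illustration, that kind of heuristic triage can be sketched in a few lines of Python. The patterns and URLs below are entirely hypothetical examples of the "created quickly, no editorial review" signal; they are not anything Google publishes:

```python
import re

# Hypothetical red-flag patterns for link sources that rarely
# receive editorial review: comment pages, directory listings, etc.
SPAM_PATTERNS = [
    r"/comment",    # blog/forum comment links
    r"/directory",  # low-quality directory listings
]

def looks_spammy(url: str) -> bool:
    """Flag a backlink URL that matches any red-flag pattern."""
    return any(re.search(pattern, url) for pattern in SPAM_PATTERNS)

# Placeholder backlink URLs for the sake of the example.
links = [
    "http://example-blog.com/post/comment-123",
    "http://respected-news-site.com/article/seo",
]
flagged = [url for url in links if looks_spammy(url)]
```

In practice this is only a first pass; a human analyst still reviews each flagged link before it goes on a disavow list.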

Other bad links come from black hat SEO techniques. Many years ago, before the duplicate content penalty, spammers could duplicate your site’s content and sometimes rank for long-tail keywords. Hackers used automated cloning scripts and could clone sites at warp speed. You really couldn’t do a thing about them, as they were usually hosted on Chinese or Russian servers. Sometimes they would even outrank you, and you would have to file a DMCA notice and prove you owned your content. Google has done a good job in recent years of crediting websites with original content, so most site owners rarely have to file complaints these days.

Still, almost any site that has ever ranked in Google will have some of these cloned sites or cloned directories still linking back to its domain. It is just part of web spam; no one intentionally asks for those types of links.

Link Risk Management

The process of cleaning up your backlinks is called link risk management, and it is something most site owners have to implement routinely. On our larger sites, we analyze our backlinks a few times a month; on smaller sites, we check them only once every couple of months. When we audit our backlinks, we identify links that are toxic and likely causing us more harm than good. Once we identify those links, we attempt to remove them where possible. Whether we manage to remove them or not, these links go on a “disavow list” that we submit to Google through Webmaster Tools.
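For reference, a disavow file is a plain-text file with one entry per line: a full URL disavows a single page, a `domain:` prefix disavows every link from that domain, and lines starting with `#` are comments. The domains below are placeholders:

```text
# Contact attempts to the owner of spamdomain1.example failed,
# so we are disavowing the entire domain.
domain:spamdomain1.example

# A single spammy comment link we could not get removed.
http://spamdomain2.example/forum/comments?id=123
```

The file is then uploaded through the Disavow Links tool in Webmaster Tools.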

To help us separate good links from bad, we use specialized software. My personal opinion is that LRT’s Link Detox is the best software for link risk management. I’m not affiliated with them beyond recommending their product; I’m a member and have completed the training to become an LRT Associate.

Expectations of a Penguin Backlink Data Refresh

If Google actually refreshes the data and takes the disavow lists into account, then I expect some movement in rankings for sites that were hit by Penguin and submitted a disavow list. I don’t necessarily expect to be top 5 in every search; I expect some sites to go up and some to go down. I expect a number of keyword combinations to show much more diverse results than they do today, since some sites that haven’t been hit yet will get hit with the penalty, and some sites will recover. So even if my sites don’t actually recover, I still expect movement one way or another.

If rankings don’t change, then I have to believe Google has not processed the disavow lists.

So, do you think Google refreshed the backlink data?

No. I don’t think they have fully processed the backlink data, if at all. I know some SEOs are talking about sites recovering or getting penalized, but not many. This could be because Google updated the algorithm itself, so some sites got hit while others on the fringe of having too many bad backlinks were released from the penalty.

I am watching about 20 sites and have not seen any movement in rankings. I am also watching the top 20 listings for a number of two- and three-word keyword combinations on my largest sites, and I’m not seeing any shift there: no new pages breaking into the top 20, no pages dropping out. I actually saw more movement in the last two Panda updates.
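Detecting that kind of movement is just a matter of comparing snapshots of the top 20 over time. A minimal sketch, with placeholder domains and a hypothetical helper function:

```python
def ranking_changes(before: list, after: list) -> dict:
    """Compare two ordered top-20 snapshots and report which
    entries broke in and which dropped out."""
    return {
        "entered": [site for site in after if site not in before],
        "dropped": [site for site in before if site not in after],
    }

# Placeholder snapshots of a top listing on two dates.
before = ["site-a.example", "site-b.example", "site-c.example"]
after = ["site-a.example", "site-c.example", "site-d.example"]
changes = ranking_changes(before, after)
```

For the keywords I track, both lists stay empty week after week, which is why I conclude nothing has been refreshed.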

We are looking at sites with high, medium, and low traffic: local small-business sites, mid-level e-commerce stores, blogs, content-heavy pages, and sites with lots of images and video. Some of these sites have no disavow lists; on some we disavowed only a small percentage of total links; and on one site we disavowed 90% of the backlinks. No movement at all.

So what happens next?

I hope Google communicates with us and at least tells us whether they processed the disavow lists. If they did process them, then I’ll need more feedback to determine why we saw no change; I don’t believe this is the case, but if it is, then we make adjustments. It’s possible that they are still in the process of updating and we won’t see changes until it’s complete, or until they run the algorithm against the refreshed data. It’s possible they decided not to apply the disavow lists to the refreshed data. Google may also be having some kind of technical problem that is causing the delay.

The only thing left to do is wait. Again.
