Google Penguin 4.0 Update - The Good, The Bad and The Ugly

Anastasia Sidko
Last Friday we shared some big news with SEOs all over the world: Google has officially announced the release of its Penguin Update 4.0. Penguin now becomes part of the core Google algorithm and runs in real time.

Before we see what impact these changes will have on the search industry, let’s take a look back.

The world first heard about the Penguin Update in April 2012. Its primary purpose was to deal with websites that had acquired links unnaturally in order to boost their rankings. After Google rolled out the update, which affected about 3.1 percent of queries, plenty of websites incurred a Google penalty and dropped out of search results. The rules of the game changed completely, and a whole new era in link building began.

Recovering from a Google penalty has always taken a long time, and the new version of Penguin that Google rolled out on September 23 brought about some significant changes. “Historically, the list of sites affected by Penguin was periodically refreshed at the same time. Now Penguin is real-time, so changes will be visible much faster, typically taking effect shortly after we recrawl and reindex a page,” wrote Gary Illyes on the Google Webmaster blog.

Let’s review the key features of Penguin 4.0 and see how they will affect the work of SEO specialists:

  1. Penguin 4.0 was released for all countries and languages at the same time. This means that all websites will experience the same impact independent of their location; the rules are the same for everyone.

  2. Penguin is now real-time. This means any page’s rankings may change each time Google updates its information about that page and about the pages linking to it. Both the positive and negative impacts of your actions will therefore become noticeable faster.

  3. The real-time algorithm allows you to recover from a Google penalty faster. If your site is penalized and your webmaster does a good job improving it and removing harmful links, you’ll see positive results sooner, right after Google recrawls your website’s pages.

  4. From now on, Penguin is part of Google’s core algorithm, not a separate filter. That’s why this announcement will be the last official one about the update; going forward, you’ll need to continuously monitor your backlink profile.

  5. Penguin is becoming more granular. Simply put, the algorithm now focuses on the specific pages toxic links point to, rather than on whole websites. Such micro-penalties are more difficult to track, which is why it is now essential to regularly audit all important subdomains and specific pages of your website.

How to benefit from Penguin 4.0?

One of the reasons Penguin is so dangerous is that you are never informed when your website is penalized. You can only infer a penalty from indirect signs, like a sharp drop in organic traffic or declining positions for some (or all) of your keywords, including branded ones.

Because the updated algorithm lets you see the results of your actions quickly, it can be a double-edged sword: the recovery process becomes faster, but so does penalization. Here is what you can do to protect your website from unexpected drops in rankings:

  • Constantly monitor your positions in SERPs. Set up regular email alerts to stay informed of any position changes. Make sure you’re tracking not only keywords your homepage is ranking for, but also keywords for all significant subdivisions. Tom Clark suggests "comparing keyword rankings and investigating which keywords have taken a sudden drop in their position."

  • Keep your backlink profile healthy and clean. Make a habit of regularly checking new and lost backlinks, their value and quality, and handling the suspicious ones.

  • Conduct a deep backlink audit of your entire website. Because the new Penguin takes a granular approach, it is essential to audit backlinks for each important subdomain and subdivision of your website (for example, conduct a separate audit for each language version of your site).

  • Set up properties in Google Search Console for each important subdomain or subpath and update your disavow file as needed. You can use the SEMrush Backlink Audit tool to easily and properly format disavow files.
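The monitoring advice above can be sketched as a small script that compares two ranking snapshots and flags sudden drops. This is only an illustration: the keywords, positions, and ten-position threshold are invented, and in practice you would feed in exports from whatever rank tracker you use.

```python
# Flag keywords whose position fell sharply between two ranking snapshots.
# A simultaneous drop across several keywords can be an indirect sign of a
# Penguin-style devaluation, since Google sends no notice for algorithmic hits.

def sudden_drops(previous, current, threshold=10):
    """Return keywords that fell by more than `threshold` positions."""
    drops = {}
    for keyword, old_pos in previous.items():
        new_pos = current.get(keyword)
        if new_pos is not None and new_pos - old_pos > threshold:
            drops[keyword] = (old_pos, new_pos)
    return drops

# Hypothetical snapshots (e.g. exported weekly from a rank-tracking tool):
last_week = {"blue widgets": 3, "buy widgets": 5, "widget repair": 8}
this_week = {"blue widgets": 4, "buy widgets": 41, "widget repair": 9}

print(sudden_drops(last_week, this_week))
# "buy widgets" fell from 5 to 41 -- worth auditing its backlinks
```

A real setup would run this on a schedule and email the result, per the alerting advice above.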

The Penguin Update and the future of link building

In general, Penguin’s “real-time” nature implies that you can recover from a penalty more quickly. The new update lets you run various experiments with a reduced cost of error – as long as you are competent and careful.

However, this could in theory lead to savvy SEOs trying to cheat the system and manipulate SERP results again – for example, by artificially acquiring backlinks for specific subdomains or URLs and observing Google’s reaction. We asked some experts to share their opinions on whether the new Penguin update will lead to a new wave of grey-hat optimization.

Barry Schwartz

I suspect the real-time nature will make it hard for spammers to know if they got hit by Panda, Penguin, or something else. So I am not sure if it will make it easier or harder. Time will tell.

Cory Collins

In short, no. I do not believe that Penguin moving to the core algorithm and running in “real-time” will lead to more grey-hat optimization, for two reasons:

1. Real-time isn’t instantaneous.

Even though Google uses the language “real-time,” there will still be a lag before the results of newly secured links show up. In theory, an update only happens once Google has crawled both your page and the page linking to your site.

2. Google stated this update made Penguin “more granular”.

The exact language Google used to describe the granular nature of Penguin: “Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site.”

Some SEOs have theorized that the new Penguin algorithm might simply ignore spam links, rather than enforce a punitive measure. The Penguin algorithm would affect pages more than sites, by Google’s statement. This would largely defeat negative SEO, but also further obfuscate the effect of grey hat links designed to test and manipulate Penguin.

In short, I believe it will be extremely difficult to actually measure and understand the effect of manipulative links on the Penguin algorithm. Google tends to get better over time, and I believe that this new update will continue that trend.

Sha Menz

I think there are a couple of things to be concerned about with this new implementation of Penguin.

Google has effectively given the power to experiment (and perhaps to manipulate the algorithm) back to those who have the ability to spend time and resources on it. I see two key groups who could be likely to take advantage of this in ways that may be detrimental, both to competitors and to the general quality of the SERPs.

First, those with a penchant for pushing experimental limits in order to gain insights on the workings of Google’s core algorithm. For these people unknowns are now able to be tested, limits pushed, and SERPs impacted once again.

Perhaps more concerning is the specter of giving malicious operators the ability to really test the effectiveness of negative SEO efforts. While Google’s stated aim has always been to ensure the Penguin algorithm is immune to negative SEO, initial signs are not great. The first client to contact me after the release of the new Penguin algorithm was a business owner who had been watching massive numbers of spammy links built to his site for months, ignoring them on the advice of those who believed “Google is generally very good at identifying negative SEO”. By twenty-four hours after release, his site had taken a significant hit in visibility.

While I’m pleased to finally have a more responsive Penguin on the block, there is much to watch out for as all of this shakes out. In the meantime, we’ll continue to focus on helping those dealing with manual actions, for whom nothing changes without hard work and attention to detail.

Glenn Gabe

We’re learning more and more each day with Penguin, which is awesome. I said the other day that disavowing would probably not be needed anymore, and I was right! Gary confirmed that yesterday. And then he explained that previous Penguin penalties (suppression) were being removed over the next few days. That’s HUGE news.

Anyway, I’m going to hold off until more data comes in. I’ve been heavily analyzing many domains that were previously impacted. I think the next week will be very interesting.

Ale Agostini

Although it is a bit early to evaluate this news, there are a couple of interesting aspects: Penguin becomes part of Google’s core algorithm and therefore a ranking factor. If that’s true, it means that as soon as someone tries to fool Google, the algorithm will immediately adjust the rankings. At Bruce Clay we are analyzing the actual impact, and we will have more data and analysis in late October.

Rick Lomas

I don’t think anything Google has ever done has stopped SEOs from finding ways to cheat the system! But yes, I can see how Penguin 4.0 could be an advantage for SEOs experimenting with artificial links. Everyone has been talking about this since Penguin 4.0 was announced, but it is too early to tell if this is really the case.

Grey-hat SEOs will certainly have to adapt their processes, black-hat SEOs too. It’s like drug use in athletics; unless you are superhuman, you know that to win you may have to artificially boost your performance, but if you get caught, your career could be over forever. The more worrying thing for me is that the granularity of Penguin 4.0 might make it easier for negative SEOs to take over SERPs for one particular keyword or micro-niche without detection.

5 common mistakes made when disavowing links

I’ve already pointed out that because Penguin 4.0 is real-time, it is absolutely essential to regularly monitor your backlinks and get rid of dangerous ones. We analyzed 30,000 campaigns created in the SEMrush Backlink Audit tool to see how SEOs disavow links and to identify their most common mistakes.

21 percent of the disavow files we researched do not comply with Google Search Console’s rules. Let’s look at the most frequent mistakes:


1) Not adding “domain:” in front of the domain name when disavowing links from an entire domain

Google only recognizes lines that begin with http or domain:. If you write a bare domain name without the prefix, Google will ignore the line and won’t disavow anything.

Correct: domain: followed by the domain name


2) Using quotation marks at the beginning of a line

Again, Google will simply ignore any line that begins with anything except http or domain:.

Incorrect: a domain or URL wrapped in quotation marks

3) Incorrect file formatting

Some tools use an incorrect line separator, which can merge multi-line data into a single line. Google will then be unable to parse the file correctly and apply the disavow rules.

4) Including only URLs in the disavow file

Sometimes a link from a single article also appears on other pages of the same website – say, pages featuring the top posts of the month, posts by category, etc. So if you want to ensure Google ignores the connection between your website and a specific site entirely, disavowing a particular URL is not enough; you need to disavow the entire domain.

5) Adding comments without using “#”

Comments in your disavow file are not obligatory. However, if you decide to provide additional information, start each comment line with “#” so it doesn’t prevent Google from parsing the file.
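Putting these rules together, a well-formed disavow file looks something like this (the domains below are placeholders for illustration, not examples from the study):

```text
# Spammy directory network, webmaster contacted with no response
domain:spammy-directory.example

# One bad page on a site we otherwise trust
http://other-site.example/bad-page.html
```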

After uploading your disavow file, always check to see how many lines Google recognized correctly. If the number doesn’t match the number of lines in your request, check the file for compliance with GSC’s rules.
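As a rough pre-check before uploading, the rules above can be encoded in a short script that flags lines Google would ignore. This is a sketch only – the sample lines are invented, and Google Search Console's own recognized-line count remains the authoritative check:

```python
# Pre-check a disavow file against the formatting rules described above:
# every non-blank line must start with "#" (comment), "http"/"https" (URL),
# or "domain:" (whole-domain rule); anything else is silently ignored by Google.

def check_disavow_lines(lines):
    """Split lines into (valid, ignored) per the rules above."""
    valid, ignored = [], []
    for raw in lines:
        line = raw.strip()
        if not line or line.startswith("#"):   # blank lines and comments are fine
            valid.append(raw)
        elif line.startswith(("http://", "https://", "domain:")):
            valid.append(raw)
        else:                                  # e.g. quoted lines or bare domains
            ignored.append(raw)
    return valid, ignored

# Invented sample illustrating mistakes 1 and 2 from the list above:
sample = [
    "# links from a spammy directory network",
    "domain:spammy-directory.example",
    "https://other-site.example/bad-page.html",
    '"quoted-line.example"',        # begins with a quotation mark
    "bare-domain.example",          # missing the domain: prefix
]
valid, ignored = check_disavow_lines(sample)
print(len(valid), "valid,", len(ignored), "would be ignored")
```

If the counts here don't match what GSC reports after upload, revisit the file's formatting.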

We recommend that you start auditing your backlink profile as soon as possible and get used to doing it regularly.

SEMrush Backlink Audit Tool


Anastasia Sidko, Content Manager at SEMrush. I have four years of experience in content creation and public relations. My areas of interest include SEO, digital marketing and content marketing.
Great Share - I need to read this blog regularly for better SEO insights.
Lauren Jaxson
My site is losing traffic, but I didn't get hit by Google Penguin 4.0, and I haven't been able to find a solution.
What could be the reason for the drop in Google traffic?
Also, if it really is a Penguin 4.0 penalty, how can I recover?
Lauren Jaxson
Hi Lauren

If you had been hit with an actual penalty, you would have gotten a manual warning in your GWT account.

A drop in traffic can happen for several reasons. If you're not relying on PPC and depend completely on SEO, I think you should look into the following areas:

1. Keyword rankings
2. Analyze the competition
3. Detox bad links
4. Check for important 404s and 301 them

The steps above should help you regain traffic. Also, if it's a small dip I wouldn't worry much, as it can be seasonal; I would concentrate more on building up content and planning new marketing strategies.

Hope this helps - have a nice day ahead - do let me know the results!
My sincere thanks to the author of this post.
VERY Helpful!
Hey there,
Thanks for updating all of us with this important update.
Mobil Mindz
Thanks for keeping us updated and giving a full description of Google Penguin and its effects.
Ritesh Jha
Great points for improving SERP rankings soon... thanks a lot for sharing. I want to increase backlinks for my website; can anyone suggest what I need to do?
Lê Thụy
I have one site whose keyword rankings decreased markedly after the last update, although there are no spam messages in GWT, and there is still no sign of recovery. Could it have been affected by Penguin 4.0? Sorry, I don't know much English!
Hopefully the real-time algorithm will help us improve websites faster and more efficiently.
Mike Claggett
Some SEOs are saying this new update will penalize a site with syndicated posts on branded Web 2.0 blogs that have backlinks to a money site. Any truth to that?
Mike Claggett
Well Mike, it shouldn't be a problem as long as the Web 2.0 blog isn't dodgy and the content is comprehensible! :)
Fingers crossed that this update will be a good solution for negative SEO.
Hi everyone, I'm back with a question; sorry, I don't really have anyone else to ask.
In Google Search Console I set up my website as http://. The problem I see now is that the Fetch feature is not working for me, giving me just a "Redirected" notice, because I later switched to https. Is this hurting my website's ability to be crawled properly? And is this why my submitted disavow file has been in limbo for 1.5 months now?
A solution I'm considering is to create a new Analytics account, set it up as https://, and embed the new code on my site. Any advice on my questions?
Marc Co
Hi, you'll need to add a new website to Google Webmaster Tools - essentially your site, but with HTTPS in front of it. It shouldn't affect rankings at all.
Paul Davies
Great to see my 'Go To' backlink detox expert Rick Lomas featured here on SemRush, The most helpful and knowledgeable guy I have come across yet ! Thanks for the article @Anastasia Sidko.
Wow, one has to be very careful to avoid being penalized.
Great post! Thanks a lot for the info. I had this issue with a particular keyword: we were on the first page one day and nowhere to be seen the next, but it's all good now. On another note, we had been ranking on page 1 for a specific keyword for a considerable length of time and all of a sudden it dropped. After analyzing meticulously, I found out that a competitor had spammed some links, which I have since disavowed; it's been more than 3 weeks now with no luck, and I'm not sure if the site was affected by the Penguin update or something else. Is anyone facing the same issue?
I am facing the same thing. I had been ranking in 1st position for as long as 2 years; now I can't even find the brand term!
Can anyone help with that?
Samer Riad
Samer, it's an inconsequential fluctuation and will stabilize in a few weeks; that's what I have figured out. Meanwhile, I would recommend detoxing toxic links and building a strategy for high-PR links. On another note, I would encourage you to update your Google My Business details; this will help you get listed in the local 3-pack. I know everyone is talking about the Penguin 4.0 update, but Google has also recently updated its algorithm for local listings! Hope this helps.
Thanks, Aryan, for your kind feedback. The problem is not a simple fluctuation: I am totally penalized and my website is demoted. Not only that, my website was removed from its Google My Business page (which was already kept up to date).
What do you mean by the local 3-pack?
Thanks again.
Samer Riad
If you'd been penalized, you would get an actual warning from Google in GWT. As for the local 3-pack, look up the Possum update to get the gist of it.
Savannah Zhu
Excellent read! One thing to add: Gary Illyes from Google said Google now manages to devalue spam instead of demoting the site. Historically, the search engine penalized sites where it detected bad links by demoting their rankings in the SERP; now, unofficially according to Google, it only devalues – that is to say, ignores – the spam without intentionally dropping the site's ranking. But we still have to disavow, as we did before: for brands that have or had unnatural backlinks, you still need to submit a disavow file, even if it makes little difference because Google devalues/ignores those links anyway. Think about it this way: your site's ranking either goes up or down, and if Google devalues the spam links that were created to increase your ranking, your ranking will drop naturally. Our blog has some more thoughts about demotes vs. devalues and the new update's impact on SEOs.
Louise Briggs
Really interesting development. My site seems OK for now. I have noticed in my SEMrush account that some keywords are now on page 1 of Google rather than their previous pages 5-6. One keyword with 4,400 monthly searches is now 7th in Google, rather than on page 5 where it had been for many months. Looking at the new SERPs for that keyword, I can understand why other websites have been dropped (spam backlinks). Just by that one keyword getting to page one we have increased our quote requests from around 15 a day to around 40 a day. If this keeps up I'll be buying a new house next summer :D lol. Have to keep an eye on this over the coming weeks. Thanks SEMrush, your site and tools are awesome!! :) xxx
Umer Idrisi
Great discussion here! Good information
I uploaded a disavow file in the first week of August and it was reflected in SEMrush within 3 to 5 days, showing as nf. Then I had to update the disavow file in the first week of September, and a month later it is still showing no result from my update :( I have checked and concluded that my file is correct, but it has fallen into the limbo period others describe as "it may take weeks or even months!" #HulkRaaage!!!
Anastasia Sidko
Marc Co
Hi, Marc! As far as I know, although the algorithm is called “real-time”, it actually refreshes the data only after Google has crawled all the relevant pages - both on your site and on the referring sites - which is why it can take that long.
Marc Co
Marc, you can go into Search Console and submit a URL for indexing, under Crawl > Fetch as Google. Hopefully that should do the trick - I have done the same.
Hi Aryan, now I'm seeing another problem. I set up my Search Console years ago as http://, then later switched to https://, so on the Fetch page I'm only seeing a "Redirected" warning/notice. Is this affecting me big time? Should I create another Analytics account, set it up as https://, and embed the new Analytics code?
Marc Co
Marc, let's break this down step by step:

1. You have already disavowed the links that point to http:// - correct?
2. Further, for all the http:// URLs you have created 301s to https:// - correct?

If the answer to both questions is yes, then don't worry: your disavow list should be executed, or might have been already.

As for creating a new account and adding a new snippet - that's the wrong approach. You need to use the Change of Address tool; you can find the details here:

Further, if the links are disavowed, you will see them in SEMrush under lost links. Hope this helps!

I think I should make a video on this, as I have seen many clients facing this issue. huh :)
Thanks to SEMrush for the Backlink audit tool. I'm already using it.
Anastasia Sidko
Great to hear it! :)
Abdul Sami
Thanks for making it more clear for me :)
Anastasia Sidko
Abdul Sami
Glad it was useful for you, Abdul :)
Penguin 4.0 is slowly rolling out, and it's real-time, which affects a site's web pages directly: each time a page is crawled or indexed, its evaluation is refreshed. Thanks for sharing about the Penguin 4.0 update. Hope to get more interesting content from you to read in the future.
Anastasia Sidko
Unlimited Exposure
Thank you for your nice feedback!
Anastasia Sidko
You're welcome
As Barry said, it will be difficult to find out what caused your rankings or traffic to drop. But the good part is that you don't have to wait until the next release of the Penguin algorithm to recover, if the drop was caused by Penguin.
Anastasia Sidko
That's true, and with this real-time algorithm you can also correct and adjust your strategy immediately rather than continuing to move in the wrong direction.
Anastasia Sidko
Nina - first and foremost, avoid getting into spamming schemes - it's a dead end! From my experience, the major things that can affect your site's rankings are toxic and money links. Further, the Possum algorithm update has made it much easier for local businesses to get into the local 3-pack.
