
The Hummingbird Effect

Kathleen Garvin

Google won’t stop surprising us.

At the end of September, Google announced the launch of its new Hummingbird algorithm. Traditionally, new algorithm launches cause alarm in the SEO community: everyone worries that their SEO efforts will be undone.

How can you adjust to the new algorithm? How has Hummingbird influenced work in different parts of the world?

Here is the first answer we received from Diego Ivo, CEO of Agência Conversion (www.conversion.com.br), a major SEO company in Brazil.

1. Did you face any problems after the Hummingbird launch?

I'd like to first explain something about the global Google algorithm and its differences in the local market where I work. Here in Brazil, we have more than 40 million people connected to the Internet, and Google holds 91% of the market share. Therefore, Google is almost omnipresent.

It's also important to emphasize that the impact of algorithm changes is not immediate. In Brazil, Google's updates do not usually arrive right away; we have to wait about one to three months to really see their effects. That's why we haven't yet seen a big impact from Google Hummingbird on website traffic, nor many complaints from webmasters, which are common whenever Google updates its algorithm. However, I see that many website owners are afraid, and I think they have good reason to be!

All Google updates have a greater impact on websites that use black or gray hat techniques, crossing the line of what is ethical in SEO. Other sites see no negative impact and can even gain traffic as their competitors fall in the SERPs. For the accounts we manage, I'm betting on the second scenario.

To give you an idea, whenever there is a significant update to Google's algorithm, especially one like Panda or Penguin, many sites lose a lot of traffic and our commercial team notices a huge increase in demand for SEO services. So when Google Hummingbird fully rolls out in Brazil, it seems likely that many sites will be affected.

2. What should you pay attention to when doing keyword research now?

Another peculiarity of SEO in Brazil is that we have very few sources of keyword data, because most existing tools don't serve the Brazilian market well. Around here, we have always worked with the Google Keyword Tool, SEMrush and a few other tools; we also relied on brainstorming and analyzing potential keywords based on personal experience from previous successful cases. Now that the Keyword Tool has "died" and the Keyword Planner doesn't help as much, SEMrush is one of the tools with the most extensive data, not only for keyword analysis but also for a website's total organic traffic. However, intelligence and teamwork still prevail.

I personally pay a lot of attention to a website's overall traffic and use the search volume for particular keywords as an important KPI. I have made heavy use of a Pareto-style calculation: if the exact-match search volume in SEMrush is 1,000,000, I assume it represents about 20% of the total demand once the possible long tails are included (5,000,000 in all). The breakdown looks like this:

[Figure: head vs. long-tail search volume distribution]

The 20/80 ratio is interesting as a principle, and it should be adapted to the peculiarities of each market, product and service. Even when it doesn't yield an exact number, it at least gives a good direction.
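The arithmetic behind Diego's estimate can be sketched in a few lines. This is a minimal illustration, assuming his rule of thumb that the head term accounts for about 20% of total demand; the 0.20 ratio is his heuristic, not a SEMrush figure, and should be tuned per market.

```python
def estimate_total_demand(head_volume: int, head_share: float = 0.20) -> dict:
    """Estimate total search demand from a head keyword's exact-match volume,
    assuming the head term represents `head_share` of all related searches."""
    total = head_volume / head_share          # head is 20% of everything
    long_tail = total - head_volume           # the remaining 80% is long tail
    return {"head": head_volume, "long_tail": int(long_tail), "total": int(total)}

print(estimate_total_demand(1_000_000))
# head 1,000,000 -> total 5,000,000, of which 4,000,000 is long tail
```

Adjusting `head_share` lets you model markets where the head term dominates more or less than the 20/80 split suggests.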

3. Will simple (one-word) keywords work any longer?

Yes. The problem is not optimizing for a given keyword; the problem is treating SEO as nothing more than keyword optimization and creating highly specific pages focused on individual keywords. The basis of Google's algorithm is still keyword matching, but the real change is in how it treats information, and especially in how it delivers results, which is getting smarter.

Google has always been a good search engine, but not everyone knew how to use it, and many SEOs took advantage of that gap by creating specific pages for certain keywords. However, with the improvement in semantics, Google will bring the answer even without the keyword being optimized on that page.

In my view, success with head-tail keywords is tied to good information architecture, which presents information in an organized, friendly way for both users and Google. The solution is not creating doorway pages, but content that is relevant for users. As I said, there's no problem having keywords on these pages, but it should feel natural. If it's not natural, simply don't optimize.

4. Will Hummingbird affect traffic from low-frequency (long-tail) keywords?

Many websites still do gray hat SEO. Google's goal is to find the answer on a page that is not necessarily focused on keywords. Pages should be focused on solving real problems for real people in an intelligent way, transmitting good values and delivering a good experience. And from my experience, I'm not talking about something technical, but something emotional, something that gets to the heart of people.

At the same time, there are plenty of opportunities. There is no problem with a page being mostly focused on one keyword, but what matters most is what goes beyond it. Don't repeat long-tail keywords; you may not even need to use them as exact phrases. Instead of repeating words, focus on synonyms that make sense and on working the content's semantics.

The content should have an emotional appeal, and it must be positive. Google's updates force SEOs to combine their technical knowledge and analytical capacity to reach the user's heart. And that is amazing for our market!

5. Google suggests paying more attention to content, but people steal unique content! Is there any way to protect yourself? Or do you just have to update the content periodically?

I don't worry about people stealing content from a site I work on, because we make sure we're the first to have that page indexed. And even if we weren't, Google is able to identify who copied the content.

Content has always been very important for SEO, but over time Google has used different standards to assess content quality. The problem is that some SEO professionals, rather than producing good content, discovered tricks to game those standards. The fresh-content rate was one such trick. I don't believe in tricks for SEO.

The most exciting thing about working in SEO is that there are no master standards, no single thing that always works. The SEO business means knowing that an algorithm evaluates your site and always will. It's not enough to be good at something; you also need to look good at it.

Author bio:

Diego Ivo is the CEO of Conversion (www.conversion.com.br), a major SEO company in Brazil. He has worked in SEO for seven years. Diego also created #OpenSEO, a community comprising a forum and a free online SEO course, which has helped thousands of professionals understand and implement SEO, thereby strengthening the Brazilian market.


Kathleen Garvin is an online editor and analytics wrangler for The Penny Hoarder, the largest personal finance blog. She's also a contributing writer for I Want Her Job and started a beginner-friendly digital marketing blog, The Maroon House. Kathleen is the former blog editor at SEMrush.