Why Google Isn’t Giving Update Advice Anymore

John Warner
Please note this post is published under “Opinion” category and reflects the personal views of the author. If you disagree or have an opinion you would like to offer, feel free to discuss in comments!

Before I get into it, I have to admit that this is, essentially, a conspiracy theory (though most inference from Google's behavior has to be), and the title of this piece could easily have been 'The possible reason Google is probably not, most of the time, giving advice about a large proportion of reasonably large updates'.

I want to believe

This is my personal Area 51 mystery (though I would say it is probably easier to penetrate the intentions of the US government’s Groom Lake base than those of Google), so I will concede immediately that this theory of mine (which I have been boring my colleagues at Click Consult with for a while) may not be true now – but it will almost certainly be true in the next few years. My theory? No-one knows what they are talking about.

Machine Learning

Over the last decade, machine learning has come on in leaps and bounds, and the big tech companies have played a large part in that progress, not least by throwing money at it. But probably the most influential driver of the industry's rapid progress has been the adoption of the neural network method of machine learning — the method that played a part in IBM's Watson winning at Jeopardy and in Google identifying cats in unlabeled videos (harder than it sounds).

For a long time, it was assumed that AI, robots or your personal sci-fi silicon intelligence of choice would be programmed from the ground up by humans: computer scientists, or puckish hacker geniuses who just won't play by the rules. However, as with most things, what held us back was us. Even the towering intellects attempting to fully pre-program artificial intelligence could not produce more than narrow intelligence – things which are little more than tools.

Neural networks, the computation method loosely based on the networks in animal brains, have allowed us to get out of our own way – enabling machines to learn through rapid iteration. Neural networks have no need for human teachers; they are a hodgepodge of interlinking pathways that do a job – and while we can judge how well they do that job (as, in my conspiracy, would be the role of Google’s ‘Quality Raters’), we are incapable, for the most part, of describing exactly how the job was done. As you can see in the diagram below – we can control the input, we can observe the output, but the actual computation is hidden.

[Diagram: a neural network – inputs and outputs visible, the hidden-layer computation concealed]

To make an entirely too simple analogy, it is as though we can place eggs, flour, milk, sugar, and butter into a box, and from that box take a cake some moments later – and while we can infer the process that led from the ingredients to the cake, we have no way of knowing it for certain.
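To make the black-box point concrete, here is a minimal sketch of a neural network forward pass (the layer sizes and random weights are arbitrary illustrations, not anything Google uses): we pick the inputs, we read the output, but the computation in between is just a mass of learned numbers that no one hand-authored.

```python
import random

random.seed(0)

# A toy two-layer network: we control the inputs and can observe the output,
# but the "reasoning" lives in learned weights no human wrote or can easily read.
def make_layer(n_in, n_out):
    return [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]

def forward(layer, vec):
    # Weighted sum per unit, passed through a ReLU activation.
    return [max(0.0, sum(w * x for w, x in zip(row, vec))) for row in layer]

hidden = make_layer(4, 8)   # 4 inputs -> 8 hidden units
output = make_layer(8, 1)   # 8 hidden units -> 1 score

signals = [0.7, 0.1, 0.9, 0.3]                        # input we control
score = forward(output, forward(hidden, signals))[0]  # output we observe
print(score)  # a single number; the path from input to output is opaque
```

Even in this four-line network, explaining *why* the score came out as it did means untangling every weight; scale that to millions of parameters and the engineers are inferring from the outside, just as we are.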

A video by a favorite YouTuber of mine, CGP Grey, makes a great primer on machine learning. All of this, however, has been leading up to the concept at the heart of this conspiracy theory of mine.

Google Patents

If any of you follow the work of Bill Slawski – a former lawyer turned SEO and the go-to authority on Google patents – you will have watched the proliferation of patents granted to Google over the last five years that deal specifically with mathematical inference from user interactions and automated quality assessment. This includes patents such as the following on using CTR and other user interactions to improve search results.

In general, the subject matter described in this specification can be embodied in a computer-implemented method that includes determining a measure of relevance for a document result within a context of a search query for which the document result is returned, the determining being based on a first number in relation to a second number, the first number corresponding to longer views of the document result, and the second number corresponding to at least shorter views of the document result; and outputting the measure of relevance to a ranking engine for ranking of search results, including the document result, for a new search corresponding to the search query. The subject matter described in this specification can also be embodied in various corresponding computer program products, apparatus and systems.
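Read literally, the excerpt describes a relevance measure based on a ratio of 'longer views' of a result to its views overall. A hedged sketch of how such a measure might be computed (the function and variable names are my own, not from the patent):

```python
# Hypothetical reading of the patent excerpt above: relevance as the share of
# "longer views" (users who dwelled on the result) among all recorded views.
def relevance_measure(long_views: int, short_views: int) -> float:
    total = long_views + short_views
    if total == 0:
        return 0.0  # no interaction data, no signal
    return long_views / total

# A result most users dwell on scores higher than one most users bounce from.
print(relevance_measure(80, 20))  # 0.8
print(relevance_measure(10, 90))  # 0.1
```

The point is not the arithmetic, which is trivial, but that such a measure needs no human-stated rules about content at all – it is inferred entirely from user behavior, which is exactly the kind of signal a learning system can consume.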

This, combined with patents outlining the creation of ontologies and entity detection to determine expertise and authority (something which may well have played a part in the apparent increase in the weighting of brand in a few of the latest updates), provides Google with all the tools necessary to let a machine learning algorithm loose on the SERPs.

Whether or not it is the case now, it is inevitable that machine learning will at some point determine the results people see, and that the weighting of the potentially thousands of ranking factors at its disposal will vary so wildly and so inconsistently across industries that it will be impossible for anyone to explain the exact methodology for determining rankings. Updates will come so regularly that even Barry Schwartz won’t be able to put out articles fast enough. The black box of the Google algorithm either is, or soon will be, a black box even to the engineers working on it.

What This Means for SEO

The long and short of it is in the title. While Google has never been overly talkative about its algorithm updates, there was a time when we would at least get advice on 'thin and poor quality content' or 'unnatural link profiles'. And while we may still see announcements when a new factor is added to the mix or a weighting is manually altered – as with the shift to mobile or the speed update – the day-to-day running of the algorithm, and its near continuous refinement, will be increasingly opaque.

As such, the near fanatical obsession many SEOs have with pinpointing, naming or defining fluctuations in the SERPs will be increasingly futile (though that is not to dismiss analysis – group analysis will always be valuable, just not the race to spot and name a new update). Instead, we are going to need to pursue ever more finely tuned strategies for each individual industry – something which may lead to the further specialization of some agencies and an increasing trend toward large brands developing in-house teams.

The positive to take from this is that, while SEO may become more challenging to do well, it will ensure that a good SEO is always in demand.

What This Means for SEO Techniques

The focus of the last few updates has, with reliable consistency, been pinned to the quality of a site – with 'expertise, authority, trust' a good trinity to look to for the future of the industry. As such, there are things which will need to change, things which will become more or less important, and a host of differences between industries – but a few things will be consistent in search, among them:

1. Links Will Always be Important – But Where They Are From Even More So

The construction or establishment of ontologies is a phrase you will no doubt have come across – or will do with increasing regularity – and links will play a part in this. The definition of an ontology is:

[A] set of concepts and categories in a subject area or domain that shows their properties and the relations between them.

In this regard, links will be the connective tissue between these categories and concepts, and you will need to ensure that the brands you work for or with are established within existing industry ontologies and help to mold them. For this reason, the importance of where a link is placed will need to be calculated in a new way: while previously the authority of a site was paramount (often approximated by DA), this will become secondary to the relevance of the linking domain.

Let's say you have a car parts auction website: its content is expertly written, its anchor text beautifully targeted, and its authority significant. If a link to the site is placed on a site that is predominantly about baking, it will do less for the brand's place in the industry's ontology than a link from a site half as well written and half as authoritative, but focused entirely on car parts. The calculation of link relevance will begin to take in the relevance of the entire linking domain, not just the paragraph surrounding the link or the anchor text itself.
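To illustrate the shift, here is a purely hypothetical scoring function (the weights and the 0–1 scales are my own invention, not any known metric) in which the linking domain's topical relevance outweighs its raw authority:

```python
# Illustrative only: a made-up link-value score in which domain-level topical
# relevance carries more weight than raw authority, per the argument above.
def link_value(domain_authority: float, topical_relevance: float,
               relevance_weight: float = 0.7) -> float:
    """Both inputs are assumed to be on a 0..1 scale; the 0.7 weight is arbitrary."""
    return (relevance_weight * topical_relevance
            + (1 - relevance_weight) * domain_authority)

# The authoritative baking blog vs. the smaller, entirely on-topic car parts site.
baking_blog = link_value(domain_authority=0.9, topical_relevance=0.05)
car_parts_site = link_value(domain_authority=0.45, topical_relevance=1.0)
print(baking_blog, car_parts_site)  # the less authoritative, on-topic site wins
```

Under any weighting like this, chasing high-DA placements on off-topic domains becomes a losing strategy compared with earning links from within the relevant ontology.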

2. Your Expertise Will Always Be Important – So Create Expert Entities

Whether you choose to treat the overall brand, or individual employees, products or services, as the entity will depend on the brand, though I think a combination of all of them will likely be most successful. These entities will provide the signals your site communicates regarding your brand's expertise and authority, which means you will need to focus on building relationships with industry publications and the industry at large.

By sharing knowledge and opinion – not just on your own site, but on those important to consumers and other members of the industry – you can build a network of brand or name occurrences that establish you as holding the desired authority to rank well for various search terms. Allow your staff members to build their profile in the industry, ensure your content carries a byline and, in turn, they can pass their authority on to your site.

[Diagram: entity creation]

However, as stated, this is not just applicable to individuals – an entity is simply something you can inform Google about and offer 'proof' of. As such, you can build entities of anything within reason and establish their relationships to your brand and the industry.

3. Structured Data Will Become More Important While Machines are Learning

We know schema is important – while it is not essential at the moment, its inclusion is invariably helpful, and it will become increasingly important (until, at some point, as with rel=prev/next, we realize it is no longer being used).

The reason for this is that, as things stand, this kind of machine-readable information is incredibly helpful. While the algorithm will get smarter, it is going to rely on structured data for some time to help it parse the information we place online. For that reason, it will be important to ensure your brand is keeping up to date with industry appropriate mark-up. While the advice has always been to write for people rather than robots, the truth is that we need to do both. 

The ability of the Google algorithm to understand natural language far exceeds what it was capable of even after the roll-out of Hummingbird, but it is still imperfect – so it is important to use 'em' for emphasis and 'strong' for importance, to indicate which pieces of content are written to be spoken, and which content is an article and which a product description. While none of these is a ranking factor on its own, or likely to improve rankings in isolation, the cumulative effect of making your content more easily parsed by the algorithm certainly can improve them.

The vocabulary schema.org has developed for communicating a wide variety of information clearly to machines is already extensive, and it is growing all the time. Every effort should be made to ensure you and your site are up to date with the latest schema available in the industry in which you operate, as well as with HTML best practice.
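As a concrete example, this is roughly what schema.org Article markup looks like as JSON-LD, one of the structured-data formats Google accepts (built here with Python's json module purely for illustration; every field value is a placeholder):

```python
import json

# A sketch of schema.org Article markup as JSON-LD. The property names
# (@context, @type, headline, author, datePublished) are standard schema.org
# vocabulary; the values are placeholders, not real publication data.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example Headline",
    "author": {"@type": "Person", "name": "Example Author"},
    "publisher": {"@type": "Organization", "name": "Example Publisher"},
    "datePublished": "2019-01-01",
}

# This string would be embedded in the page inside a
# <script type="application/ld+json"> ... </script> tag.
jsonld = json.dumps(article, indent=2)
print(jsonld)
```

The value of declaring the type explicitly is that the algorithm does not have to infer whether the page is an article, a product or a recipe – you have told it, in a vocabulary it already knows.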

In Summary

While the call may or may not be coming from inside the house, and while it may or may not have been an inside job, what we can say for certain is that the Google algorithm either is, or will at some point in the near future be, virtually incomprehensible except by inference from the job it is doing. Google may be able to steer it – and this is why the QRG should be a manual for digital marketers – but Google will not be able to offer advice about improving your rankings; that will be up to digital marketers, who will need to analyze and compare notes to determine the best strategies for brands and their industries.

There are a limited number of techniques that will be viable across the whole gamut of industries, however, and among them are the establishment of ontologies, the creation of entities and the correct implementation of structured data. Everything else will need to be bespoke to at least each industry, if not to each client.

[This piece was adapted from a BrightonSEO talk given on the 12th April, you can find the slides that accompanied it on my Slideshare]

John Warner

John is an internal marketer at Click Consult where he spends his time accruing industry certifications, tinkering with code, plotting strategy and writing articles on all aspects of search marketing.

Despite having a master's degree in creative writing and a background predominantly in content, John has found himself drawn to the technical side of SEO – developing a passion for data, research, interactivity, artificial intelligence and voice search.

John occasionally appears on the blogs of various digital and marketing sites, including regular articles on the Click Consult blog and contributions to the quarterly search marketing magazine 'Benchmark'; he speaks at various search and digital marketing conferences and occasionally hosted the on.click podcast (and will again at some point – time permitting).
Comments

Hamza Ali
Maybe I'm in the future, compared to when this got published. But this seems plausible...
Google is a girl you need to share time with to discover her secrets, and when you think you know all about her, then she comes up with something like, Hey Fred! meet my boyfriend!
Matthew Brown
That sounds entirely reasonable as an extrapolation of current and past technology. I would not be surprised if I learned that Google has been using AI in some way for quite a while already.
Igor Kholkin
I don't think there's anything tinfoil about it. AI is not self-reporting, and Google's engineers don't know the algorithm in its entirety themselves. SEO is as much about testing & research as it's ever been.
Honestly, I'm just happy to read something by someone who can write a well thought-out article with logical structure and decent reasoning. You sound like you have your head screwed on pretty well.

This alone makes me happy. Thanks. I was starting to give up, a little bit.
John Rampton
Good points here.
Great piece, love the intro hook haha.
Nick Samuel
"Updates will come so regularly that even Barry Schwartz won’t be able to put out articles fast enough."

I wouldn't be so sure, Barry is already a MACHINE :-P
John Warner
Nick Samuel
You might be right, there. Has anyone checked he's not a Nexus 6?
Tony
Nick Samuel
lol so true. He was the first living algorithm ;)
John, I think you’re right. As more and more of the algorithm is written by AI, Google will know less and less about what’s behind the rankings other than “what people want.” Your conclusion that we need to “compare notes” is spot-on.
informative
John Warner
Umar Shah
Thanks Umar.
Anthony Godley
Above anything, experience should be at the heart of the approach. Provide the user with the best experience possible and over time the rank improvements will follow. And of course, John is spot on. Structured Data & Link Acquisition rule the waves :)
John Warner
Anthony Godley
I think that's Google's aim for the SERP – though I'm not sure it would correlate exactly with the sites they'll use for rich results, which are likely to be those simply best tailored for each variety of rich result.

However, I was looking through Statista data the other day and it was a little concerning how low on the priority list SEO was for the surveyed marketers - hopefully it means those who address the technical SEO requirements of the web will be in a better position.
Pardeep Kumar
John, I agree with you. Authoritative link profile and high-quality content are the fundamentals of SEO and no one can rank without creating authoritative content.
John Warner
Pardeep Kumar
Thanks Kumar!
Nick Samuel
John Warner
...and a good technical foundation which is easy to crawl ;-)
You are right! Anyway we will rule the SERPs. ;) :P
John Warner
Elias Lange
One can only hope!
