This post will address additional questions from my recent SEMrush webinar, Data Mining with SEMrush: Unlocking SEMrush for the Serious SEO'er.
As a refresher, the webinar was all about using data to answer big SEO questions. A typical question might be: What is the whole competitor landscape like for a particular site? Another might be: What is the aggregated backlink profile like for 20 different domains?
Of course, you’re going to have your own questions that need answering, and this webinar was all about using a tool we’ve developed called 90 DataGrabber. This tool makes it possible to query the Majestic and SEMrush APIs and export the data into Excel.
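I won't reproduce the real SEMrush or Majestic API calls here, but the query-and-export pattern a tool like this automates can be sketched in a few lines of Python. The endpoint URL and key parameter below are placeholders, and the ';' field delimiter is an assumption for illustration; check the API documentation for the actual report format.

```python
import csv
import urllib.request

def parse_rows(text: str) -> list:
    # Split a delimited API response into rows of fields. The ';' delimiter
    # is an assumption here, not a documented contract.
    return [line.split(";") for line in text.splitlines()]

def export_api_rows(url: str, outfile: str) -> None:
    """Fetch delimited rows from an API endpoint and save them as a CSV
    file that Excel can open directly."""
    with urllib.request.urlopen(url) as resp:
        rows = parse_rows(resp.read().decode("utf-8"))
    with open(outfile, "w", newline="") as f:
        csv.writer(f).writerows(rows)

# Hypothetical usage -- the URL and key are placeholders, not a real endpoint:
# export_api_rows("https://api.example.com/report?key=YOUR_KEY", "report.csv")
```

The point is simply that once the response is parsed into rows, writing it out for Excel is trivial; the work is in knowing which reports to pull.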
In the webinar, I ended up talking quite a lot about engagement affecting rankings, which leads me to the first question.
1. Do a greater number of links and extra engagement correlate to higher visibility in search results?
Yes. But I’m sure you want more of an answer than that…
The way I see it, links are so easily gamed that it doesn’t make sense for Google to rely on them the way it used to. When we talk about links, it’s probably better to think about PageRank because, ultimately, the aim of acquiring certain links is to increase your rank.
When I talk about click-through rate, I’m talking about the number of clicks a search result gets for the position it is in, relative to the other search results in their positions.
In other words, position 4 might usually get a 6% click-through rate. If there’s something remarkable or relevant about it, maybe the search result in position 4 will get a 12% click-through rate: that is, 100% more click-throughs than expected. Google then concludes something important is going on and rewards these positive signals with an increase in rankings.
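The arithmetic above can be made concrete with a short sketch. The baseline CTRs per position are made-up example numbers for illustration, not figures from Google or the webinar.

```python
# Hypothetical expected click-through rates by search position.
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.06, 5: 0.04}

def ctr_uplift(position: int, impressions: int, clicks: int) -> float:
    """Return actual CTR relative to the expected CTR for that position.

    1.0 means the result performs exactly as expected; 2.0 means it gets
    twice the expected click-throughs (the "+100%" case in the text).
    """
    actual_ctr = clicks / impressions
    return actual_ctr / EXPECTED_CTR[position]

# A result in position 4 getting a 12% CTR instead of the expected 6%:
print(ctr_uplift(position=4, impressions=1000, clicks=120))  # prints 2.0
```

In this framing, anything consistently above 1.0 for its position is the kind of positive signal the article describes.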
The other part is engagement. Google will have a pretty good understanding of bounce rates based on its own search result pages: it can see whether someone who does a search bounces in and out of a number of results, or drops into one result and doesn't come back out again.
Anyhow, back to the narrative.
Sites acquire PageRank and get put up for audition; in this case, an audition is a ranking on a number of phrases. If a site gets above-average user engagement for the search position it is in, that's seen as a positive signal and it will rise in the rankings until it finds its "sweet spot."
The sweet spot is where a search result gets the expected click-through rate and engagement levels for that search result's position.
A good example is brand phrases. A brand's site gets a far higher click-through rate than the other results on a search for its brand name, and in the end Google will reward that site with a six-pack search result. In extreme cases, you get the six pack plus a number of other results below it from the same domain, although that is very unusual.
Simply, Google is saying you're getting the vast majority of click throughs and engagement; therefore, they're going to give you the most prominence on this particular search result.
In theory, if you ranked for a generic phrase and got 90% of the click-throughs, you would end up with a six-pack search result. This is why exact match domains can do this kind of thing.
If a site gets a big influx of PageRank and gets bumped up the search results (i.e. it gets auditioned) but doesn't get the click-through rate and engagement, it's going to fall down again. This may explain why spam sites are now having a much tougher time: they get the PageRank right, but they fall over on engagement.
Of course, you can re-audition your website by getting another influx of PageRank, but it's going to be tough the next time around. This is why fresh websites seem to rank so easily: they don't have a history. Google says, "We don't know who you are; let's give you a go." But if you do have a history, Google says, "You've got history and it's not great; you have to show that people love you through a big influx of PageRank."
The reason I think this is such a great way for Google to go is because it’s so hard to fake hundreds of thousands of user engagements over time. Experiments have shown you can affect rankings on a keyword by keyword basis by artificially stimulating click through rate and engagement, but that’s like keeping a website on a life-support machine.
Ultimately, this is a great thing for the Internet because Google is finally incentivizing site owners to create engaging and useful content, rather than link spamming.
2. Can you speak to local optimization?
Local SEO is a bit of a blind spot for me, because I've generally concentrated on competitive verticals, which are international. That said, if you've read the answer about engagement being a ranking metric, local search follows the same dynamic as anything else: if you have something people want to click on, and you have some decent links going into the site, it will rank for its local phrases.
Obviously, with local, there’s a lot of housekeeping to do — getting on local directories, registering it with Google and so on. As you can see, it’s really not an area I know much about.
3. Would nofollow links still be a factor of Google ranking? For example, nofollow links from press releases.
It's a subject I've debated a lot, and the answer isn't clear-cut.
I do think nofollow links help, but not because they're links. I think it's because they're seen as citations, and if you read Google's 2014 quality rater guidelines, they make a number of references to third-party citations of a given website being important trust signals.
I just think nofollow links are part of the citation landscape and are a signal Google uses to determine how much trust it gives a given domain.
Then we're into the question of where the citations should be placed. Wikipedia is a good example. If a site is mentioned in Wikipedia, it has to be a strong trust signal because Wikipedia is so heavily curated.
You mention press releases. The first question is: How trustworthy is the source of the citation?
It’s very easy to fall into that lazy cycle of using PR Web or other outlets where they imply that tens of thousands of journalists are eagerly waiting for your junk press release.
If you were Google, would you trust that press release as a useful citation source? I would say no. That's why I don't do any press release distribution; it's just filling the Internet up with more useless stuff.
Personally, I don’t bother with trying to chase nofollow links. I would rather concentrate on acquiring PageRank for a domain, and make sure the domain is very engaging.
4. What is TF1+?
TF1+ is trust flow of one or more. Trust flow is a metric used by Majestic that correlates extremely well with a site's visibility in search. Among the SEO community it's seen as a core metric, much like the Google PageRank shown on toolbars eight years ago.
If a site has a trust flow of:
- 0, it's probably junk.
- 15 or more, it's probably an OK website.
- 25 or more, it has moderate authority.
- 35 or more, it's a strong website, e.g. a popular blog.
- 45 or more, you're into strong editorial websites.
- 55 or more, you're into big established websites with a lot of authority.
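The bands above are easy to encode as a quick triage helper. The labels here are the article's rough characterizations, not categories defined by Majestic.

```python
def trust_flow_band(tf: int) -> str:
    """Map a Majestic Trust Flow score to the rough bands described above."""
    if tf >= 55:
        return "big established website"
    if tf >= 45:
        return "strong editorial website"
    if tf >= 35:
        return "strong website (e.g. a popular blog)"
    if tf >= 25:
        return "moderate authority"
    if tf >= 15:
        return "OK website"
    return "probably junk"

print(trust_flow_band(28))  # prints "moderate authority"
```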
Essentially, Majestic has tried to emulate a number of signals that represent how trusted a domain is. They also have citation flow, which is like PageRank: a raw indicator of the number of links and their ranking power.
When you look at a website, it's always best to work with a mixture of trust flow and citation flow; generally speaking, the two should be on a par with each other. If there is a massive gap, something is not right: either the site has too many spam links going into it, or its trust flow has been manipulated somehow.
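The "on a par with each other" rule of thumb can be sketched as a simple ratio check. The 2x threshold below is an assumption for illustration, not a figure from Majestic or the article.

```python
def flow_gap_warning(trust_flow: float, citation_flow: float,
                     max_ratio: float = 2.0) -> bool:
    """Return True when the gap between Trust Flow and Citation Flow
    looks suspicious under the assumed max_ratio threshold."""
    if min(trust_flow, citation_flow) == 0:
        return True  # a zero on either side is itself a red flag
    ratio = max(trust_flow, citation_flow) / min(trust_flow, citation_flow)
    return ratio > max_ratio

print(flow_gap_warning(12, 48))  # prints True: lots of links, little trust
print(flow_gap_warning(30, 35))  # prints False: the metrics are on a par
```

A site with citation flow far above trust flow is the classic "too many spam links" profile the article warns about.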
Are you interested in learning more about data science and data mining? View "Data Mining with SEMrush: Unlocking SEMrush for the Serious SEO'er," a webinar with Nick Garner and SEMrush Marketing Director Michael Stricker.