SEO Experiments That Will Blow Your Mind

In SEO, everyone has their opinions, but what I love about SEO is that at the end of the day, data wins the argument; either it worked, or it didn’t. That is why Will Critchlow of Distilled and I got together to talk about some SEO experiments: hopefully some surprising ones, and some you may have tried yourself. Some of these crazy experiments will blow your mind.

In our recent webinar, Will and I covered ten experiments — here, I will cover six.

I want to start with some examples of really common advice that you have probably seen on many blogs; if you have ever hired an SEO consultant or worked in SEO, you have probably given this advice yourself. There are also some made-up examples where the clients and customers aren’t real; these are equivalents of tests that we have actually run, with the websites anonymized:

First SEO Experiment: What is a good click-through rate for organic search?

To measure this, I essentially use “time-lapse photography.” In this example, I tracked the same 1,000 keywords across all of last year and took snapshots in May, June, September, and so on.

I found that the click-through rates of these keywords impacted the order of the search rankings. In this study, the rankings changed from month to month in a very predictable pattern: the keywords with higher-than-expected click-through rates floated to the top of the page, and the keywords with lower-than-expected click-through rates fell to the bottom of the page.
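If you want to run the same kind of time-lapse comparison on your own data, here is a minimal sketch of one way to do it in Python, assuming you export monthly snapshots of your tracked keywords (from Google Search Console or a rank tracker) as CSV files with keyword, position, and CTR columns. The file names and column names are illustrative assumptions, not part of the original study.

```python
# A minimal sketch of the "time-lapse" approach, assuming monthly CSV exports
# with columns: keyword, position, ctr. File and column names are hypothetical.
import pandas as pd

snapshots = {
    "May": "snapshot_may.csv",
    "June": "snapshot_june.csv",
    "September": "snapshot_september.csv",
}

frames = []
for month, path in snapshots.items():
    df = pd.read_csv(path)          # one row per tracked keyword
    df["month"] = month
    frames.append(df)

data = pd.concat(frames, ignore_index=True)
data["position"] = data["position"].round().astype(int)

# Average organic CTR at each SERP position, one column per snapshot, so you
# can watch how keywords with strong CTR drift up the page over time.
ctr_by_position = (
    data.groupby(["month", "position"])["ctr"]
        .mean()
        .unstack("month")
)
print(ctr_by_position.head(10))
```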

Second SEO Experiment: Does click-through rate actually impact the rankings?

It is obvious that rankings impact click-through rates, but is there a feedback loop? Is the opposite true? Does a higher than expected click-through rate actually feed back into the algorithm somehow and result in a higher position?

We looked at relative organic click-through rates: the extent to which every keyword was beating, or being beaten by, the expected click-through rate for its position. What we saw was that, on average, you only need to beat the expected click-through rate for any given spot by 3% to potentially be promoted into the next position.
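Here is a rough sketch of how you could compute that kind of relative CTR for your own keywords, assuming a single CSV of tracked keywords with keyword, position, and CTR columns. The expected-CTR curve below is simply the average CTR observed at each position in your own data, and the 3% figure is treated as three percentage points of lift; both are simplifying assumptions rather than the exact methodology from the experiment.

```python
# A sketch of the relative-CTR comparison. The file name, column names, and
# the 3-percentage-point threshold are illustrative assumptions.
import pandas as pd

df = pd.read_csv("keyword_snapshots.csv")   # columns: keyword, position, ctr
df["position"] = df["position"].round().astype(int)

# Expected CTR for each position = average CTR observed at that position.
expected_ctr = df.groupby("position")["ctr"].mean().rename("expected_ctr")
scored = df.join(expected_ctr, on="position")
scored["ctr_lift"] = scored["ctr"] - scored["expected_ctr"]

# Keywords beating the expected CTR for their spot by at least 3 points are
# the ones this experiment suggests are candidates for a position bump.
candidates = scored[scored["ctr_lift"] >= 0.03]
print(candidates[["keyword", "position", "ctr", "expected_ctr", "ctr_lift"]])
```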

Third SEO Experiment: Title Tag Optimization

In this example, the original title is the red one at the top of the slide below, which reads "Guerilla Marketing: 20+ Examples & Strategies to Stand Out." That is the old way of doing it, where you take the primary keyword and stuff it at the beginning of the title tag. In order to make it more “clicky,” I rearranged the words to read "20+ Jaw-Dropping Guerilla Marketing Examples and Strategies." Adding the action word "jaw-dropping" and moving the number to the front had a profound impact on both click-through rate and rank.

Click-through rate optimization is interesting because even if your rank doesn't change, you can double your traffic by getting more than your fair share of clicks for your given position. And if the higher click-through rate does change your position, which is often the case, it is almost like a double bonus.
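To make the arithmetic concrete, here is a toy back-of-the-envelope calculation; every number in it is made up purely for illustration.

```python
# A toy example of why CTR gains matter even with no rank change.
# All numbers are hypothetical, chosen only to show the arithmetic.
monthly_impressions = 10_000        # searches where your listing appears
expected_ctr_at_position = 0.04     # typical CTR for your current spot
improved_ctr = 0.08                 # after rewriting the title/description

baseline_clicks = monthly_impressions * expected_ctr_at_position
improved_clicks = monthly_impressions * improved_ctr
print(baseline_clicks, improved_clicks)   # 400.0 vs 800.0 clicks: double, same rank
```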

CTR boosts are very, very valuable. Even if you don't believe my theory about CTR improving rankings, they are still worth chasing, because higher click-through rates simply mean more clicks.

Fourth SEO Experiment: Do engagement rates impact rankings?

We know that Google measures dwell time. They used to have a feature where, if you hit “back” very quickly, you could actually block the site, and today on mobile it is much the same thing: if you hit “back” quickly, Google shows a “People also search for” suggestion.

Google is measuring the time it takes for you to click on a result and return to the same search page. Could they be using that information as a signal of relevancy or non-relevancy? I was unable to measure dwell time directly, because that is measured on Google's search page, but we can measure equivalent metrics such as bounce rate and time on site. What I found: if you have a very low bounce rate, you are eligible to show in the top four positions; conversely, if you have a very high bounce rate, you become less likely to occupy those top positions.

The discontinuity in the graph above suggests some kind of algorithmic filter (as opposed to a naturally occurring relationship). If you have a decent time on site, you seem to be eligible to show up in the first six positions; otherwise, you are relegated to positions 7, 8, 9, 10, and so on.
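If you want to look for a similar cutoff in your own data, here is a minimal sketch, assuming you have already merged your ranking data with analytics metrics into one CSV with page, position, bounce rate, and average time-on-site columns; all of the names are hypothetical.

```python
# A minimal sketch for probing the engagement/position relationship.
# Assumed CSV columns: page, position, bounce_rate, avg_time_on_site.
import pandas as pd

df = pd.read_csv("rankings_with_engagement.csv")
df["position"] = df["position"].round().astype(int)

# Average bounce rate and time on site at each position; a sharp jump between
# adjacent positions (e.g. 4 -> 5 or 6 -> 7) would hint at a filter-like cutoff
# rather than a smooth, natural relationship.
summary = (
    df.groupby("position")[["bounce_rate", "avg_time_on_site"]]
      .mean()
      .loc[1:10]
)
print(summary)
print(summary.diff())   # position-to-position jumps make discontinuities visible
```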

Fifth SEO Experiment: What’s the real relationship between social sharing and organic rankings?

For many years, Google denied using any social signals to rank pages, but if you have ever done any SEO in your life, you have probably noticed that your best SEO content is rocking it in Facebook shares, too. There is a correlation, but it is not what you think: Google is not counting the shares. Instead, the search listings that people like to click on (the ones with high click-through rates) also tend to get very high engagement when shared on social media channels.

This is because the emotions that make people want to click on things are the same emotions that make people want to share things on Facebook:

This graph shows click-through rates normalized by the position on the search results page, versus post-engagement rates for Facebook posts. As you can see, the content that does really well tends to do well on both, because both Google search and the Facebook news feed algorithm employ machine-learning systems that reward high user engagement with greater visibility.
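Here is a rough sketch of how you might reproduce that comparison yourself, assuming you can pair each piece of content with its SERP position, its organic CTR, and the engagement rate of the matching Facebook post. The column names and the position-normalization step are my assumptions, not the exact method behind the graph.

```python
# A sketch of the CTR-vs-social comparison. File and column names are
# hypothetical: page, position, ctr, fb_engagement_rate.
import pandas as pd

df = pd.read_csv("content_ctr_vs_facebook.csv")

# Normalize CTR by position so a #1 ranking and a #8 ranking are comparable:
# divide observed CTR by the average CTR seen at that position.
expected = df.groupby("position")["ctr"].transform("mean")
df["normalized_ctr"] = df["ctr"] / expected

# Pearson correlation between position-normalized CTR and Facebook engagement.
print(df["normalized_ctr"].corr(df["fb_engagement_rate"]))
```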

Sixth SEO Experiment: How does time on page impact SEO rankings?

In Google Analytics, there is a comparison report that tells you the time spent on each page. This report shows all of the pieces of content that are driving traffic to your site. In the example below, the timeframe is January to December 2015, and you can see the top pages for SEO. Most of them had above-average time on page (those are the green bars), but about a third had red bars, which means they were below the site-average time on page.
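If you would rather pull this out of a spreadsheet than eyeball the bars, here is a minimal sketch for flagging the below-average pages, assuming a Google Analytics export with page, sessions, and average time-on-page (in seconds) columns; the file name and column names are hypothetical.

```python
# A minimal sketch for flagging "donkey" pages from an analytics export.
# Assumed CSV columns: page, sessions, avg_time_on_page (seconds).
import pandas as pd

ga = pd.read_csv("ga_landing_pages_2015.csv")

site_avg = ga["avg_time_on_page"].mean()

# Pages driving traffic but sitting below the site-average time on page are
# the "red bar" pages: candidates to fix before they fall out of the rankings.
donkeys = ga[ga["avg_time_on_page"] < site_avg].sort_values("sessions", ascending=False)
print(f"Site average time on page: {site_avg:.0f}s")
print(donkeys[["page", "sessions", "avg_time_on_page"]].head(20))
```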

Fast forward to October 2016 through January 2017, and you can see that the picture has changed dramatically: Google has found and eliminated all of the donkey pages. Every page that had below-average time on site has somehow been removed from contention among my top evergreen SEO pages. Those pages have no business ranking for those keywords; they are vulnerable pages that either need to be fixed or will soon disappear.

These are just a few of the experiments covered in my recent webinar. Some of the experiments above are similar to SEMrush’s recent Ranking Factors Study; while we observed slightly different things, I believe we are looking at the same phenomenon from different points of view. If you would like a more thorough explanation, or to see many of the other experiments we completed, you can view them here.

If you would like to view my slides from the webinar, you can link to them here. You can also access Will’s by clicking here.

Interesting to see how click-through rates relate directly to social shares, even though they aren't linked together.
What I'm really curious about is how Google prioritizes these different factors. For example, a page that has great CTR via social shares but does poorly on the SERP, versus a page higher on the SERP but with little to no engagement on social. Obviously I feel like these details matter most when you're already competitive, in the "big leagues," and ranking in the top 5. But it would be an interesting experiment in the future to pit these strategies against each other and test them across similar ranking pages.
Nicola Yap
I've actually looked at it this way and I can find nothing consistent from Google except that a site with good shares and visibility will do better. A Pinterest pin can be very difficult to see without being logged in, so those are weaker than the others but stronger than something set to private or members only. Whoever makes it the easiest for Google to read wins.
I know this worked on tests we ran on IMEC. However, CTR tests only work once you get a page to the first page, and probably work in the top 5. How can you determine if a title or description is good or bad without having enough traffic seeing the actual title?

I feel links are the only way to get into the top placement for competitive terms and once you’re in the top spots, CTR can play an actual role.
It's easy to do experiments for bounce rate on the high-traffic pages, but the ones with lower traffic don't get enough hits to give you the data you need. So I guess my donkey pages will stay ass for bounce rates until I get them enough hits.
The increased CTR in the fifth SEO experiment is probably correlated a lot with the brand value of the website/company behind it. If the posts of a specific Facebook page get a lot of interaction, the reputation and brand of that page would be a major reason. In the same way, a well-branded website will be much more likely to be clicked than some of the weaker brands in the search results.
Do you agree?

Btw, a really great and useful article Larry! I'll be much more likely to click something from you the next time ;-)
Larry, Amazing article! Thanks a lot
It's great to have some evidence of all this being true. I need an 'evidence repository' now, somewhere I can refer back to when explaining these results to clients! Does anyone have any suggestions? I often read a great article like this and then forget where it was! I know, Pocket!
Nigel Carr
I use good ol' bookmarks on your favorite browser!
Cheers, Larry. Love an experiment, and always a conversation starter!

For #3 (title tag optimisation), I'd love to see the surrounding SERP results, to see if sites at positions 1-7 (where position 8 was the measured website) were still present but at position 1-4 and 6-8. It would eliminate the possibility of other variables at play. Also putting the 4.2% CTR against the average CTR for position 4 across this search topic.
The Donkey pages analysis is interesting. Seems like a good place to start looking for underperforming content on a big site!
Simon Cox
the donkey pages are the best place to focus optimization efforts. you are simultaneously mitigating risk of failure while maximizing potential upside. the worst thing you can do is turn a unicorn into a donkey. but if you are doing experiments on donkeys the worst that can happen is that you convert them into another donkey (i.e. no change).
Hello Larry,
Wonderful piece of information, it should really help me a lot this year.
Jyoti Thapa
the coolest thing about these experiments (i think) is that they are pretty easy to replicate and try out for yourself!
Really great stuff. It’s way more natural now and I’m seeing the shiny light off the unicorn. Gonna need some shades for 2018.
Harris Brown
once people see how much greater the unicorn is than the donkey they will definitely need shades!
Larry Kim
Now I know why my daughter loves unicorns so much. This post went into my useful SEO bookmark folder. Thanks for the knowledge.
There is a huge amount to take in here - SEO is now really all about the user experience, as highlighted by RankBrain.
yes i believe that SEMrush found similar results in their ranking factors study from last year -- that direct traffic (a proxy for great brand and user experience among other things) was most strongly correlated with search rankings.
nice experiment! informative article. thank you for sharing your report!
neel bhad
thanks for stopping by neel!
Thanks Larry, I am new to the forum and can always learn new methods to keep that edge in the marketplace and most importantly happy customers.
Frankly, I expected a lot more from this "Blow Your Mind" post by Larry. None of the experiments proves anything that's contrary to the general belief. Neither does it debunk any myths...
Anubhav Agarwal
i'm glad you feel that way. there are an unusually large number of extremely vocal, old-school SEOs that are reluctant to believe that things like CTR and Dwell time directly impact search rankings as i have illustrated here. if 11 people (at the time of this response) believe this to be obvious, then it means most agree. Personally while i have seen many people agree with the sentiments that CTR and Dwell time impact rankings, i haven't seen many published studies / experiments on the topic. so, my goal was just to make it easier to recreate experiments of your own!
Larry Kim
Gotta admit though the headline is a bit spammy! "Blow Your Mind" is a sensationalist phrase that almost stopped me reading what is an interesting piece - normally I wouldn't fall for such headlines but because Larry wrote it I was interested.
Larry Kim
Except, nothing here actually proves any of that. Sure, you can draw conclusions (and they are probably correct) but you do not properly prove causation over correlation, mainly due to the fact that your sample size is so small.

Take the bounce time for example; Yes, the data seems to show that over time Google favored the pages with the highest bounce time, but how do we know this is the only/ main reason for the rank change? Maybe you built more links to those pages, which is the sensible thing as they are performing better. Maybe the keyword research you did for those keywords was more relevant to your readers, leading to higher rankings & higher bounce time as they were a more interested audience.

My point is, you have written this "mind-blowing" article as if it were a large-scale case study that disproves the general consensus & PROVES the points stated when, in reality, you have a very limited sample size, which has not (from what I can see) been controlled in any way.
honestly there is no way to prove anything 100% given that google doesn't publish their algos and the algos are constantly changing. but there is enough here to believe that there is something happening and as i have disclosed the test methodology it ought to be enough for you to try it out on your own site and either confirm or unconfirm the results for yourself. I will just add that i have tried this out on hundreds of pages across dozens of sites and am confident that it won't be a wasted effort on your part.
Larry Kim
You're right. You can't prove it, but you can run the experiment xxx times and control as many factors as possible, instead of giving us no other context. Did you do any work on the site during that time? New content, links etc? Were those KW's buyer intent?

My point is this post does not mention correlation vs causation at all and if this was a scientific study you would be laughed out of the room.
harry, SEO science is kind of a joke. ("science" should be in quotation marks). i'm afraid that in the so-called science of SEO, all we have is correlations (ranking factors, etc. - they're all just experiments where we change an input and see what happens to the output). anyway i suggest just trying it out yourself, or if you're not interested then don't bother. it takes literally 5 minutes to update some crappy headlines.
Larry Kim
I was referring to a normal scientific study. Yes, I understand you cannot prove anything but how can you not see that your "mind-blowing" study has 0 context with it and only 1 example.

To be clear, I am not contesting your results, I am contesting the way you presented a tiny sample size without context or CONTROL as fact.
oh well the sample size is a thousand keywords/serp-listing pairs, not one. and i've replicated this on many other sites - i just couldn't fit it into a 1200 word post! like i said, it's pretty stupid-easy to reproduce these tests on your own or client sites -- or not, if you are too skeptical. that's totally fine!
Pretty much reflects everything I've seen, particularly over the last 12-24 months. It's not so much that "content is king", but more that a "good user experience is king". This "UX" starts with an appealing SERP snippet, flows into a page with valuable content and then a site which encourages the visitor to click further into the website (UX = clicks (or CTR) + low bounce rate + high session time). The more time a searcher spends on your site, the less they spend on other ranking sites (usually).

Social shares, guest posting and other methods of linking produce additional traffic which all contribute to higher rankings. Add in high Brand Awareness, and you've got the basics of a great website that will rank for many search terms.
David Chadderton (Webspresso)
basically having "great user experience" is kind of ambiguous advice. to be more concrete, i would re-frame it to be: Having great CTR and dwell time is the key (since that is actually how google is determining "great" vs "poor")
Larry Kim
I agree that "good ux" is ambiguous ... I think the second part of my comment above clarified that.
Here's the diagram I shared with my clients last year that has been shared a little on social media > (hopefully this won't be deleted - it was shared recently by the SEMRush Twitter team)
David Chadderton (Webspresso)
Hi, David
I was keen to see your diagram but it opened with an error:

What happened?

The owner of this website (webspresso) does not allow hotlinking to that resource (/images/seo-explained-webspresso.png).
Excellent blog post.
Kate Forester
thanks for stopping by the SEMrush blog, kate! :)
wow nice experiment. Informative as well as valuable. helpful content. Thanks
thanks, hope you are able to replicate them in your own work.