SEO Split Test Result: The Importance of Writing Effective Meta Descriptions

Koen Leemans

Jul 15, 2022 · 4 min read

Before you start: if you’re unfamiliar with the principles of statistical SEO split-testing and how SplitSignal works, we suggest you start here or request a demo of SplitSignal.


First, we asked our Twitter followers to vote:

[Image: Twitter poll results]

Here’s what other SEO professionals have to share about this test:

Lila H., Owner and Founder of BL DIGITAL:

It will be difficult to assess the impact, since Google can also rewrite the meta description to highlight the user's query and serve better results. I would say positive for the check sign.

Trevor Stolber, Digital Marketer at STOLBER.com:

I think the after result will perform better, but it will only be a small signal.

Find out if our followers were right by reading the complete analysis of this test.

The Case Study

Optimizing meta descriptions is something that just about every SEO has devoted time and attention to at some point in their career. The purpose of the meta description is to inform and engage search users with a brief, relevant summary of what a particular page is about. 

Back in 2009, Google announced that meta descriptions and meta keywords play no role in Google's ranking algorithms. However, the meta description can affect a page's click-through rate (CTR) in Google SERPs, which can positively impact a page's ability to rank. 

Meta descriptions are written for users, and they significantly impact user behavior. That's why optimizing your meta description is just as important as optimizing your title tags.

Google can use text from the content attribute of a page's <meta name="description"> tag to generate a snippet in the search results. While there's no limit to how long a meta description can be, you'll probably want to keep it around 155 characters, as the snippet will be truncated in Google search results if necessary, usually to fit the width of the device.
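As a purely illustrative sketch (not part of the original test), a quick length check like the one below can flag descriptions that are likely to be cut off; the ~155-character guideline comes from the paragraph above, and the example description is hypothetical.

```python
# Illustrative sketch: flag meta descriptions likely to be truncated in the SERP.
# The ~155-character figure is a rule of thumb; Google ultimately truncates the
# snippet to fit the width of the device.

SOFT_LIMIT = 155

def check_meta_description(description: str) -> str:
    """Return a short report on the length of a meta description."""
    length = len(description)
    if length > SOFT_LIMIT:
        return f"{length} chars: likely truncated after ~{SOFT_LIMIT} characters"
    return f"{length} chars: within the ~{SOFT_LIMIT}-character guideline"

# Hypothetical example description, not taken from the tested site.
description = (
    "Coffee machines at Blokker: espresso machines, filter coffee makers and "
    "fully automatic machines from all major brands, always at the best price, "
    "with free returns and in-store pickup."
)
print(check_meta_description(description))
```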

Over the years, many articles have been written about creating the perfect meta description, but the truth is that the perfect meta description is different for just about every website. That's why it's important to test different elements, such as CTAs (calls to action) or USPs (unique selling points), to find out what best matches the user's search intent. OrangeValley wanted to test these elements for one of the largest e-commerce companies in the Netherlands.

The Hypothesis

The website in question had its category page meta descriptions set up like this:

[Screenshot: original category page meta description]

For this test, we focused on two elements: the CTA and the addition of USPs. Based on SERP research, we hypothesized that switching from “Order your coffee machine from Blokker for the best price” to “Buy your coffee machine from Blokker for the best price” would better match the user's search intent. We also included "✓Free returns ✓Can also be picked up" as USPs to encourage users to click on our search result.
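To make the change concrete, the sketch below simply renders the control and variant wording from the hypothesis as the `<meta name="description">` tags Google would read; it is an illustration, not the exact markup used on the site.

```python
# Control vs. variant meta descriptions from the hypothesis, rendered as the
# <meta name="description"> tags Google reads (illustrative, not the live markup).

control = "Order your coffee machine from Blokker for the best price."
variant = (
    "Buy your coffee machine from Blokker for the best price. "
    "✓Free returns ✓Can also be picked up"
)

for label, text in [("control", control), ("variant", variant)]:
    print(f'{label}: <meta name="description" content="{text}">')
```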

The Test

We used SplitSignal to set up and analyze the test. 300 category pages were selected as either variant or control through stratified sampling. We kicked off the test and ran it for 21 days. We were able to determine that Googlebot visited 98% of the tested pages.
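SplitSignal handles the page selection internally; purely to illustrate the idea of stratified sampling, the sketch below buckets pages by a stratification variable (here, hypothetical historical clicks) and assigns half of each bucket to the control group and half to the variant, so both groups have a similar traffic profile.

```python
import random
from collections import defaultdict

# Illustrative sketch of stratified sampling (not SplitSignal's implementation):
# bucket pages by historical clicks, then split each bucket evenly between
# control and variant so both groups have a similar traffic profile.

random.seed(42)

# Hypothetical input: 300 category page URLs with their historical clicks.
pages = [(f"/category/{i}", random.randint(50, 5000)) for i in range(300)]

def traffic_bucket(clicks: int) -> str:
    if clicks < 500:
        return "low"
    if clicks < 2000:
        return "medium"
    return "high"

strata = defaultdict(list)
for url, clicks in pages:
    strata[traffic_bucket(clicks)].append(url)

control, variant = [], []
for urls in strata.values():
    random.shuffle(urls)
    half = len(urls) // 2
    control.extend(urls[:half])
    variant.extend(urls[half:])

print(len(control), "control pages,", len(variant), "variant pages")
```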

The Results

[Chart: modeled impact of the change on clicks over the test period]

After 21 days of testing, we reviewed the results. Traffic to the variant pages outperformed the control group, which means the test was positive: changing the CTA and adding two USPs resulted in a 6.5% increase in clicks to the tested pages.

After eight days, we were able to determine that the increase we saw was statistically significant. When the blue shaded area falls entirely above or below the y=0 axis, the test is statistically significant at the 95% level. This means we can be confident that the increase we are seeing is due to the change we made and not to other (external) factors.
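The significance check can be thought of as building a 95% confidence interval around the daily click uplift (observed variant clicks minus the modeled forecast) and calling the result significant once that interval no longer contains zero. The sketch below uses hypothetical numbers and a simple normal approximation; SplitSignal's actual model is more sophisticated.

```python
import numpy as np

# Illustrative significance check with hypothetical data: a 95% confidence
# interval around the daily uplift (observed minus forecast) that excludes zero
# corresponds to the shaded area staying above or below the y=0 axis.

rng = np.random.default_rng(0)

forecast = rng.normal(1000, 30, size=21)               # modeled "no change" clicks
observed = forecast * 1.065 + rng.normal(0, 30, 21)    # actual clicks, ~6.5% uplift

uplift = observed - forecast
mean = uplift.mean()
sem = uplift.std(ddof=1) / np.sqrt(len(uplift))
low, high = mean - 1.96 * sem, mean + 1.96 * sem

significant = low > 0 or high < 0
print(f"95% CI for daily uplift: [{low:.1f}, {high:.1f}] -> significant: {significant}")
```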

Note that we are not comparing the variant pages to the actual control group pages, but to a forecast based on historical data, which we then compare with the actual data. We use a set of control pages to give the model context for trends and external influences. If something else changes during the test (e.g., seasonality), the model will detect it and take it into account. By filtering out these external factors, we gain insight into the true impact of an SEO change.
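In other words, the forecast is a counterfactual: learn how variant-page clicks track control-page clicks before the test, then use the control pages during the test to estimate what the variant pages would have done without the change. The sketch below uses a simple linear fit on hypothetical data to illustrate the idea; the production model is more sophisticated.

```python
import numpy as np

# Illustrative counterfactual sketch (not the production model): fit the pre-test
# relationship between control and variant clicks, forecast the test period from
# the control pages, and read the effect as the gap between actual and forecast.

rng = np.random.default_rng(1)

# Hypothetical daily click totals: 60 pre-test days + 21 test days.
control_clicks = rng.normal(2000, 100, size=81)
variant_clicks = 0.5 * control_clicks + rng.normal(0, 20, size=81)
variant_clicks[60:] *= 1.065  # change applied on day 60 (~6.5% uplift)

pre, post = slice(0, 60), slice(60, 81)

# Fit variant ~ a * control + b on the pre-test period.
a, b = np.polyfit(control_clicks[pre], variant_clicks[pre], deg=1)

# Forecast the test period as if no change had been made, then measure the gap.
forecast = a * control_clicks[post] + b
uplift = variant_clicks[post].sum() / forecast.sum() - 1
print(f"Estimated uplift over the test period: {uplift:.1%}")
```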

As mentioned before, meta descriptions are written for users and significantly impact user behavior. Google once said that users are often the best judges of relevance, so if a user selects a particular search result, it is likely to be relevant, or at least more relevant than the alternatives presented. This test shows that we are on the right track.

Analysis of the data shows that this test affected the click-through rate (CTR) on the pages tested. Compared to our modeled control group, rankings and impressions remained fairly stable. The increase in clicks seems to be purely due to the behavior of Google users. 
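With impressions roughly flat, the extra clicks translate almost one-to-one into a higher click-through rate; a quick back-of-the-envelope check with hypothetical numbers:

```python
# Hypothetical numbers: if impressions stay flat and clicks rise 6.5%,
# CTR (clicks / impressions) rises by the same 6.5% in relative terms.
impressions = 100_000
clicks_before, clicks_after = 3_000, 3_000 * 1.065

ctr_before = clicks_before / impressions
ctr_after = clicks_after / impressions
print(f"CTR: {ctr_before:.2%} -> {ctr_after:.2%} (+{ctr_after / ctr_before - 1:.1%})")
```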

As an SEO, you need to think about and experiment with different elements that make up a meta description. Finding ways to stand out and be the most relevant answer to a search query is essential for optimal organic performance. For further testing, Blokker now has a new meta description for their category pages.

Given the scale at which this change is being implemented, it is one of the most successful tests for Blokker. Keep in mind that something that works for one website may not work for another. The only way to know for sure is to test what works for you!

Have your next SEO split-test analyzed by OrangeValley Agency.

Koen is a Senior SEO Consultant at OrangeValley. He specializes in technical SEO and SEO A/B testing. He takes the guesswork out of SEO; his advice and optimizations are based on what actually works. Koen is currently working with the SplitSignal team to make SEO A/B testing known and accessible worldwide.