From strain and local dispensary directories to how-to guides on growing weed and finding the best local deals, Leafly is what you’d call an educational and informational hub on all things cannabis. Based in Seattle, Leafly attracts over 125 million visitors each year and features more than 4,600 marijuana retailers and a 5,000+ strain database. With over 4 million orders processed through the site annually, Leafly is a one-stop shop for both legal weed retailers and consumers who are looking for a “near me” weed store or want to find a bargain and purchase directly from Leafly’s site.
Leafly’s website has nearly 2 million pages, and its directory pages are all built on templates. This means that a change to a single template propagates to every page built on it.
Operating in a highly competitive landscape, the Leafly team had to ensure maximum visibility for each and every page, securing both rankings and traffic (i.e., click-through rates). The only way to do that is to run SEO experiments, changing one variable at a time.
But SEO split-testing comes with a lot of challenges:
In Leafly’s case, they faced even more obstacles:
Leafly knew they had a lot of missed opportunities. First of all, they were wasting resources on tests that could potentially have a negative effect. And this, in turn, meant that they weren’t spending time and resources on tests that would deliver a positive impact.
One way around all these challenges was to run control group testing. Yet with thousands of pages on the radar, this seemed like an impossible mission.
So Leafly turned to the split-testing power of SplitSignal - a tool designed by the Semrush team to help SEOs get smart and efficient about SEO A/B testing.
“The general notion of controlling metadata in the cloud is so promising - it used to be a project impossible to pursue for a site with thousands of pages. And now it’s a mission possible”, says Stephan Cude, the Senior Tech SEO Manager at Leafly.
Leafly ran a test on 500 product pages (253 pages in the control group and 247 pages in the variant group), changing "marijuana" to "weed" in titles. After 21 days, the result was positive: clicks to the variant group increased by 2.7%.
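The mechanics of a test like this can be sketched in a few lines. The snippet below is a hypothetical illustration only - the splitting logic and click numbers are assumptions for demonstration, not Leafly's actual data or SplitSignal's implementation:

```python
import random

def split_pages(pages, seed=42):
    """Randomly assign pages to a control and a variant group."""
    rng = random.Random(seed)
    shuffled = pages[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

def click_lift(control_clicks, variant_clicks):
    """Relative change in average clicks per page, variant vs. control."""
    control_avg = sum(control_clicks) / len(control_clicks)
    variant_avg = sum(variant_clicks) / len(variant_clicks)
    return (variant_avg - control_avg) / control_avg

# Hypothetical numbers: a variant group averaging 2.7% more clicks per page.
control = [100.0] * 253
variant = [102.7] * 247
print(f"{click_lift(control, variant):+.1%}")  # → +2.7%
```

In practice the two groups should be drawn to match on baseline traffic, which is why the random split happens before the title change is applied.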
One by one, the tool eliminated each of the challenges they had faced:
SplitSignal allowed Leafly to easily run split tests without getting their dev teams involved. The only time their resources were needed was during the setup and for the site-wide implementation of the positive tests.
With SplitSignal, Leafly had full control over the scope of pages participating in a test. Moreover, they could control metadata at the URL level. So one change didn’t have to affect the whole directory, which, if a test turned out negative, meant far less impact on the rest of the site.
The tool gave great visibility into each test’s performance.
While initially Leafly’s SEO team checked each test’s results on a daily basis, over time they moved to weekly checks. As they were testing both local directories and strain databases - their highest-traffic pages - whenever they spotted a drop in either rankings or CTR, they simply stopped the test before it caused significant damage.
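A guardrail like the one described above can be sketched as a simple periodic check. The threshold, metric names, and CTR values here are illustrative assumptions, not SplitSignal's actual logic:

```python
def should_stop_test(control_ctr, variant_ctr, max_drop=0.05):
    """Stop the test early if the variant's CTR falls more than
    `max_drop` (relative) below the control group's CTR."""
    if control_ctr == 0:
        return False  # no baseline to compare against
    relative_change = (variant_ctr - control_ctr) / control_ctr
    return relative_change < -max_drop

# Periodic check during a test run (hypothetical CTR values).
print(should_stop_test(0.040, 0.039))  # -2.5% drop: keep running → False
print(should_stop_test(0.040, 0.035))  # -12.5% drop: stop the test → True
```

A real monitor would also wait for enough impressions before acting, since day-to-day CTR noise on low-traffic pages can easily exceed a 5% relative swing.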
As they could save time and resources while running more tests, even negative results were valuable: they showed what should not be implemented site-wide and, hence, delivered a more data-driven way to manage SEO across the site.
Leafly ran a test in October on 248 pages. The test ran for 30 days. Here is an example of how this would look in a SERP mock-up:
The notion of missed opportunities lingered in the air before the Leafly team was able to run split tests at scale with SplitSignal. Now the problem was not lost opportunities but prioritization.
As split-test ideas come from everything from intuition to competitive analysis, Leafly had to find a way to build a hierarchy of variables to test - something they could only dream of before they were able to run split testing at scale.
By the end of November, Leafly had run over 20 full-scale tests that led to measurable - be it positive or negative - results.
Overall, they saw that the tool delivered on its promise: