Do you know what I think about when I look down on my city from a great height?
There are so many people generating huge amounts of new data on a daily basis. Access to this data means knowledge and money. When it comes to SEO, data means almost everything: from strategic planning down to individual tactics. Everything should be data-driven to increase your chances of success. Although many marketers are aware of this simple paradigm, few of them can call themselves true masters.
Today we are discussing how to become a master of data-driven SEO, and what benefits it can bring, with Samuel Scott @samueljscott, Director of Marketing at Logz.io, global speaker and blogger, as well as our other SEMrush Chat participants.
Technical expertise is not only important when you’re working with a website; being tech-savvy means understanding the processes behind a program’s code or the product itself and creating content accordingly. What level of expertise is appropriate for a marketer?
Samuel adds that if you run a large custom site, you need a deep knowledge of XML sitemaps, log analysis, international targeting and so on. If you run a small WordPress site, you need to know how to use and fully configure SEO plugins such as Yoast.
Jacques Bouchard @jacquesbouchard advises getting as much technical knowledge as you can: “You’ve got to start somewhere as an SEO, but the more technical knowledge you have, the better you’ll be able to consult your clients."
Joe Rega @JoeRega explains that an average level of technical expertise won’t take you far. An SEO should be able to diagnose, repair and explain any technical issues. By technical issues, we usually mean the interaction between SEO and a website’s source code and on-page elements. Without understanding how these work, you can’t tell your client exactly why their site was penalized by Google.
Dagmar Gatell @DagmarGatell presents a list of typical technical tasks SEOs are in control of: “They optimize and structure hundreds of sites, go through testing, failures, success and stay up-to-date on technical innovations; they walk the walk.”
Let’s summarize the opinions mentioned so far: from the technical side, an SEO should know how to code, have analytical skills and be open to innovation.
Let’s open the average marketer’s toolkit and learn which instruments are vital for making SEO data-driven.
As marketers we are already familiar with Google Analytics — a tool for monitoring website performance, engagement and conversions — so we’ve asked our participants to choose some tools apart from it.
Obed M. @MrClics adds Google Search Console to the list: with it, you can see exactly which queries people use to find certain pages. He also recommends Google Trends, which helps marketers monitor the topics that are currently popular with their audience.
Using tools is great and very helpful, but only if you understand exactly which data should influence your SEO efforts. With the next question, we want to separate the wheat from the chaff and decide which data should be measured and which can be ignored.
Samuel says that there are far more important tasks than constantly checking rankings and following competitors.
He considers 4XX and 5XX errors to be the most important: if neither Google nor visitors can access your pages, nothing else will matter! The server log files also show which subdomains and directories are getting the most and least attention from Google.
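As an illustration, spotting those 4XX/5XX errors, and which directories they cluster in, takes only a few lines of Python. This is a rough sketch that assumes the common Apache/nginx combined log format; the sample entries are invented:

```python
import re
from collections import Counter

# Matches the request and status fields of a combined-format access log line,
# e.g. ... "GET /blog/post-1 HTTP/1.1" 404 ...
LOG_LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def error_counts(log_lines):
    """Count 4xx/5xx responses per top-level directory."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if not m:
            continue
        status = int(m.group("status"))
        if 400 <= status < 600:
            top_dir = "/" + m.group("path").lstrip("/").split("/")[0]
            counts[(top_dir, status)] += 1
    return counts

# Hypothetical sample entries for illustration
sample = [
    '1.2.3.4 - - [17/Feb/2016:10:00:00 +0000] "GET /blog/post-1 HTTP/1.1" 404 512 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [17/Feb/2016:10:00:01 +0000] "GET /shop/item HTTP/1.1" 200 2048 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [17/Feb/2016:10:00:02 +0000] "GET /blog/post-2 HTTP/1.1" 500 128 "-" "Mozilla/5.0"',
]
print(error_counts(sample))
```

Run against a real access log, a report like this quickly shows whether errors concentrate in one section of the site.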
Behavioral factors are truly dominant these days, but other participants pay more attention to the technical side of their websites. Matthew Young @MatthewAYoung suggests that marketers running a larger website pay closer attention to duplication issues, especially with URL parameters and faceted search.
Rachel Howe @R8chel_Marie names HTTP status codes, canonical tags, robots.txt files and sitemaps as the most important aspects. These should be checked periodically, especially if you update your site a lot.
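One of those checks, extracting a page's canonical tag so you can compare it against the URL you actually crawled, can be sketched with Python's standard library alone; the sample HTML below is hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            a = dict(attrs)
            if (a.get("rel") or "").lower() == "canonical":
                self.canonical = a.get("href")

def find_canonical(html):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

page = '<html><head><link rel="canonical" href="https://example.com/page"></head></html>'
print(find_canonical(page))
```

A mismatch between the crawled URL and the canonical URL it declares is exactly the kind of issue these periodic checks are meant to surface.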
According to our participants, to start with, SEOs should check page crawl errors, other on-page elements, landing pages and backlink profiles.
In the previous question, our participants mentioned server logs as some of the most important data marketers should gather. The next question helps us dig into the details of the subject.
Server logs are files located on your web server that record each interaction between your website and both visitors and search engine robots. They are usually stored as plain text, so no additional tools are needed to look through the entries.
Dawn Anderson @dawnieando suggests her own checklist:
- Export the file into Excel or use Splunk or similar tools.
- Check your response codes.
- Determine which URLs are getting the love and shorten paths to those that are not getting any.
You can also set up a cron job that greps the log for Googlebot, writes the results to a .txt file, and then mails them or imports them into Splunk.
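Dawn's grep-and-cron idea could be sketched in Python along these lines (the file paths are hypothetical examples):

```python
def extract_googlebot_hits(log_path, out_path):
    """Append every Googlebot line from the access log to a .txt file.

    The substring test mirrors what `grep Googlebot access.log` would match.
    Returns the number of lines written on this run.
    """
    hits = 0
    with open(log_path) as log, open(out_path, "a") as out:
        for line in log:
            if "Googlebot" in line:
                out.write(line)
                hits += 1
    return hits
```

On a server you might schedule a script wrapping this daily with cron, for example `0 6 * * * python3 /path/to/extract_googlebot.py` (a hypothetical crontab line), and mail or forward the output file to Splunk from there.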
On the opposite side is Thom Craver @thomcraver, who argues that although log file analysis can reveal bandwidth concerns and other sysadmin-level issues, it offers very little for site audits.
Himanshu Suri @DM_Suri uses logs to find out the number of requests made by Baidu, BingBot, GoogleBot, Yahoo, Yandex and others over a given period of time.
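A per-crawler tally like this can be sketched by matching well-known bot tokens (Googlebot, bingbot, Baiduspider, Yahoo's Slurp, YandexBot) in each log line; the sample lines below are invented:

```python
from collections import Counter

# User-agent tokens the major crawlers identify themselves with
BOT_TOKENS = ("Baiduspider", "bingbot", "Googlebot", "Slurp", "YandexBot")

def bot_request_counts(log_lines):
    """Count requests per crawler over whatever period the log covers."""
    counts = Counter()
    for line in log_lines:
        for token in BOT_TOKENS:
            if token in line:
                counts[token] += 1
                break  # attribute each line to at most one crawler
    return counts

sample = [
    '... "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '... "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
    '... "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]
print(bot_request_counts(sample))
```

Note that user-agent strings can be spoofed, so for anything decision-critical you would want to verify crawler IPs as well.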
Summarizing all the opinions above, we can say that log files can help you identify site performance/usability issues and also reveal which URLs are more or less popular.
As you gather data, you need to put it to work to improve your website’s performance. The next question concentrates on using this data to encourage link development, social shares and citations.
From a content perspective, the data gathered from Google Analytics or Search Console, third-party tools and web server logs can help you create pieces that appeal to both search engines and customers.
Netvantage Marketing @netvantage mentions that data can be very convincing when trying to procure a link or a social share. It also helps determine how your content is performing.
Amel Mehenaoui @amelm also mentions that data can help marketers understand users’ intent and behavior. They can use it to publish the right content and encourage link development.
The majority of our participants showed that data you gather through research should be used to check your organic visibility, page/domain authority, and link data, and to help expand upon your existing content.
A/B testing can support marketers when they are working on a landing page. To improve performance, users are divided into two groups, and each group sees a different variant of the page. If one variant performs better (a higher conversion rate), it moves into development and the other is discarded. But what is the purpose of A/B testing in terms of SEO?
The most common strategy, as Samuel points out, is to test titles and meta descriptions to see which drive better engagement.
Matthew Young @MatthewAYoung thinks the purpose of A/B testing in SEO depends on what’s being tested, but the outcome could be increased conversions, more visits, better on-page engagement and so on.
According to Brandify Chat @BrandifyChat, A/B testing allows you to compare different organic visibility and engagement strategies.
Amel Mehenaoui @amelm notes that using Google Analytics Experiments can help improve user experience, bringing more return visits, more traffic and better rankings through behavioral factors.
All in all, A/B testing can be handy when preparing meta descriptions, CTAs and text snippets. But remember, your goal is conversion, not rankings!
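Whichever element you test, it helps to confirm that the winning variant's lift is real rather than noise. Here is a minimal sketch of evaluating a split test with a standard two-proportion z-test; the conversion numbers are made up:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the normal CDF, computed via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: variant A converts 120/2000, variant B 90/2000
z, p = two_proportion_z(120, 2000, 90, 2000)
print(f"z={z:.2f}, p={p:.3f}")
```

With a p-value below the usual 0.05 threshold, you would keep variant A; otherwise, keep testing rather than declaring a winner.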
That’s it for today!
As always, we want to thank Samuel Scott @samueljscott and all our SEMrush chat participants for sharing their expertise with us.
Don’t forget to attend the SEMrush Chat next Wednesday!