Elena Terenteva

40 Technical SEO Mistakes — SEMrush Study


Performing a complete site audit is a complicated, ongoing process that involves dozens of checkpoints and a broad range of knowledge. This proves, once again, that building and maintaining a healthy, successful website can be a difficult quest for beginners and sometimes even for experienced specialists.

No website is perfect; there is always room for improvement. Hundreds or even thousands of issues might appear on your website. So which issues do you really need to focus on first?

We decided to find some answers by conducting a study on the frequency with which SEO mistakes appear on websites. We collected anonymous data on 100,000 websites and 450 million pages using the SEMrush Site Audit tool to determine the most common on-site technical and SEO mistakes and issues.

We conducted the same study last year, but we only provided information on approximately 20 of the most common checks. This year we have decided not to concentrate solely on common mistakes, but to provide as much information as possible. So below you will see statistics on 40 checks, divided into three big groups:

  • Crawlability

  • Technical SEO

  • On-page SEO

Also, we have changed our methodology a little and added a new dimension: issue severity. Every issue is assigned a severity level from 1 to 5, where 5 is the highest. An issue's severity level reflects not just the effect of a certain mistake on rankings, but also its impact on your site's overall performance and user-friendliness. We added this metric because some mistakes appear on websites very rarely yet have a high severity level, and we don't want you to overlook them.

The data on the frequency and severity of issues will give you complete information about specific errors and will allow you to assess risks and decide where your efforts are best spent during the site audit process.

And now let’s get to the point!

Although SEMrush Site Audit has 60+ checks in its arsenal, we have chosen 40 of them and excluded the checks for HTTPS and hreflang implementation. If you would like more information on these exclusions, you can find it in our recent research studies:

13 Most Common Hreflang Mistakes - SEMrush Study

10 HTTPS Implementation Mistakes - SEMrush Study

While the above information is helpful, we have prepared even more for you! Last year our infographic was supported by a huge post that included descriptions of the analyzed mistakes, but those descriptions were brief: it is hard to compress data on 19 serious mistakes, their impact on rankings and user experience, and explanations of how to fix them into one blog post. So this year we are releasing a complete PDF guide, The Ultimate Site Audit with SEMrush, which includes information about the 'nature' of each of the errors discussed, their impact on your website's health, and tips on how to fix them. Stay tuned!

Nice article, can you check my website and see if it needs any work? [link removed by moderator]
Nice Article
Great Article !!
Love this article, good job!
Thank you great information!
Nice post!! Best information on SEO.
There is a negative SEO monitoring service we're building called SEO Defend [link removed by moderator] which tracks and alerts you to most of the SEO nightmares listed above.
Page speed has been a nightmare for sure! With all the "cool" high-tech components on sites, speed suffers a lot. And I think it's also one of the most challenging issues to fix, compared to simply rewriting a title or description. Great study, thanks for sharing!
Martin Benes
A good CDN service can help immensely with page speed, including optimizing your image sizes automatically for mobile devices. A lot of people I speak with thought hosting with a big cloud provider would automatically solve the problem.

Disclaimer: I work for one of them
Martin Benes
One of the best tools I have found for diagnosing how to fix speed issues (beyond just using a CDN, which can be helpful but is not a silver bullet) is yellowlabs. I will admit it is often buggy in its ability to actually run, but when it does, the data it provides is really solid.
Thank you! I have gotten lots of ideas from this post. I hope these will work for me in my future work. Again, thank you!
Thanks, with the help of this blog I will improve my website's ranking by removing these technical mistakes. This is my website [link removed by moderator]. Please give me more suggestions to improve my ranking.
nice post !
Elena - when I see the term 'site audit', I actually expect to be able to enter the relevant site's URL and for that site to then be put through an SEO analysis under expected headings. Why do I not get this protocol when trying to use the SEMrush site audit tool?
Hi Elena! Thanks for that awesome infographic! Super useful (and shared!)
Also, I was wondering, could you please provide a definition of duplicate content?
Many thanks,
Maxime @mlquotes1
Hi Elena
We are using the Business plan of SEMrush. The site audit is showing 6,091 pages with duplicate content issues, and I have verified that this is nearly correct. This is because we have product variations, so the parent and child products have nearly the same content.
Now, if I set the parent product URL as the canonical on the child products, will this issue be solved? Will this point Google to the parent product for the original content?
Is this the correct way to solve the duplicate content issue? It's very hard to write unique copy for each similar item where the only difference is size or colour.
Vikram Avasthi
Yes, that's the correct way to deal with it: a good use of the canonical tag, referring subsets of the same information to the parent information. Unfortunately, this is just a by-product of the way ecommerce sites are structured.
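The fix described above is a single line in the `<head>` of each child-product page. A minimal sketch, where the product URLs are hypothetical examples, not the commenter's real site:

```html
<!-- On every child/variation page, e.g. /product/blue-shirt-xl -->
<head>
  <!-- Point search engines at the parent product as the canonical version,
       so the near-duplicate variation pages consolidate to one URL. -->
  <link rel="canonical" href="https://www.example.com/product/blue-shirt" />
</head>
```

Note that `rel="canonical"` is a hint rather than a directive, so the parent page should still be the one linked internally and listed in the sitemap.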
It would be better if you added this as text rather than an infographic. Since we don't know English, we read it through translation, but when it's an infographic we can't understand it.
This is awesome, thanks for doing this study! I feel like so many people get scared off by technical SEO because they think it is something for developers or their IT team, when in fact a lot of these things you can take care of yourself easily. I included this in my weekly roundup post as well: https://www.kateneuens.com/writing-prompts-link-building-ebook-technical-seo/. Simple technical SEO fixes can make a big difference; I've seen it in the work I've done with clients.
Hi, do you do websites as well? I am looking for someone who does great ones.
Michelle Kong
Hi, Michelle! I hope someone who left a comment here could help ;)
Elena Terenteva
Many thanks.
Michelle Kong
Hi Michelle,
What are you looking for?
Michelle Kong
Hi Michelle, we at ColorWhistle would be glad to help with your requirement. If it is not too late, can you elaborate on your requirement?
Great post. The SEMrush site audit is a great tool and one for website owners to consider and take note of.
Kristian Stock
Thank you, Kristian, for the nice feedback! We appreciate it.
Thanks for this great post. We usually ignore those points, but they are really important.
Md Rahman
You are welcome! My pleasure. Yes, small and simple things are very easy to overlook.
Thanks for your post; this will save me from some mistakes.
Jonathan Loiselle
My pleasure. Thank you, Jonathan!
Helpful post. It will help beginners like me to increase traffic to their websites.
Thanks for the post.
Saikat Roy Chowdhury
Thank you for your comment, Saikat!
Hi Elena, thanks for the post, we're currently getting an external agency to conduct an SEO audit of our site, it'll be interesting to see how it stacks up against the SEMRush tool.
Chris Leadley
Hi Chris! Thank you :) Best of luck with the audit, may the SEMrush power be with you.
Hey Elena - great post. I find that SEMrush does a really solid job of identifying ways to improve SEO. There are a couple of places where I see gaps: 1) page speed, 2) security. Both of these are becoming more and more important. In fact, Google just sent out a bunch of notices today on moving sites to SSL. I've found a couple of competing tools to be really solid in these areas, and they could give SEMrush some guidance on where to head. For page speed, I use webpagetest and gtmetrix. A recent find that I've found to be excellent is the little-known yellowlab dot tools. On the security side, dareboost has a nice set of tools as well. The SR team should take a gander.
Marston Gould
Hi Marston! Thank you so much for the insights ;) We do have a page speed check - the stats about it are at the bottom of the infographic.
And by the way, speaking of security: we didn't include info about HTTPS implementation in this infographic because we recently did research on it - https://www.semrush.com/blog/https-implementation-mistakes-case-study/
And we are checking websites for:
Non-secure pages with password inputs
Issues with mixed content
Internal links on an HTTPS site leading to HTTP pages
No redirects or canonicals to HTTPS URLs (excluding ones supporting HSTS)
HTTP URLs in sitemap.xml for HTTPS site
Expired SSL certificate
SSL certificate registered to an incorrect domain name
No HSTS support
Old security protocol version
No SNI support

Take a look! And I'll definitely check out the tools you've recommended. Thank you!
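Several of the checks listed above (redirects to HTTPS, HSTS support) come down to a few lines of server configuration. A minimal Apache sketch, assuming mod_rewrite and mod_headers are enabled; the exact directives vary by server and hosting setup:

```apache
# Permanently (301) redirect every HTTP request to its HTTPS equivalent.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Tell browsers to use HTTPS only for the next year (HSTS),
# covering subdomains as well.
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
```

Internal links pointing at HTTP pages and HTTP URLs in sitemap.xml still need to be updated by hand, since a redirect only papers over them.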
Really interesting post. Though most of the errors were kind of obvious, the percentage numbers alongside them were super surprising. Thanks, SEMrush, for sharing this analysis.
Nitin Manchanda
Hi Nitin! Thank you so much for your comment!
Thanks Elena, this is a great post. It would be great to have some more data, for example about hreflang, structured data, or even .htaccess mistakes too! Your infographic is a good SEO checklist for making a quick SEO analysis of a website. Completed with other SEO metrics, it could be perfect!
Hi Tommy! Thank you! We actually did two similar studies, on hreflang - https://www.semrush.com/blog/the-most-common-hreflang-mistakes-infographic/
and HTTPS implementation - https://www.semrush.com/blog/https-implementation-mistakes-case-study/
Check them out!
And yes, we are expanding the number of checks in our tool year by year, so I guess next time we are going to publish something really big!
Great post, love the Infographic too!
Emin Sinanyan
Thank you, Emin!
Great post, Elena.
Surprising to see how many sites are still missing the basics:
404s, missing alt tags, and missing H1 tags.
Paul Lovell
Thank you, Paul! Surprising indeed. Well, I can imagine that if you are working on a huge website, you have more difficult stuff to handle, and the basics might seem insignificant. I don't know.
And another problem with the basics: they require consistency and constant effort, and we are all always in a rush, so that might be the explanation as well.
To me this emphasises the need to get your basics right. Over 80% of internal linking errors occur because you're linking to 404 pages that need to be 301-redirected. Most web pages have too little unique content on them. Many title tags are too long and not unique enough. Lots of pages are missing H1 tags. Get those issues resolved and you're going to do a lot better.
David Bain
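The 301 fix for dead internal link targets mentioned above is usually one line per retired URL in the server configuration. A minimal Apache (mod_alias) sketch with hypothetical paths:

```apache
# Permanently redirect a removed page to its closest live replacement,
# so internal links that still point at the old URL no longer hit a 404
# and any link equity is passed to the new page.
Redirect 301 /old-page /new-page
```

For large sites with patterned URL changes, a single RewriteRule with a regular expression is usually more maintainable than hundreds of individual Redirect lines.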
We all became too smart :)
This is really interesting and gives me relief that maybe my site isn't in such bad shape :) I have been working on cleaning up pages with shallow content and it will be great to go to stakeholders with "93% of sites have this same issue, let's be in that 7%" to use as competitive leverage
Andrew Meininger
Good job! Congrats :) Agreed - being in the lucky 7% is a nice achievement and can definitely motivate the team and make stakeholders happy.
I feel like a key factor is missing that is actually important for technical SEO: .htaccess files.
Darien L
Thank you for your comment! We do understand the importance of .htaccess files, but checking them (and providing info about the frequency of .htaccess errors) would require access to users' file systems, which is not possible. We agree that the mistakes related to redirect chains and loops and broken links, which we highlighted in the research, can also stem from .htaccess configuration errors.
Did you just miss out on checking for "Disallow: /" in production? :D
Hi Boris! It's a good point, but we haven't looked at this yet, because we want to believe that fewer than 0.01% of websites would be affected by this issue :) Anyway, the trend shows that year after year we are expanding the research, adding more checks to the tool, and getting more data. So maybe next year we'll include this one.
Elena Terenteva
They will never admit it !
I have personally experienced a team in an organization putting "Disallow: /" in the robots.txt of every website we owned (there were dozens). They did it in response to what they thought were unfriendly bots. They didn't notify me they were doing it, and I had no reason to suspect anything until I noticed our rankings and traffic start to tank. I was stunned when I found out what had been done, and I moved to fix it immediately. We gained back all our lost ground, but not without a big loss during that quarter. A lot of people learned a lot of lessons, not the least of which is communication.
D Partridge
Wow! That broke my faith in common sense :) It seems like we definitely need to take this error into account next year.
D Partridge
Robotto is your best friend for this. I've been using it for 4+ years; it has saved me a couple of times. :)
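The difference that one character makes in the story above can be demonstrated with Python's standard-library robots.txt parser; a small sketch (the domain is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# "Disallow: /" blocks every URL for the matching user agent.
blocked = RobotFileParser()
blocked.parse(["User-agent: *", "Disallow: /"])
print(blocked.can_fetch("MyBot", "https://example.com/any-page"))  # False

# An empty "Disallow:" blocks nothing at all.
open_site = RobotFileParser()
open_site.parse(["User-agent: *", "Disallow:"])
print(open_site.can_fetch("MyBot", "https://example.com/any-page"))  # True
```

Running a check like this against the live robots.txt after every deploy is a cheap way to catch an accidental site-wide block before rankings tank.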