Mistakes happen to everyone, and this includes SEO teams. When you are dealing with massive websites, development changes, CMS issues, scripts, images, and plug-ins/add-ons while trying to integrate continual SEO and content strategies, there are going to be occasional errors. The good news — mistakes help everyone learn, and this applies to the SEO industry.
So how do we find SEO mistakes?
Technical SEO audits are often used to find errors that negatively impact rankings, conversions, and goals. In last week's SEMrushchat, we discussed how often to perform audits, common SEO mistakes, how to avoid these mistakes, which technical errors to fix first, duplicate content issues, and SEO best practices for title tags and meta descriptions.
We were very lucky to be joined by two brilliant experts in the SEO industry, Joe Hall and Joe Williams. Hall is the founder and SEO consultant of Hall Analysis with over 10 years of experience helping firms boost their online presence. Williams is the founder of TribeSEO and on a mission to make SEO easier for small businesses.
How Often Should You Perform Technical SEO Audits?
The first question we tackled with our experts was how often they carried out technical audits of clients' sites, and why. Both Hall and Williams agreed on yearly audits at a minimum.
Williams succinctly described his approach to technical audits using a car analogy:
Others explained why they perform more frequent technical audits, and other issues we should all keep in mind.
- January: site speed
- February: backlink analysis
- March: internal linking
This approach is generally only useful for large, enterprise sites, but it can be modified for smaller ones.
Everyone's situation, time allowance, and budget are different; it is far easier for a company with a full in-house SEO team to run monthly technical audits than it is for a mom-and-pop business.
If you are struggling with time and resources, you could always use the SEMrush Site Audit tool to stay informed on a continual basis about issues as they arise. Then, every six months to a year, pay an SEO expert to audit everything to ensure your site is up to speed. With all the Google updates and changes, having an expert review your site is essential.
Common SEO Mistakes and How to Avoid Them
Question two was focused on the most serious technical difficulties experienced with client websites, the impact on performance, and how they can be avoided.
For our SEO community, noindex issues cropped up as a common problem, as did sites being completely blocked by robots.txt.
Did I mention that each page had a 12MB picture of his face?
No rankings, no traffic, bupkiss.
Removed the noindex, fetched main site hierarchy in GSC and et voila, became an SEO magician in a matter of days.
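The noindex and robots.txt mishaps described above are easy to screen for programmatically before they cost you months of traffic. As an illustrative sketch (not part of the chat itself, and the function names are our own), here are two quick checks in Python using only the standard library:

```python
# Hedged sketch: two quick checks an audit script might run on a page.
# Function names here are hypothetical, not from any SEMrush tool.
import re
from urllib import robotparser

def has_noindex(html: str) -> bool:
    """True if the page carries a robots meta tag with a noindex directive.

    Rough heuristic: assumes the name attribute appears before content.
    """
    pattern = re.compile(
        r'<meta[^>]*name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
        re.IGNORECASE,
    )
    return bool(pattern.search(html))

def blocked_by_robots(robots_txt: str, url: str, agent: str = "*") -> bool:
    """True if the given robots.txt rules disallow crawling the URL."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(agent, url)
```

For example, `blocked_by_robots("User-agent: *\nDisallow: /", "https://example.com/about")` returns `True`, which is exactly the "site completely blocked by robots.txt" situation mentioned above (example.com is a placeholder).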
There were some other SEO mistake stories that were shared in the chat as well, and there are lessons to learn from each of them:
Which Technical SEO Errors Should You Fix First?
When technical audits are done, there are often a lot of errors to fix, but how do you prioritize which ones to tackle first?
Joe Hall's idea of ranking them as high, medium, or low risk is a great place to start. Joe Williams' mention of indexation would clearly fall into the high-risk category; the same goes for anything that violates Google guidelines or impacts rankings. Anything in the high-risk category should be addressed as quickly as possible.
Other critical areas our participants mentioned were any errors that impact users and conversions. They also explained that they prioritize problems in terms of impact and time taken to fix an issue:
✅ Which ones are most likely to affect traffic & conversions
✅ Which issues affect the most important pages
✅ What fixes will have the biggest impact vs time spent
✅ Time needed - if it will take 6 weeks to fix, start now!
1. Most Impact
2. Ease of Implementation
3. Time to Implement
How to Fix Duplicate Content
Duplicate content is a major SEO mistake that comes up again and again; in a study SEMrush conducted, approximately half of the websites analyzed had this problem. With this in mind, we asked our SEO experts if it was an issue they had faced and what they did to fix it.
The answer from both was a very clear yes.
Here are some duplicate content insights and solutions provided by our community:
What about a duplicate content penalty?
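One common way to surface duplicate content at scale is to hash the normalized body text of every page and group URLs whose hashes collide; the duplicates can then be consolidated with a rel=canonical tag or a 301 redirect. Here is a minimal Python sketch of that idea (the `find_duplicates` helper is hypothetical, not a feature of any particular audit tool):

```python
# Hedged sketch: group URLs whose extracted body text is effectively identical.
import hashlib
from collections import defaultdict

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so trivial differences don't hide duplicates."""
    return " ".join(text.lower().split())

def find_duplicates(pages: dict) -> list:
    """pages maps URL -> extracted body text; returns groups of 2+ matching URLs."""
    groups = defaultdict(list)
    for url, body in pages.items():
        digest = hashlib.sha256(normalize(body).encode("utf-8")).hexdigest()
        groups[digest].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]
```

Exact-hash matching only catches word-for-word duplicates; near-duplicates (boilerplate-heavy templates, tag and category archives) need fuzzier comparison, but this is often enough to find the worst offenders.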
Best Practices for Title Tags and Meta Descriptions
Our last question looked at the best practices for title tags and meta descriptions. In the same SEMrush study we conducted, the results showed that nearly 7 out of 10 sites had issues with missing meta descriptions.
So, we decided to ask our experts if all their site pages had unique title tags and meta descriptions, and what advice they had for generating them.
Joe Hall said, "It is SEO best practices to have unique titles & meta tags. Unique meta tags are probably a little less important, but I do think it helps. For title tags, they are so important that you are going to want to do them by hand...unless you have a specific meta tag strategy, you can automate them by leveraging your CMS. For example, many WordPress devs do this:"
<meta name="description" content="<?php echo wp_strip_all_tags( get_the_excerpt(), true ); ?>" />
Many felt that unique titles and meta descriptions can be unnecessary and inefficient, especially for large websites, but you can still focus on the pages that are important for reaching your goals:
Prioritize writing meta descriptions for the most important pages based on the time you have.
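If you want to triage titles and meta descriptions yourself, a simple script can flag the usual offenders: missing, duplicated, or over-length tags. A hedged sketch follows; the 60- and 155-character limits are common rules of thumb rather than official Google thresholds (Google actually truncates snippets by pixel width, not character count):

```python
# Hedged sketch: flag missing, duplicate, and over-length titles/descriptions.
# The character limits below are widely used rules of thumb, not hard cutoffs.
from collections import Counter

TITLE_MAX = 60
DESCRIPTION_MAX = 155

def audit_meta(pages: list) -> list:
    """pages: list of (url, title, description) tuples; returns issue strings."""
    issues = []
    title_counts = Counter(title for _, title, _ in pages if title)
    for url, title, description in pages:
        if not title:
            issues.append(f"{url}: missing title tag")
        elif len(title) > TITLE_MAX:
            issues.append(f"{url}: title longer than {TITLE_MAX} characters")
        elif title_counts[title] > 1:
            issues.append(f"{url}: duplicate title")
        if not description:
            issues.append(f"{url}: missing meta description")
        elif len(description) > DESCRIPTION_MAX:
            issues.append(f"{url}: description longer than {DESCRIPTION_MAX} characters")
    return issues
```

Run against a crawl export, a report like this makes it easy to apply the prioritization advice above: fix the important pages by hand first, then automate the long tail.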
Here were some additional insights and advice given by our participants on optimizing title tags and meta descriptions:
We would like to take the opportunity to say a big thank you to our SEO experts and our helpful community for participating in last week’s #SEMrushchat and providing their insight on SEO mistakes.
Do you have some handy tips you would like to share on how to avoid SEO errors? We would love to hear them. Let us know in the comment section below. We hope you will join us tomorrow for SEMrushchat!