Getting your website ranking on Google is hard enough. Improving your position on the SERPs can be more difficult, especially if you’re in a competitive space or are a very niche business.
However, if you have noticed that your site rankings are stagnant, we’re here to help. We’ll answer some of the questions you may have and provide tips along the way. Read on to learn more about the reasons your website rankings are being held back (and what you can do about it).
1. Poor Content Quality
Content quality is critical to your website rankings. You won’t rank on Google, or other search engines, without quality content that users need. If your content isn’t solving users’ problems, answering their questions, or providing any value, you’re just creating content for the sake of it.
Creating thin content or content that doesn’t fully satisfy what a user is searching for is a surefire way to see your website rankings remain low. If you’re going to provide your expertise on a subject, you need to show both users and search engines that you are an authority on the subject.
To create quality content, you need to analyze what others in the space (and on the SERPs) are writing. If you’re a local financial institution that offers subprime car loans and you’re looking to target the term “what are subprime car loans,” you need to tackle the research and content creation process in a logical way.
Content and Keyword Focus
We’re starting a little mid-way through the process here, so let’s take a step back for a moment. Your business may handle more than one area of lending, such as home financing. You wouldn’t create a landing page that is extremely general and about home and auto lending; your homepage would likely mention you do both, but your landing pages (or specified blog posts) would be focused on a single topic.
As mentioned above, you’re specifically trying to focus on “subprime car loans.” That means you’ve identified the keyphrase and are ready to move into how to build out the landing page or blog post you need.
PRO TIP: Already have a landing page and want to know its rankings on the SERP? You can use Semrush’s Position Tracking Tool. It’s simple to add keywords to track, and you can watch how it performs on the SERPs.
If you’re looking to improve your website ranking for the term “subprime car loans,” you need to see what other businesses or organizations are ranking on the SERP; these are your SERP competitors.
While we’re at it, let’s take a look at the SERP competition for this keyword through our Keyword Overview Tool and see which websites are ranking:
By doing this, we can see the websites ranking on Google for this particular query. You can take this research further by looking at the content each of these sites has created and developing your own content framework.
You can also run your chosen query through Semrush’s Topic Research Tool to see what headlines and questions are commonly asked. This may give you additional ideas for questions to cover and headers to use as you structure your content:
See what common questions are being asked and see what pain points each site has covered. Then, determine how you can answer these questions and why your business is the one that a user should turn to.
You don’t want to have too many pages about the same topics, as that can lead to keyword cannibalization, and you’ll end up with a confused Googlebot that has trouble determining which pages should be presented on a SERP.
Determining topics and target keywords for landing pages is the best way to structure your content (and it will improve your overall site structure, too).
2. Unclear Search Intent
Search intent can be defined as the purpose of a user’s search. That is a general definition, but it can be broken down into four types of search intent:
- Informational intent: Users want to find more information on a specific topic, product, or industry. Example: “best coffee machines”
- Navigational intent: Users intend to visit a specific site or page. Example: “Nespresso coffee machine types”
- Commercial intent: Users consider a purchase and want to investigate their options. Example: “coffee machines comparison”
- Transactional intent: Users aim to purchase a product/service. Example: “buy a new Nespresso coffee machine”
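As a rough illustration of how these modifiers map to the four intent types, here is a toy, keyword-based classifier. The modifier and brand lists are illustrative assumptions only, not an official taxonomy; in practice you should study the live SERP for each query:

```python
# Toy search-intent classifier matching common query modifiers to the
# four intent types above. The modifier and brand lists are illustrative
# assumptions only -- a real workflow would analyze the live SERP.
INTENT_MODIFIERS = {
    "transactional": {"buy", "order", "coupon", "price"},
    "commercial": {"comparison", "review", "vs"},
    "informational": {"what", "how", "why", "best", "guide", "tips"},
}
BRANDS = {"nespresso"}  # navigational queries usually name a brand or site

def classify_intent(query: str) -> str:
    words = set(query.lower().split())
    if words & INTENT_MODIFIERS["transactional"]:
        return "transactional"
    if words & INTENT_MODIFIERS["commercial"]:
        return "commercial"
    if words & BRANDS and not words & INTENT_MODIFIERS["informational"]:
        return "navigational"
    return "informational"  # default when no stronger signal matches
```

Running the four example queries from the list above through this sketch returns their listed intent types, which is all a heuristic like this is good for: a first-pass grouping before you verify against the SERP.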
If your business is a garden supply store, your site needs to provide content that users are searching for. For example, those searching for “garden supplies” or “garden supplies near me” could be simply looking for a store that sells those types of products.
Your site should have the right content and optimization for such terms while providing other valuable information. If the query moves to a more focused term such as “digging spade,” and you sell the product, you should have content that reflects your customers’ needs.
However, if you’re a gardening blog and you’re offering “best gardening tips” or “best plant care tips,” your strategy may be different. You would be identifying the informational intent a user is searching for. They want to find out different techniques or the best ways to plant tomatoes.
When it comes to ranking websites on Google, the search engine has repeatedly made minor adjustments (or more considerable algorithmic changes) to provide users with relevant content. In a Google Webmaster Hangout in 2020, Google’s John Mueller had this to say about analyzing SERPs:
This is something we do all the time as well. We do a/b tests in the search results all the time to see how can we make sure that we continue to provide relevant results, even when users’ needs and expectations continue to change over time.
So, by nailing down what your competitors are targeting, determining how you can approach the same query, and ensuring you’re providing content that both users and search engines are looking for, you can better determine the search intent of a query.
You can check out this piece that explains what search intent is and how it works, too.
3. Poor Quality Backlinks
Backlinks are the backbone of successful content. When you write high-quality content, you want backlinks as votes of confidence from other sources online.
Whether you’re a local business partnering with a non-profit to earn reputable backlinks or you’ve written a how-to article that other relevant businesses link to, no one can deny the importance of (quality!) backlinks. Even back in 2016, Google confirmed that backlinks are among the three main factors for website rankings.
To take a look at your site’s backlink portfolio, you can use Semrush’s recently updated Backlink Analytics Tool for a quick view. As an example, let’s take a look at the backlinks that Target is getting:
From the main overview here, we can go into the backlinks report for more information:
PRO TIP: For a more accurate measurement of your backlinks, you can integrate your Google Search Console account with Semrush’s Backlink Audit Tool.
You can run your competitors’ websites through either tool to get a view of their backlink portfolios as well. It’s a good idea to take a look for a few reasons:
- You can see what sites are linking to them. Get an understanding of what types of sites to target and what authoritative domains they’re getting links from.
- You can see backlink types (such as text or images), research specific URLs, view data on new and lost links, and much more.
- When it comes to lost links, you can create a broken link building strategy to try to acquire some of them for your website.
Are you looking for a quick comparison between your domain and a competitor? Backlink Gap can help. Let’s take a look at Apple versus Samsung:
It’s important to know what your competitors are doing for backlinks and whether there are any (there probably are) that you should try to “take” from them.
Having a good backlink portfolio from reputable sources will only benefit your site in the long run; it will be viewed as more trustworthy and help you in your quest to improve your website rankings.
4. Underdeveloped On-Page Optimization
As we’ve discussed before, on-page SEO is key to making your site easily discoverable. You need to ensure that your page(s) are properly optimized for a target keyword, provide valuable information to users, and have fast loading speeds.
At a surface level, you need to ensure that your website and pages are optimized. You won’t be thrilled with the results of your website ranking on Google if your site is under-optimized for what your business does.
For keyword optimization, here are a few key spots on every page that you need to pay attention to:
- Meta title. If you want your meta title to fit within the SERP and not be cut off, it’s generally about 50-70 characters. However, Google’s Gary Illyes has said that there’s no limit on meta title length and that such recommendations are externally created. So, write what you need to ensure Google understands your target keyword and what the page is about.
- Meta description. Similarly, there is no “magic” number here, though most SERPs truncate meta descriptions at about 150-160 characters. That’s a good range to aim for while including a call-to-action for the user.
- Headers (H1, H2, H3, etc.). Your header tags are generally similar lengths to your title tags. Be sure to make them flow in a structured way while including target keywords as you work your way down the page.
- Content. Your content needs to be well-optimized for users and search engines. You need to stick to a primary keyword with a few secondary keywords that support your target. You need to ensure you have matching intent, as we just discussed above. If you understand what the SERPs are showing for a particular keyword or query, then you can better create content that helps users find what they are looking for!
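As a quick sanity check on the first two items above, a few lines of Python can flag tags likely to be truncated. The thresholds below are the rough character counts from this section, not hard limits imposed by Google:

```python
# Flag meta tags likely to be truncated on the SERP. The thresholds are
# rough, commonly cited approximations -- not hard limits from Google.
TITLE_LIMIT = 60         # titles generally display safely at 50-70 chars
DESCRIPTION_LIMIT = 155  # descriptions truncate around 150-160 chars

def check_meta(title: str, description: str) -> list[str]:
    """Return a list of possible truncation issues for a page's meta tags."""
    issues = []
    if len(title) > TITLE_LIMIT:
        issues.append(f"title may be truncated ({len(title)} chars)")
    if len(description) > DESCRIPTION_LIMIT:
        issues.append(f"description may be truncated ({len(description)} chars)")
    return issues
```

An empty list means both tags are within the typical display range; anything returned is worth reviewing, not an automatic failure.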
5. Poor Technical SEO
Kevin Indig discussed technical SEO on our blog before, and it’s definitely an essential piece of the puzzle when it comes to your website rankings on Google. If your website is not technically sound, you could run into issues. Whether it’s a crawlability problem or missing vital elements, your representation on the SERP could be impacted.
Below, we’ll tackle a few key aspects of technical SEO that need to be running smoothly for your site to be properly read and understood by search engines:
In our guide on building an SEO-friendly website architecture, we go over some of the main ways to create a well-organized website. An essential part of technical SEO is that your site is easily understood by Google crawlers; this makes it easier to rank your website on Google (given that you provide top-notch content, along with some of the other tips we’ve mentioned).
Here is an example of a simple website architecture:
By properly grouping different areas from the home page, your site is better for users and search engines. Here is an example of URLs that have a clear path:
- Homepage: https://www.examplesite.com/
- Subdirectory/Subfolder: https://www.examplesite.com/blog/
- Filename: https://www.examplesite.com/blog/blog-post-1/
While this is a simple example, following a solid, consistent structure helps users navigate your website more easily and helps crawlers better understand your site’s structure for ranking on SERPs.
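If you generate page URLs programmatically, a small slug helper keeps paths in that clean, consistent shape. The function names and the domain below are hypothetical, shown only to illustrate the structure above:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a clean, lowercase URL slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

def blog_post_url(title: str) -> str:
    # Follows the homepage -> /blog/ -> /blog/post-slug/ structure above
    return f"https://www.examplesite.com/blog/{slugify(title)}/"
```

For example, `blog_post_url("Blog Post 1")` produces the same `/blog/blog-post-1/` path shown in the list above.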
Your internal linking strategy is also a good reinforcement of your website structure. You can link pages to each other to, yet again, reinforce your site’s structure for search engines and point users to valuable pages.
You can internally link using several different strategies, including:
- Contextual internal links
- Navigation links
PRO TIP: The Site Audit Tool also provides you with insights into your site’s crawlability, internal linking, and many other technical aspects that could be hindering your website’s rankings:
A crucial part of your technical SEO would be having your XML sitemap up and running correctly. Google’s Gary Illyes has said as recently as 2019 that XML sitemaps are the second most important source of URLs to be crawled by Googlebot.
An XML sitemap tells a search engine about URLs on a website that can be crawled. So, you need to ensure your XML sitemap is set up correctly; otherwise, it can negatively impact your website rankings on Google.
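As a sketch of what “done correctly” means structurally, here is a minimal, valid XML sitemap generated with Python’s standard library. The URLs are the example paths from earlier in this section, not real pages:

```python
import xml.etree.ElementTree as ET

# Official sitemap protocol namespace
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls: list[str]) -> str:
    """Build a minimal XML sitemap listing the given crawlable URLs."""
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for loc in urls:
        url_el = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc").text = loc
    return ET.tostring(urlset, encoding="unicode")
```

Most CMS platforms and sitemap generators produce this file for you; the point of the sketch is simply that each crawlable URL gets its own `<url><loc>…</loc></url>` entry inside a namespaced `<urlset>`.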
Kevin Indig's post on our blog about XML sitemaps (and some of his favorite XML sitemap generators) is a great resource for more information.
Your robots.txt file is a way to inform site crawlers exactly which pages they should ignore. Google defines robots.txt files as:
A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google.
Here is an example of what a robots.txt file looks like:
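A minimal file, with illustrative paths, might be:

```txt
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://www.examplesite.com/sitemap.xml
```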
As stated, there are ways in which it may not keep a web page out of Google. For example, if another website links to a page you added to your robots.txt file, Googlebot may crawl the URL because it’s been discovered in another way.
It’s essential to check your robots.txt file to ensure that it’s not blocking important pages. Sometimes, if you notice indexation discrepancies in Search Console, it’s because of a robots.txt issue. Greg Gifford explains in our video below:
PRO TIP: Many people incorrectly assume that a robots.txt file can be used to keep select pages out of Google’s index; as stated above, that’s not what it’s for. Your robots.txt file is primarily a way to ensure your server can handle requests from Googlebot. If you’re looking to prevent a page from being indexed or remove it from the index, check out this guide from Google.
Page Speed and Loading Time
With the recent Page Experience Signal update, Google has continued to move toward rewarding responsive websites and web pages. While much of the advice above has remained the same in terms of content quality, SEO, and search intent, we now have clearer guidance from Google on how sites should be optimized for speed.
Google’s PageSpeed Insights Tool offers a significant bit of, well, insight into your site speed. You can run any URL and see both mobile and desktop performance. Let’s run Apple’s iPhone 12 landing page as an example:
Now, let’s take a look at desktop:
Should you panic about your website rankings on Google, then? No. Danny Sullivan gave a clearer idea of what Core Web Vitals and the Page Experience Signal update are all about:
It’s essential to have your site running at its best, but if your pages “fail” the assessment like Apple’s landing page did above, don’t fret. Just take the steps and recommendations given to try to improve your page speed.
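If you’d rather re-run these checks programmatically as you make improvements, PageSpeed Insights also has a public API (v5). Here is a small sketch of building a request URL with the standard library; the page URL is a placeholder, and a real integration would add an API key and fetch the JSON response:

```python
from urllib.parse import urlencode

# PageSpeed Insights API v5 endpoint (public; an API key is optional for
# light usage but recommended for anything automated)
PSI_API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Build a PageSpeed Insights API request URL for a page."""
    return f"{PSI_API}?{urlencode({'url': page_url, 'strategy': strategy})}"
```

Switching `strategy` between `"mobile"` and `"desktop"` mirrors the two report tabs shown in the tool itself.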
These are just a few areas of technical SEO that you need to consider when taking a look at your website. Ensuring your site can be read and interpreted correctly only improves the chances of your website ranking on Google.
As for a more comprehensive approach to improving your technical SEO, be sure to read AJ Ghergich’s 15-step technical SEO audit.
PRO TIP: Also, for a fully comprehensive technical audit, Semrush’s Site Audit Tool provides over 120 checks from surface-level issues to more intricate technical issues. Plus, our Site Audit Tool offers a Core Web Vitals report. Check out our Core Web Vitals report blog post for the lowdown.
Improve Your Site & Track Your Website’s Ranking on Google
When it comes to your website rankings, there are many different reasons why they could be stagnant or not as high as you’d like them to be. With this guide and some of the other resources provided, we hope you can get started on making improvements to your website so it can perform better on SERPs.
Be sure to take advantage of the tools mentioned above; they’ll help you find and fix some of the problems your site may be facing, and even check your website rankings on Google. Good luck!