Thomas Swisher

How to Break Down Your Competitor's Internal Link Strategy

Every day, thousands of new websites are created. If you want to compete, your on-site SEO needs to be complete. Internal linking is one of the most important, and most often overlooked, components of on-site SEO. It affects everything from user experience to how search engines discover and view your content.

In my article “Why Is Internal Linking Important,” I demonstrated how Wordstream used internal linking to rank one article for over 100 keywords, including the very competitive term “link building.” It is easy to assume the article ranked well because of Domain Authority (DA), but I demonstrated it wasn’t just DA; it was the use of internal linking. “Link building” isn’t very relevant to PPC or the core theme of Wordstream’s website.

Some highlights from my article:

  1. Approximately 24 posts (linking to the main article), with over 30,000 words, written around the keyword “link building.”
  2. 22 of the 24 articles were originally posted between 2009 and 2011.
  3. All posts show a “date modified” in 2015, even though very little, if anything, was changed.
  4. The keyword “link building” is found in most articles’ URL, title tag, and H1.
  5. The link itself usually had “link building” as the anchor text and was found in the first 100 words of the article.

In my study, I showed how Wordstream systematically targeted the keyword “link building.” In this article, I’m going to show you how I reverse engineered Wordstream’s internal link strategy using Screaming Frog and SEMrush.

How to Find an Article’s Internal Links

Screaming Frog is a website crawler that gathers key on-site data, and it is one of my favorite tools. If the website is small, you can use it for free; otherwise, you will need the paid version.

Crawl the Website

Let’s begin by entering the URL of the website we want to crawl. The crawl can take anywhere from a minute to over an hour depending on the size of the site.

Tip: Since we only need the website’s URLs, we can speed up the crawl by unchecking CSS, JavaScript, and SWF, and checking “links outside the folder.” You can find these settings under “Configuration > Spider.” Uncheck whatever isn’t needed.

How to configure Screaming Frog.

Extract Internal Links

Once the crawl is finished, we need to find the URL we are researching. The quickest way is by entering the end of the URL into the search box. I’m searching for “link-building.” 

Once you find the correct URL, click on it, change the filter to “HTML,” and at the bottom click on “Inlinks.” Here you will see all the internal links pointing to the URL you have selected.

How to find internal links.

Next, we need to export them. Right-click on the URL and choose “Export > Inlinks.”

How to export internal links.

Clean up the Internal Link List

Once we have extracted the internal links, we need to clean the list. Depending on the size and design of the website, there could be duplicate URLs caused by things such as taxonomy pages and tracking parameters.

I’m going to upload the file to Google Sheets and use the sort function to put the URLs in alphabetical order. Any duplicates or taxonomy pages will then be grouped together so I can remove them.

Next, I use the find function, Command+F (Mac) or Control+F (Windows), to search for special characters (?) and remove those URLs. Now I’m left with a list of the internal links and the anchor text used to link to the main page.
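
This clean-up can also be scripted. Below is a minimal Python sketch using hypothetical rows in the shape of a Screaming Frog “Inlinks” export (the column names, URLs, and the `/tag/` taxonomy pattern are assumptions; match them to your actual export):

```python
import csv
import io

# Hypothetical sample of a Screaming Frog "Inlinks" export:
# the linking page and the anchor text pointing at the main article.
sample_export = """Source,Anchor
https://example.com/blog/post-a,link building
https://example.com/blog/post-a,link building
https://example.com/tag/link-building,link building
https://example.com/blog/post-b?utm_source=feed,link building
https://example.com/blog/post-c,link building tips
"""

def clean_inlinks(raw_csv):
    """Drop duplicates, parameterized URLs, and taxonomy pages,
    then sort the remaining rows alphabetically by URL."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    seen = set()
    cleaned = []
    for row in rows:
        url = row["Source"]
        if "?" in url or "/tag/" in url:
            continue  # tracking parameter or taxonomy page
        if url in seen:
            continue  # duplicate URL
        seen.add(url)
        cleaned.append(row)
    return sorted(cleaned, key=lambda r: r["Source"])

for row in clean_inlinks(sample_export):
    print(row["Source"], "->", row["Anchor"])
```

This mirrors the spreadsheet steps: sorting groups duplicates and taxonomy pages together, and the “?” check removes URLs with tracking parameters.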

Exported list of internal links.

Study the On-Site Optimization

Once we have a list of the internal links, we need to decide on what on-site factors we want to review.

Here is the list of criteria I focused on in my study.

  • Is the keyword present in the URL, title tag, H1, and anchor text?
  • Is the link the first link on the page?
  • Is the link found in the first 100 words of the page?
  • When was the article/post published, and has it been updated?
  • How long is the article?

I took the Google Sheet that I created and added a row for each of these items, reviewed each article and recorded the results. 
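
Several of these checks can be scripted for a rough first pass. Here is a minimal Python sketch, with a hypothetical linking page and target URL, that records four of the criteria for one page (publish dates and word counts still need a manual or spreadsheet review):

```python
import re

KEYWORD = "link building"
TARGET = "https://example.com/link-building-guide"  # hypothetical main article

# Hypothetical page that links to the target article, for illustration only.
page_url = "https://example.com/blog/link-building-basics"
page_html = """
<html><head><title>Link Building Basics</title></head>
<body><h1>Link Building Basics</h1>
<p>Good <a href="https://example.com/link-building-guide">link building</a>
starts with great content.</p></body></html>
"""

def audit(url, html, keyword, target):
    """Record four of the on-site criteria for one linking page."""
    slug = keyword.replace(" ", "-")
    title = re.search(r"<title>(.*?)</title>", html, re.S)
    h1 = re.search(r"<h1>(.*?)</h1>", html, re.S)
    # Rough first-100-words check: strip tags from everything before the
    # first occurrence of the target URL and count the remaining words.
    before_link = re.sub(r"<[^>]+>", " ", html.split(target)[0])
    return {
        "kw_in_url": slug in url,
        "kw_in_title": bool(title and keyword in title.group(1).lower()),
        "kw_in_h1": bool(h1 and keyword in h1.group(1).lower()),
        "link_in_first_100_words": len(before_link.split()) <= 100,
    }

print(audit(page_url, page_html, KEYWORD, TARGET))
```

Running this over each URL in the cleaned inlinks list fills in most of the worksheet automatically; a proper HTML parser would be more robust than regexes for production use.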

Here is my worksheet along with other criteria that I looked at such as referring domains.

On-site optimization worksheet.

How To Use This For Your Website

How you use this depends on if you have a new website or a site that already has a lot of content.

A New Website

If your site is new, you can use this process to gauge the strength of your competitor’s content. It can also be used to generate content ideas.

First, choose the page you want to target. Run it through the process above to find all of its internal links and how well they are optimized.

Next, you want to do a backlink audit of the main page and all internally linking pages. The audit will help you understand how much work it is going to take to compete with the main page, and it will also give you ideas for securing links once you have created your content.

You can also use the main page to generate content ideas. I’m going to use Wordstream from my example above and SEMrush to generate a keyword list.

A paid SEMrush account will be needed. Just enter the domain. Make sure you are on the Overview dashboard under Domain Analytics. Go to the “Top Organic Keywords” section and export the list.

How to export SEMrush keyword report.

Next, open the spreadsheet in Google Sheets or Excel and filter by the URL of the page you are researching. This will give you the keywords (search terms) SEMrush has associated with the page, along with other information such as position, volume, etc.

This information can be used to build a keyword list based on your competitor’s content. I would suggest looking at the top 5 pages that rank for the keywords you’re targeting.
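
If you prefer to filter the export in code rather than a spreadsheet, here is a minimal Python sketch using hypothetical rows in the shape of a SEMrush organic keywords export (the column names and values are assumptions; match them to your file):

```python
import csv
import io

# Hypothetical rows shaped like a SEMrush "Top Organic Keywords" export.
semrush_csv = """Keyword,Position,Search Volume,URL
link building,4,8100,https://example.com/link-building-guide
link building tools,7,1300,https://example.com/link-building-guide
ppc basics,2,2900,https://example.com/ppc-guide
"""

def keywords_for(raw_csv, page_url):
    """Return only the keyword rows associated with one page."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    return [r for r in rows if r["URL"] == page_url]

for r in keywords_for(semrush_csv, "https://example.com/link-building-guide"):
    print(r["Keyword"], r["Position"], r["Search Volume"])
```

The filtered rows become the seed of your keyword list for that competitor page.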

An Existing Website

If you already have an existing site with plenty of content around the keywords you are targeting, then the process is much different.

Once you have your main keyword, you need to decide what piece of content will be your cornerstone content or create one from scratch. I would begin by using SEMrush to export all your keywords as we did in the image above.

Next, open it in Google Sheets or Excel and see if you already have a page ranking for your keyword. If you already have a page ranking for the keyword you want to target, consider updating that page and using it.

Another way to find content on your site that search engines associate with your keyword is to use the site: search operator, e.g., site:yourdomain.com “your keyword”, in your search bar. This will return the pages on your site associated with your keyword. This is also the process for finding content on your site to internally link to your cornerstone content.


Internal links are among your most important links because you have full control over them: you choose the anchor text and where each link points. These links tell search engines what content you consider relevant to that anchor text, and they help visitors navigate your website.

One final note: as with all SEO techniques, if you abuse internal linking you will diminish its effectiveness and set yourself up for a possible penalty.

Very useful insight, Thomas :) Thanks for the write-up. A well-organized strategy always helps a website gain visibility, traffic, and rankings. I completely agree with your post, and I must say both inbound and outbound links are useful; they can help a webpage gain, or lose, if done the wrong way.
Raviraj Tak
Thanks, Raviraj. I'm glad you enjoyed it.
Thanks for this informative article. Now I get why people follow SEMrush.
Great work.
abrar ansari
Thank you Abrar.
Thanks for sharing this article with us, but I believe both kinds of links matter: outbound links as well as inbound links. Internal links have their own importance, whereas quality external links can increase your website's credibility.
Hey Thomas, the article is good. I learned how to scrape internal links via Screaming Frog. I would still need your help in understanding the internal-link scraping strategy for big websites. It takes very long for Screaming Frog to crawl them even with all the possible exclusions. Is there any way/tool/strategy to automate this or make it easier? I appreciate your help in advance.
Payal Bhalodwala
Hi Payal, I haven't had a site I couldn't use Screaming Frog on, though I haven't worked on any huge sites. For the larger sites I have worked with, I just crawled the subfolders I was interested in. I'm not aware of any tools that will automate this process; I do it manually with spreadsheets.
@thomas great research you've done. Looking forward to learning from your experiences.
Deepak Mathur
Thanks Deepak.
Thomas, great article.
I think I need your help with internal link structure and keyword juice to improve our organic traffic.
Basil Goldman
Hi Basil, glad you enjoyed my article. I would be happy to help. You can reach me at