How do I whitelist the Site Audit so it can crawl my site?

For SEMrush to audit your website in Site Audit, our bot needs permission to crawl your pages; the tool gathers its data by crawling the site with a bot. Many websites can be audited without any changes on their end, but some have protections in place that block bots from crawling.
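One common block is a robots.txt rule that disallows bots. If you want to check whether yours is the culprit, you can test it with a short script. This is a minimal sketch using Python's standard library; the example.com URLs and the user-agent names are placeholders, so substitute your own domain and the bot names you care about:

    import urllib.robotparser

    # Point the parser at your site's robots.txt (example.com is a placeholder).
    parser = urllib.robotparser.RobotFileParser()
    parser.set_url("https://example.com/robots.txt")
    parser.read()

    # Ask whether each user agent is allowed to fetch the homepage.
    for agent in ("SemrushBot", "Googlebot"):
        allowed = parser.can_fetch(agent, "https://example.com/")
        print(f"{agent}: {'allowed' if allowed else 'blocked'} by robots.txt")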

If you receive a message saying we were unable to crawl the site, you will need to make some changes to give our crawler access.

First, go into your Site Audit settings and change the user agent to GoogleBot. The crawler will then identify itself with the user agent Google uses to crawl websites, which may work better for the domain you are auditing.
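If you want to see whether the user agent is what the server objects to, you can request a page yourself with different User-Agent headers and compare the responses. This is a minimal sketch using Python's standard library; the URL and user-agent strings are illustrative placeholders, and it assumes the server answers blocked bots with an error status rather than silently dropping the connection:

    import urllib.error
    import urllib.request

    URL = "https://example.com/"  # placeholder: use the domain you are auditing

    # Illustrative user-agent strings: a bot-style UA and a Googlebot-style UA.
    USER_AGENTS = {
        "bot-style": "Mozilla/5.0 (compatible; SemrushBot; +http://www.semrush.com/bot.html)",
        "Googlebot-style": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    }

    for label, ua in USER_AGENTS.items():
        request = urllib.request.Request(URL, headers={"User-Agent": ua})
        try:
            with urllib.request.urlopen(request, timeout=10) as response:
                print(f"{label}: HTTP {response.status}")
        except urllib.error.HTTPError as err:
            # A 403 here usually means the server is rejecting this user agent.
            print(f"{label}: HTTP {err.code}")
        except urllib.error.URLError as err:
            print(f"{label}: connection failed ({err.reason})")

If the Googlebot-style request succeeds where the bot-style one fails, switching the user agent in your Site Audit settings is likely to help.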

If SEMrush is still unable to crawl your site, contact your webmaster and make sure the bot's IP address (46.229.173.67) is whitelisted.

If you need to specify the port, use one of the following (a quick connectivity check is sketched after the list):

Port 80: HTTP

Port 443: HTTPS
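As a quick sanity check that the server accepts connections on those ports, you can try opening a TCP connection to each one. A minimal sketch using Python's standard library, assuming example.com stands in for your own domain; note that this only tests reachability from the machine where you run it, not from the bot's IP address:

    import socket

    HOST = "example.com"  # placeholder: use your own domain

    # Port 80 serves HTTP, port 443 serves HTTPS.
    for port in (80, 443):
        try:
            with socket.create_connection((HOST, port), timeout=5):
                print(f"Port {port}: reachable")
        except OSError as err:
            print(f"Port {port}: unreachable ({err})")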

To learn more, see the Site Audit section of our User Manual.
