Why were some of my pages blocked from the Site Audit crawler?


There are two main reasons why pages may be blocked from the Site Audit crawler.

First, pages can be blocked from the Site Audit crawler if they sit behind a gateway or in a members-only area of your website. For example, if you run an ecommerce shop, there are certain parts of your website that SEMrushBot or GoogleBot likely cannot crawl, such as payment gateways, user account pages, or anything protected by a login. These pages are blocked from any user agent attempting to crawl the site, so neither SEMrushBot nor GoogleBot would be permitted to crawl them.

The second reason certain pages may be blocked is your robots.txt file. Inspect your robots.txt to see whether it contains any Disallow directives.
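For example, a robots.txt file like the one below (the directory paths here are hypothetical, shown only to illustrate the syntax) tells every compliant crawler, including SEMrushBot, not to crawl the listed directories:

    User-agent: *
    Disallow: /checkout/
    Disallow: /account/

Any page whose URL falls under a disallowed path for the matching user agent will be skipped by the Site Audit crawler.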

For more information about SEMrushBot, go to https://www.semrush.com/bot/, and for more information about bots in general, refer to http://www.robotstxt.org/.
