There are a couple of reasons why pages may be blocked from the Site Audit crawler.
First, pages can be blocked from the Site Audit crawler if your website has a gateway or a members-only area. For example, if you run an ecommerce shop, there are parts of your site that neither SEMrushBot nor Googlebot can crawl, such as payment gateways, user account pages, or anything protected by a login. These pages are blocked from any user agent attempting to crawl the site, so neither SEMrushBot nor Googlebot is permitted to crawl them.
The second reason certain pages may be blocked is your robots.txt file. Inspect your robots.txt to see whether it contains any Disallow directives, which tell crawlers not to access the listed paths.
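If you want to verify which URLs a given Disallow rule actually blocks, Python's standard-library `urllib.robotparser` offers a quick sanity check. The sketch below parses a hypothetical robots.txt (the domain and paths are made up for illustration) and tests two URLs against it:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content blocking checkout and account pages
robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /account/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A path under /account/ is disallowed for all user agents
print(parser.can_fetch("SemrushBot", "https://example.com/account/settings"))  # False

# A path not matched by any Disallow rule is crawlable
print(parser.can_fetch("SemrushBot", "https://example.com/products/shoes"))    # True
```

Running this against a copy of your real robots.txt shows, rule by rule, whether a page the Site Audit report flags as blocked is in fact covered by a Disallow directive.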