The SEMrush team would like to thank our users for their feedback on the Site Audit Tool. You are helping us a lot to make it better! Here is a summary of how we responded to SEMrush users’ recommendations and suggestions.
• Similar to robots.txt, you can control what the Site Audit tool will crawl. Use the Allow or Disallow (1) sections in the Site Audit settings to specify which directories should or shouldn’t be crawled.
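For comparison, this is how the same kind of rule looks in a standard robots.txt file (the directory paths below are hypothetical examples, not SEMrush settings):

```
# Example robots.txt: allow one subdirectory while blocking its parent
User-agent: *
Disallow: /private/
Allow: /private/public-reports/
```

The Site Audit Allow/Disallow settings follow the same idea of matching URL paths by prefix, but are configured in the tool’s interface rather than in a file on your server.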
• Now you can export the Site Audit data in CSV or XLS format. Depending on what you decide to export from the Overview Report (2), you will get the list of errors, warnings, or notifications.
The Detailed Report (3) will show you the URLs of webpages where we’ve detected the specified issue.
• In the Detailed Report, you can filter data for issues such as broken internal links, broken external links, and broken images, either by the URL of the checked page or by the URL of the external or internal link or image (4).
• Next to each error-page URL in the reports for the "HTTP 4xx client errors" and "HTTP 5xx server errors" issues, you will find a View broken links link (5). Clicking it shows which webpages point to the page returning an error status (6).
• Links pointing to Google+ are no longer considered broken.
• The presence of an <h1> tag is detected correctly.
• You can choose how many results to display per page in the Detailed Report.
• All external links in reports now open in a new tab or window.
• All scheduled campaigns now start as scheduled.
Did we miss anything important? Let us know what you think we should add or change at [email protected]