You already know the Site Audit tool is an effective way to check for broken links and images, missing and duplicate tags, error pages, and much more.
But did you know that the Site Audit tool now also checks for issues with robots.txt files?
Does your website have a robots.txt file? Is it in the correct format? The Site Audit tool will tell you for sure.
SEMrush goes straight to the root of the domain to check for a robots.txt file. If no such file is found, you will receive a notice reading “Robots.txt not found” (1).
If your website doesn’t have a robots.txt file, you should consider adding one in order to control what search engine robots can see on your website. By listing rules in a robots.txt file, you can forbid malicious bots from accessing your website; block robots from crawling folders and files that contain sensitive information or aren’t ready for public view, keeping them out of the SERP (2); prevent your website from being overloaded by specifying a crawl delay (3); or even avoid duplicate content issues.
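As an illustration, here is a minimal robots.txt sketch covering the kinds of rules described above. The bot name and paths are hypothetical examples, not recommendations for any particular site:

```
# Block one specific (hypothetical) bot from the entire site
User-agent: BadBot
Disallow: /

# Rules for all other crawlers
User-agent: *
# Keep a private folder out of crawlers' reach
Disallow: /private/
# Ask crawlers to wait 10 seconds between requests
# (note: Crawl-delay is a non-standard directive; Google ignores it,
# though some other crawlers honor it)
Crawl-delay: 10
```

Keep in mind that robots.txt is advisory: well-behaved crawlers follow it, but it is not an access-control mechanism, so truly sensitive content should be protected by other means.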
If your website does have a robots.txt file, SEMrush will determine whether it conforms to the standard format. If it doesn’t, you will receive an error message reading “Robots.txt file has format errors” (4). In this case, you should revise and correct the file to ensure that your site’s robots.txt directives are in the right format.
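One common format error, shown here as a hypothetical before/after snippet, is a rule that doesn’t belong to any user-agent group:

```
# Incorrect: a Disallow rule appears before any User-agent line,
# so crawlers can't tell which bot it applies to
Disallow: /private/

# Correct: every rule group starts with a User-agent line
User-agent: *
Disallow: /private/
```

Misspelled directive names (for example, “Dissalow”) are another frequent cause of format errors, since crawlers silently ignore directives they don’t recognize.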
Moreover, the Site Audit feature will inform you if your robots.txt file doesn’t point to a sitemap.xml file on your website. If you get a warning reading “No link to sitemap.xml found in robots.txt”, you should add a line indicating your XML Sitemap’s location, in line with SEO best practices (5).
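The sitemap reference is a single line anywhere in the file. In this sketch, the domain is a placeholder; the Sitemap directive should use the absolute URL of your own sitemap:

```
User-agent: *
Disallow: /private/

# Point crawlers at the XML Sitemap (absolute URL, hypothetical domain)
Sitemap: https://www.example.com/sitemap.xml
```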