
How SemrushBot Crawls Your Site

A bot, also known as a web robot, web spider, or web crawler, is a software application designed to perform simple, repetitive tasks automatically, faster and more consistently than a human could. The most common use of bots is web crawling, also called web spidering.

SemrushBot is the search bot software that Semrush sends out to discover and collect new and updated web data. A crawl starts with a list of webpage URLs. When SemrushBot visits these URLs, it saves the hyperlinks it finds on each page for further crawling. This list, also known as the "crawl frontier", is revisited repeatedly according to a set of Semrush policies to effectively map a site for updates: content changes, new pages, and dead links.
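To make the idea of a crawl frontier concrete, below is a minimal sketch of a breadth-first crawler written with only the Python standard library. It illustrates the general technique only and is not SemrushBot's actual implementation; the seed URL, the page limit, and the same-site rule are assumptions made for the example.

# Minimal crawl-frontier sketch (illustrative only, not SemrushBot's code).
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl starting from a seed URL.

    The frontier is a queue of URLs still to be visited; every hyperlink
    found on a fetched page is added to it, so the crawl gradually maps
    the site.
    """
    frontier = deque([seed_url])   # the "crawl frontier"
    visited = set()

    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)

        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # a real crawler would record dead links here

        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            # Stay on the same site and skip pages already seen.
            if (urlparse(absolute).netloc == urlparse(seed_url).netloc
                    and absolute not in visited):
                frontier.append(absolute)

    return visited


if __name__ == "__main__":
    # Hypothetical seed URL; replace with a site you are permitted to crawl.
    print(crawl("https://example.com"))

A production crawler such as SemrushBot also applies politeness rules (crawl-delay, robots.txt) and scheduling policies that decide how often each URL in the frontier is revisited; those details are omitted from this sketch.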

Learn more about SemrushBot.
