When building your site, you may place a robots.txt file at the root of your site to tell web crawlers which parts of it they may crawl. Similarly, a web page may contain noindex and nofollow rules that give further instructions to the crawlers visiting your site. These tools help you control how your site appears in search engines.
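For example, a robots.txt file placed at the root of your domain might keep crawlers out of one section of your site, while an individual page can opt out of indexing with a robots meta tag. The domain and paths below are purely illustrative:

    # https://www.example.com/robots.txt
    User-agent: *
    Disallow: /private/

    <!-- In the <head> of an individual page -->
    <meta name="robots" content="noindex, nofollow">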
Like a search engine, Silktide crawls your site to find every publicly available page it can. Unlike a search engine, though, the goal of this crawl is to help you find issues across every page of your site: if a page contains spelling mistakes or accessibility issues, we want you to know about it.
There are still many ways to control what Silktide scans on your website. Website settings let you declare which pages we should and shouldn't test. Robots.txt files are also commonly used to stop a site being overwhelmed by bots, and we likewise provide tools to adjust the speed of Silktide's crawl of your website.
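As a rough sketch, many bots recognise a Crawl-delay directive in robots.txt, which asks them to pause between requests. Support for it varies from crawler to crawler, so for Silktide itself the crawl-speed settings mentioned above are the more reliable option:

    # https://www.example.com/robots.txt
    User-agent: *
    Crawl-delay: 10   # ask bots to wait roughly 10 seconds between requests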