Generate robots.txt files for your website. Control search engine crawling with allow/disallow rules and sitemap references. Free SEO tool.
Select which search engine crawlers (user agents) these rules apply to. Use * to match all robots.
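For example, a rule group that applies only to Google's crawler starts with its user-agent name:

    User-agent: Googlebot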
Paths that search engines should NOT crawl. Leave empty to allow all paths.
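For example, to keep all crawlers out of an admin area and a private directory (the paths here are illustrative):

    User-agent: *
    Disallow: /admin/
    Disallow: /private/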
Specific paths to allow even if a parent path is disallowed. Most crawlers follow the most specific (longest) matching rule, so an Allow rule can override a broader Disallow.
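For example, to block a directory while still permitting one page inside it (paths are illustrative; the longer Allow rule wins for that URL):

    User-agent: *
    Disallow: /private/
    Allow: /private/public-page.html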
Full URLs to your XML sitemaps. This helps search engines find all your pages.
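For example (the domain is a placeholder):

    Sitemap: https://example.com/sitemap.xml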
The number of seconds crawlers should wait between requests. Not all crawlers honor this directive; Googlebot, for instance, ignores Crawl-delay.
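For example, to ask supporting crawlers to wait ten seconds between requests:

    User-agent: *
    Crawl-delay: 10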