Generate a robots.txt file with crawl rules and a sitemap reference.
User-agent: *
Disallow: /admin/
Disallow: /private/
Generate a robots.txt file for your website. Configure crawl rules for different bots, specify allowed and disallowed paths, add a sitemap URL, and set a crawl delay. Copy or download the result instantly.
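A fuller sketch of the kind of file the generator can produce, covering the options listed above. The bot name, paths, and sitemap URL below are illustrative placeholders, not defaults of any real site:

```
# Default rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /public/

# Stricter rules for one specific bot (placeholder name)
User-agent: ExampleBot
Disallow: /
Crawl-delay: 10

# Sitemap location (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

Note that Crawl-delay is honored by some crawlers (such as Bing and Yandex) but ignored by Googlebot, so it should be treated as a hint rather than a guarantee.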