Generate a robots.txt file with crawl rules and sitemap.
Generate a robots.txt file for your website. Configure crawl rules for different bots, specify allowed and disallowed paths, add your sitemap URL, and set a crawl delay. Copy or download instantly.
Create a custom robots.txt file to manage how search engine bots crawl and index your website content.
Access the Robots.txt Generator tool
Open the tool interface to begin configuring your website's crawl instructions and bot permissions.
Configure crawl rules for specific bots
Select different user-agents like Googlebot and define which paths are allowed or disallowed across your site's directories.
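For example, a rule set like the following lets Googlebot crawl everything except a private directory while keeping all other bots out of an internal search path; the paths shown are illustrative placeholders, not defaults produced by the tool:

User-agent: Googlebot
Disallow: /private/
Allow: /

User-agent: *
Disallow: /search/

Note that a crawler follows the most specific user-agent group that matches it, so Googlebot here obeys only its own block.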
Add your XML sitemap URL
Enter the full URL of your sitemap and set an optional crawl delay to manage server load during indexing.
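A sketch of how these appear in the file: the crawl delay sits inside a user-agent group, while the sitemap reference can go anywhere. The URL below is a placeholder for your own sitemap location, and not every crawler honors Crawl-delay (Googlebot ignores it).

User-agent: *
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml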
Generate the robots.txt file code
Review your settings and click the generate button to produce the file in the standard robots.txt text format.
Copy or download the final output
Instantly copy the generated code to your clipboard or download the file to upload it to your website's root directory.
Estimated time: 2 minutes
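A minimal example of the generated output, blocking all crawlers from two illustrative paths: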
User-agent: *
Disallow: /admin/
Disallow: /private/