Robots.txt Generator

Control how crawlers interact with your site and tell AI scrapers to keep away from your content. Fill in the form below to customize your default directives, then toggle the AI crawler blocklist to generate a compliant robots.txt file instantly. Keep in mind that robots.txt is advisory: well-behaved crawlers follow it, but it cannot technically block a bot that ignores it.

User-agent — Usually set to an asterisk (*) to target all crawlers.
Disallow — One path per line. Leave empty to allow all content for the default agent.
Allow — Overrides Disallow rules for specific folders or files.
Crawl-delay — Optional. Asks bots to wait between requests. Not all crawlers honor this directive.
Sitemap — List every sitemap you want bots to discover. One URL per line.
Custom rules — Add any advanced rules or comments you need included verbatim.

Your robots.txt file
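A minimal sketch of the kind of file the fields above produce. The paths, the crawl delay, the sitemap URL, and the single blocklist entry (GPTBot, OpenAI's crawler) are illustrative assumptions, not the generator's actual output:

```
# Default directives
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Crawl-delay: 10

# AI crawler blocklist (example entry)
User-agent: GPTBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```

Directives apply to the nearest preceding User-agent group, while Sitemap lines are independent of any group and may appear anywhere in the file.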