Robots.txt Generator
Build robots.txt rules, sitemap directives, host lines, and AI bot blocks from one server-side form.
Builder inputs
Choose an environment preset, then layer in sitemap, host, and path rules.
Checks
Review crawler access, sitemap coverage, and host alignment before publishing.
Crawler access
The wildcard group allows crawling unless a path is explicitly disallowed.
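This access check can be reproduced locally with Python's standard-library robots.txt parser; the rules, user agent, and URLs below are illustrative, not the tool's own implementation:

```python
# Check crawler access against robots.txt rules using the
# standard-library parser. A permissive wildcard group with one
# disallowed path (placeholder values) is used for illustration.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/"))        # allowed
print(parser.can_fetch("Googlebot", "https://example.com/admin/"))  # blocked
```

Because `Googlebot` has no dedicated group in these rules, it falls back to the wildcard group, matching the behavior described above.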
Sitemap line
Add a site URL or explicit sitemap URL so robots.txt can point crawlers to your sitemap.
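For illustration, a sitemap directive is a single line with an absolute URL; the domain below is a placeholder:

```text
Sitemap: https://example.com/sitemap.xml
```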
Host directive
No host directive is included.
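Host is a non-standard directive, historically honored mainly by Yandex. If enabled, it would render as a single line like this (domain is a placeholder):

```text
Host: example.com
```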
Crawl-delay
No crawl-delay directive is included.
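Crawl-delay is also non-standard (Googlebot ignores it, though some other crawlers respect it). An enabled delay of, say, 10 seconds would render inside a user-agent group as:

```text
User-agent: *
Crawl-delay: 10
```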
AI crawler group
AI crawlers inherit the wildcard rules because no dedicated block is enabled.
Generated output
Copy the entire file or just the sections you need.
User-agent groups: 1
Directives: 2
Sitemap line: No
User-agent: *
Disallow:

Wildcard group
Default rules for `User-agent: *`
User-agent: *
Disallow:

AI crawler list
Enabling the AI block adds a dedicated group for these user agents.
GPTBot, ChatGPT-User, ClaudeBot, Claude-Web, anthropic-ai, Google-Extended, PerplexityBot, CCBot, Bytespider
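Generated from that list, the dedicated group might look like the sketch below; blocking every path with `Disallow: /` is an assumption, and the rule can be narrowed to specific directories instead:

```text
User-agent: GPTBot
User-agent: ChatGPT-User
User-agent: ClaudeBot
User-agent: Claude-Web
User-agent: anthropic-ai
User-agent: Google-Extended
User-agent: PerplexityBot
User-agent: CCBot
User-agent: Bytespider
Disallow: /
```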