Robots.txt Generator

Build robots.txt rules, sitemap directives, host lines, and AI bot blocks from one server-side form.

Builder inputs

Choose an environment preset, then layer in sitemap, host, and path rules.

Environment preset: Allow crawling by default, then add targeted rules and sitemap lines.

Site URL: Used to derive the sitemap URL and host directive when those fields are left blank.

Sitemap URL: Leave this blank to derive `/sitemap.xml` from the site URL.
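For example, with a hypothetical site URL of `https://example.com` and the sitemap field left blank, the derived line would look like:

Sitemap: https://example.com/sitemap.xml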

Host: Use a plain hostname, not a full URL.
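For instance, entering `example.com` (rather than `https://example.com/`) would yield a directive like:

Host: example.com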

Crawl-delay: Optional. Some crawlers ignore this directive, but others still respect it.
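As a sketch, a hypothetical delay of 10 seconds added to the wildcard group would look like this (the exact group it attaches to depends on your other settings):

User-agent: *
Crawl-delay: 10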

Disallow paths: Add one path per line. Entries are normalized to start with `/`.

Use this to keep sensitive or low-value sections out of crawler traffic without hand-editing the raw file.
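As an illustration, hypothetical entries such as `admin` and `/tmp/` would be normalized and emitted as:

Disallow: /admin
Disallow: /tmp/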

Block AI crawlers: Adds a dedicated `Disallow: /` group for common AI training and assistant bots.

Checks

Review crawler access, sitemap coverage, and host alignment before publishing.

Crawler access: The wildcard group allows crawling unless a path is explicitly disallowed.
Sitemap line: Add a site URL or an explicit sitemap URL so robots.txt can point crawlers to your sitemap.
Host directive: No host directive is included.
Crawl-delay: No crawl-delay directive is included.
AI crawler group: AI crawlers inherit the wildcard rules because no dedicated block is enabled.

Generated output

Copy the entire file or just the sections you need.

User-agent groups: 1
Directives: 2
Sitemap line: No

User-agent: *
Disallow:

Wildcard group

Default rules for `User-agent: *`

User-agent: *
Disallow:

AI crawler list

Enabling the AI block adds a dedicated group for these user agents.

GPTBot, ChatGPT-User, ClaudeBot, Claude-Web, anthropic-ai, Google-Extended, PerplexityBot, CCBot, Bytespider
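As a sketch of what the dedicated block might look like for a few of these agents (the exact grouping the generator emits may differ), a single group can list several user agents before one shared rule:

User-agent: GPTBot
User-agent: ChatGPT-User
User-agent: ClaudeBot
Disallow: /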