Overview
Generate a ready-to-publish crawl rule file instead of writing one by hand
Use Robots.txt Generator when you need a crawl rule file that is ready to copy.
Launch setup
Create a crawl file for a new or refreshed site.
Crawler rules
Add allow and disallow rules for specific paths.
Supported inputs
Provide clean rule values so the generated file stays valid
- Accepts path rules, host values, sitemap URLs, and crawler blocks.
- The generated file groups the standard robots.txt sections (user-agent blocks, path rules, sitemap line) together.
- Review the generated output before publishing it.
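A complete file built from these inputs might look like the following; the paths and the sitemap URL are placeholders, not values the tool requires:

```
User-agent: *
Disallow: /admin/
Allow: /admin/help/

Sitemap: https://example.com/sitemap.xml
```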
Walk through it
Follow the same sequence you see in the tool
Workflow
Generate robots.txt
Use this flow when you need a complete crawl file.
- Enter the path rules and sitemap data you want included.
- Run the generator to build the file.
- Copy the final robots.txt into your deployment or CMS.
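The steps above can be sketched in a few lines of Python. This is a minimal illustration of the flow, not the tool's actual implementation; the function name, rule values, and sitemap URL are all assumptions:

```python
# Illustrative sketch: assemble a robots.txt from path rules and an
# optional sitemap URL. Names and values here are hypothetical.
def build_robots_txt(user_agent, disallow, allow=(), sitemap=None):
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Disallow: {path}" for path in disallow]
    lines += [f"Allow: {path}" for path in allow]
    if sitemap:
        # The sitemap line is independent of any user-agent block.
        lines += ["", f"Sitemap: {sitemap}"]
    return "\n".join(lines) + "\n"

robots = build_robots_txt(
    "*",
    disallow=["/admin/", "/tmp/"],
    allow=["/admin/help/"],
    sitemap="https://example.com/sitemap.xml",  # placeholder URL
)
print(robots)
```

The returned string is the full file body, ready to copy into a deployment or CMS.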
What you get
Check the result before you copy it into the next step
Robots file
The text block is ready to save as robots.txt at the root of your site.
Sitemap line
The sitemap URL is included when you provide one.
Avoid these mistakes
Small input mistakes create the biggest crawl problems
Blocking the wrong path
Check each disallow rule before you copy the file.
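One way to check rules before publishing is Python's standard-library parser. This is a hedged sketch with illustrative paths and domain; note that this parser applies rules in file order (first match wins), so the more specific Allow line is listed before the broader Disallow:

```python
import urllib.robotparser

# Validate a rule set before publishing: parse the generated lines
# and confirm the paths you care about behave as intended.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Allow: /private/docs/",
    "Disallow: /private/",
])

print(rp.can_fetch("*", "https://example.com/private/"))       # → False
print(rp.can_fetch("*", "https://example.com/private/docs/"))  # → True
print(rp.can_fetch("*", "https://example.com/blog/"))          # → True
```

If an important page comes back `False`, tighten the disallow rule before copying the file.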
Forgetting the sitemap
Add the sitemap line when the file should guide crawlers.
Glossary
Decode the terms before you act on them
This section translates the most technical labels on the page into plain language so you can interpret the output without opening another tab.
User-agent
User-agent identifies which crawler a block of `robots.txt` rules is meant to target.
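For example, a file can carry one block per crawler; the crawler name and path below are illustrative:

```
User-agent: Googlebot
Disallow: /drafts/

User-agent: *
Disallow:
```

Here only Googlebot is kept out of /drafts/, while all other crawlers are unrestricted.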
Disallow
Disallow tells a crawler not to request matching paths from the site.
Allow
Allow is a more specific crawler rule that can reopen matching paths inside a broader blocked area.
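A short illustration, using placeholder paths:

```
User-agent: *
Disallow: /private/
Allow: /private/docs/
```

Everything under /private/ is blocked except the /private/docs/ subtree, which the Allow line reopens.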
Sitemap
A Sitemap line points crawlers to the XML sitemap file that lists URLs you want discovered.