Overview
See exactly which crawler rules change before they ship
Use Robots.txt Diff when you want to compare crawler-rule changes before a deployment.
Robots updates
Compare a proposed robots.txt change against the current file.
Launch QA
Review crawler-rule changes before a site goes live.
Live import
Pull a public robots.txt file into either side and compare the normalized result.
Supported inputs
Bring clean source text and keep the baseline and candidate sides straight
- Accepts two pasted robots.txt snapshots for directive diffing.
- Accepts a public site URL or a direct robots.txt URL in each side's import helper.
- Normalized lines ignore comments and whitespace-only noise.
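The normalization described above can be sketched in a few lines. This is an illustrative approximation, not the tool's actual implementation: it drops everything from `#` onward and skips lines that are empty after trimming.

```python
def normalize(robots_text: str) -> list[str]:
    """Drop comments and whitespace-only noise from robots.txt text.

    Illustrative sketch only; the tool's real normalization may differ.
    """
    lines = []
    for raw in robots_text.splitlines():
        # Everything from '#' onward is a comment crawlers ignore.
        line = raw.split("#", 1)[0].strip()
        if line:  # skip lines that are empty after stripping
            lines.append(line)
    return lines

print(normalize("# note\nUser-agent: *\n\nDisallow: /tmp/  # temp"))
# → ['User-agent: *', 'Disallow: /tmp/']
```

With this kind of normalization, a comment edit or a blank-line change produces no diff at all, which is why the tool can focus the comparison on directives.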
Walk through it
Follow the same sequence you see in the tool
Workflow
Compare pasted snapshots
Use this flow when you already have both robots.txt versions copied locally.
- Paste the baseline robots.txt snapshot on the left.
- Paste the candidate robots.txt snapshot on the right.
- Review the normalized directive diff and copy the patch if needed.
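Conceptually, the patch produced by this flow is a unified diff of the two normalized snapshots. A minimal sketch using Python's standard `difflib`, with hypothetical already-normalized snapshots:

```python
import difflib

# Hypothetical normalized snapshots; normalization itself is out of scope here.
baseline = ["User-agent: *", "Disallow: /tmp/"]
candidate = ["User-agent: *", "Disallow: /tmp/", "Disallow: /drafts/"]

patch = difflib.unified_diff(
    baseline, candidate,
    fromfile="current", tofile="proposed", lineterm="",
)
print("\n".join(patch))
```

Added directives show up as `+` lines and removed ones as `-` lines, which is the shape of the copy-ready patch described below.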
Workflow
Import a live robots.txt file
Use this flow when you want to fill one side from a live URL before comparing.
- Enter a public site URL or direct robots.txt URL in the import helper above the textarea.
- Fetch the robots.txt content into the left or right side.
- Compare the two snapshots after the import completes.
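The import step amounts to resolving the URL you typed to a robots.txt location and fetching it. A rough sketch with Python's standard `urllib` (the tool's server-side helper may behave differently, e.g. with redirects, caching, or stricter validation):

```python
import urllib.request

def robots_url(site: str) -> str:
    # Accept either a site URL or a direct robots.txt URL.
    if site.endswith("/robots.txt"):
        return site
    return site.rstrip("/") + "/robots.txt"

def fetch_robots(site: str, timeout: float = 10.0) -> str:
    # Fetch the file; robots.txt must be publicly reachable for this to work.
    with urllib.request.urlopen(robots_url(site), timeout=timeout) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

Note that this only works for public URLs, which is why the import helper asks for one.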
What you get
Check the result before you copy it into the next step
Normalized directive diff
Robots.txt changes are shown after comments and whitespace noise are removed.
Import helper feedback
Fetch status messages help you confirm the live content before diffing.
Copy-ready patch
A normalized patch is available when you need to share the result.
Avoid these mistakes
Small input problems create the most misleading diffs
Comparing raw comments
Ignore comment-only noise and focus on directive changes.
Testing private URLs
Use public URLs so the server can fetch them safely.
Skipping the import check
Confirm the fetched content before you trust the diff.
Glossary
Decode the terms before you act on them
This section translates the most technical labels on the page into plain language so you can interpret the output without opening another tab.
Directive
A directive is a crawler instruction line in `robots.txt`, such as `User-agent`, `Allow`, `Disallow`, or `Sitemap`.
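For reference, a minimal robots.txt file made up of these directives might look like:

```text
User-agent: *
Disallow: /tmp/
Allow: /tmp/public/
Sitemap: https://example.com/sitemap.xml
```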
Normalized diff
A normalized diff removes formatting noise such as comments or whitespace-only changes so you can focus on actual directive changes.
Comment-only change
A comment-only change edits explanatory lines that crawlers ignore, without changing the crawl rules themselves.