robots.txt Generator
Draft rules for crawlers: multiple Allow/Disallow lines (one path per line), an optional crawl-delay, and sitemap URLs.
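A minimal generated file might look like this (the paths and sitemap URL are placeholders):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Crawl-delay: 10

# Sitemap lines are independent of user-agent groups
Sitemap: https://example.com/sitemap.xml
```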
FAQ
Does robots.txt block hackers?
No. robots.txt is a polite request to well-behaved crawlers, not access control. Sensitive directories need authentication, network rules, and proper session handling; attackers ignore robots.txt entirely.
Does Allow take precedence over Disallow?
It depends on the crawler. Some bots use first-match-wins; Google documents longest-match precedence (the most specific rule wins). Read current vendor docs when mixing patterns.
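For instance, with the hypothetical rules below, Google's longest-match precedence permits `/shop/sale/` because the Allow rule is more specific, while a strict first-match bot would block it:

```
User-agent: *
Disallow: /shop/
Allow: /shop/sale/
```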
Does wildcard support vary?
Yes. The `*` wildcard and the `$` end-of-URL anchor are extensions to the original standard and behave differently across crawlers; test in Search Console for Google-specific behavior.
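As a sketch, these patterns use both extensions (the paths are illustrative):

```
User-agent: *
# "*" matches any sequence of characters within the path
Disallow: /*?sessionid=
# "$" anchors the pattern to the end of the URL
Disallow: /*.pdf$
```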
Where do Sitemap lines go?
Anywhere in the file. Multiple `Sitemap:` lines are valid and independent of user-agent groups; ensure each URL is absolute (https) and returns 200.
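For example, listing several sitemaps (URLs are placeholders):

```
Sitemap: https://example.com/sitemap-pages.xml
Sitemap: https://example.com/sitemap-posts.xml
```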
Is Crawl-delay ignored?
Googlebot ignores the non-standard `Crawl-delay` directive; Bing and some other crawlers may honor it. Know which bots you are targeting.
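A delay can be scoped to the crawlers that honor it, for example (the value is illustrative):

```
User-agent: bingbot
Crawl-delay: 10
```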
How should staging sites be handled?
Disallowing everything during QA is common, but remove the block before launch: shipping it to production will stop crawling and can accidentally de-index your site.
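The blanket block during QA is just two lines, which makes it easy to add and just as easy to forget to remove:

```
User-agent: *
Disallow: /
```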