Robots.txt Generator
Tell search engine crawlers how to interact with your site.
What is robots.txt?
The robots.txt file is a simple text file placed in the root directory of your website. It tells search engine crawlers (like Googlebot) which pages or files they can or cannot request from your site.
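Crawlers look for the file at the root URL of the host they are visiting, e.g. https://example.com/robots.txt (example.com is a placeholder here). A minimal file that lets every crawler access everything looks like this:

```
# Apply to every crawler
User-agent: *
# An empty Disallow value blocks nothing
Disallow:
```

Because the empty Disallow matches no paths, this file is equivalent to having no robots.txt at all.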
Common Rules
- User-agent: Specifies which crawler the rules that follow apply to (* matches all crawlers).
- Disallow: Tells the matched crawler not to access a specific path or file.
- Allow: Explicitly permits a path that a broader Disallow rule would otherwise block.
- Sitemap: Points crawlers to your XML sitemap so new and updated pages are discovered sooner. A combined example follows this list.
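Putting these directives together, here is a sketch of a complete file (the /private/ directory, the page name, and the sitemap URL are all placeholders):

```
# These rules apply to all crawlers
User-agent: *
# Block everything under /private/ ...
Disallow: /private/
# ... except this one page inside it
Allow: /private/public-page.html
# Point crawlers to the XML sitemap
Sitemap: https://example.com/sitemap.xml
```

In major implementations such as Googlebot and Bingbot, the most specific (longest) matching rule wins, which is why the Allow line above overrides the broader Disallow for that one page.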