Robots.txt Generator
Generate a robots.txt file for your website
Common Paths
- /admin/ - Block the admin area
- /private/ - Block private content
- /tmp/ - Block temporary files
- /*.pdf$ - Block all PDF files
- / - Block the entire site
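Patterns like the ones above use two robots.txt wildcards: * matches any sequence of characters, and a trailing $ anchors the match to the end of the path; a pattern with neither behaves as a prefix match. A minimal sketch of that matching logic in Python (the function name is illustrative, not part of any library):

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Check whether a robots.txt path pattern matches a URL path.

    Escapes regex metacharacters, then restores the two robots.txt
    wildcards: '*' (any sequence) and a trailing '$' (end anchor).
    """
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"  # restore the end-of-path anchor
    # re.match anchors at the start, giving prefix-match semantics
    return re.match(regex, path) is not None

print(rule_matches("/admin/", "/admin/users"))        # prefix match
print(rule_matches("/*.pdf$", "/docs/report.pdf"))    # wildcard + anchor
print(rule_matches("/*.pdf$", "/docs/report.pdf?x"))  # anchor fails
```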
Description
Generate a robots.txt file to control how search engine crawlers access your website. Create rules for different user agents, set crawl delays, and specify sitemap locations with an easy-to-use interface.
Key features
- Visual rule builder for robots.txt
- Support for multiple user agents
- Crawl-delay configuration
- Sitemap URL specification
- Preview generated file in real-time
How to Use
- Select a user agent (or use * for all bots).
- Add paths to allow or disallow.
- Optionally set a crawl delay.
- Add your sitemap URL.
- Copy the generated robots.txt to your server root.
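The steps above amount to assembling a few directive lines in order. A minimal sketch of that assembly in Python (the function name and parameters are illustrative, not the tool's actual API):

```python
def build_robots_txt(user_agent="*", disallow=(), allow=(),
                     crawl_delay=None, sitemap=None):
    """Assemble a single-group robots.txt file as a string."""
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Disallow: {path}" for path in disallow]
    lines += [f"Allow: {path}" for path in allow]
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

print(build_robots_txt(disallow=["/admin/", "/private/"],
                       sitemap="https://example.com/sitemap.xml"))
```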
Example
Example robots.txt
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
Sitemap: https://example.com/sitemap.xml
This allows all bots to crawl the site except for /admin/ and /private/ directories.
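You can sanity-check a generated file with Python's standard urllib.robotparser module. A quick sketch, using a made-up bot name:

```python
from urllib import robotparser

ROBOTS = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

# "mybot" is an arbitrary crawler name; the '*' group applies to it.
print(rp.can_fetch("mybot", "https://example.com/page"))     # True
print(rp.can_fetch("mybot", "https://example.com/admin/x"))  # False
```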
FAQ
Where should I place robots.txt?
Upload it to the root directory of your domain (e.g., https://example.com/robots.txt).
Will robots.txt block pages from being indexed?
No. robots.txt only controls crawling; a disallowed page can still appear in search results if other sites link to it. Use a noindex meta tag (or X-Robots-Tag header) to prevent indexing.
What does User-agent: * mean?
It applies the rules to all crawlers. You can specify individual bots like Googlebot.
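For example, a file can combine a bot-specific group with a catch-all group; a crawler uses the most specific group that names it (the paths below are illustrative):

```
User-agent: Googlebot
Disallow: /search/

User-agent: *
Disallow: /admin/
```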