⚙️ Configuration

Quick Start

Choose a preset below or add custom rules. User-agent defines which bots the rules apply to, and paths specify what to allow or disallow.

Allow All

Allow all bots to crawl everything
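
A minimal sketch of what this preset generates, assuming the conventional allow-all form (an empty Disallow permits everything):

    User-agent: *
    Disallow: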

Disallow All

Block all bots from crawling
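
Blocking everything takes a single rule group; the preset presumably emits:

    User-agent: *
    Disallow: /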

WordPress

Standard WordPress configuration
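
The de facto standard WordPress rules hide the admin area while keeping admin-ajax.php crawlable, since front-end features call it; the preset likely produces something close to:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php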

E-commerce

Protect admin and user areas
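
A typical shop keeps crawlers out of the cart, checkout, and account pages; the paths below are assumptions and will vary by platform:

    User-agent: *
    Disallow: /admin/
    Disallow: /cart/
    Disallow: /checkout/
    Disallow: /account/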


Import/Export

Upload an existing robots.txt file to edit it, or save your current configuration.

📄 Preview & Download

# Generated by Robots.txt Generator
# Add some rules to see the preview
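
Once rules are added, the preview shows the complete file, e.g. the WordPress preset plus a sitemap (the URL is a placeholder):

    # Generated by Robots.txt Generator
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php
    Sitemap: https://example.com/sitemap.xml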

How to Use

  • User-agent: Specifies which web crawler the rules apply to (* = all bots)
  • Disallow: Blocks access to specific paths (e.g., /admin/)
  • Allow: Explicitly permits access to paths, typically to carve exceptions out of a broader Disallow
  • Sitemap: Points crawlers to your XML sitemap
  • Crawl-delay: Sets a delay between requests, in seconds (non-standard; Googlebot ignores it; all five directives appear together in the example below)
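
A sketch of a file using all five directives; the domain and paths are placeholders, not output from the generator:

    User-agent: *
    Crawl-delay: 10
    Disallow: /admin/
    Allow: /admin/public/

    Sitemap: https://example.com/sitemap.xml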

Common Patterns

  • /admin/ - Block the admin directory
  • /*.pdf$ - Block all PDF files (the * and $ wildcards are extensions honored by major crawlers such as Googlebot and Bingbot)
  • /search? - Block search pages with query parameters
  • / - Block everything (the entire site)
  • /$ - Block only the homepage
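
For instance, the first three patterns combine into one group like this (illustrative, not generator output):

    User-agent: *
    Disallow: /admin/
    Disallow: /*.pdf$
    Disallow: /search?

Disallow: / on its own blocks the whole site, while Disallow: /$ blocks only the homepage, because $ anchors the match at the end of the URL.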