Configure crawler rules and get a ready-to-use robots.txt.
Presets included
- Allow all – let every crawler in
- Block all – hide from search engines (useful for staging sites)
- Block AI crawlers – block GPTBot, CCBot, ClaudeBot, Google-Extended, and others from scraping your content
- Standard – allow crawling but block admin, API, and private paths
- Next.js / Astro / WordPress – framework-specific recommendations
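As a rough sketch (the exact file the tool emits may differ), the Block AI crawlers preset corresponds to a robots.txt along these lines, with one rule group per bot and everything else left open:

```txt
# Block common AI crawlers from the whole site
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# All other crawlers remain allowed
User-agent: *
Allow: /
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control mechanism.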