Generate a valid robots.txt file in seconds. Choose a preset or build custom rules for any platform.
```txt
# robots.txt generated by SerpNap Robots.txt Generator
# Generated at serpnap.com/tools/robots-txt-generator
User-agent: *
Disallow: /admin/
Disallow: /api/
Disallow: /private/
Disallow: /*.json$
Allow: /

Sitemap: https://example.com/sitemap.xml
```
Upload this file to the root of your website (e.g., https://example.com/robots.txt).
Google typically picks up changes within 24-48 hours. You can also submit it via Google Search Console → Settings → robots.txt.
A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site. It's not a security mechanism — it's a crawl management tool that helps bots use your crawl budget efficiently.
Always place robots.txt at the root of your domain: https://example.com/robots.txt. Subdomain robots.txt files only apply to that subdomain.
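Before uploading, you can sanity-check your rules locally. Here is a minimal sketch using Python's standard-library `urllib.robotparser` (the rules and URLs below are illustrative, not this generator's output; note that `urllib.robotparser` does not support wildcard patterns like `/*.json$`):

```python
from urllib import robotparser

# Hypothetical rules to verify before uploading
rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Check which URLs a generic crawler ("*") may fetch
print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True
```

You can also point `RobotFileParser` at a live file with `set_url(...)` and `read()` to test the deployed version.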
Don't use robots.txt to hide sensitive pages — crawlers may still index the URL from external links. Use noindex meta tags or HTTP headers instead. Also avoid blocking CSS/JS files that Google needs for rendering.
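If the goal is to keep a page out of search results rather than merely uncrawled, a noindex signal looks like this (illustrative snippets):

```html
<!-- Option 1: meta tag in the page's <head> -->
<meta name="robots" content="noindex">

<!-- Option 2: for non-HTML files such as PDFs, send an HTTP
     response header instead:
     X-Robots-Tag: noindex -->
```

Note that the page must remain crawlable for this to work: if robots.txt blocks the URL, crawlers never see the noindex directive.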
AI crawlers such as GPTBot (OpenAI), Google-Extended, and ClaudeBot (Anthropic) state that they respect robots.txt directives. Use the "Block AI Crawlers" preset above if you want to prevent your content from being used for AI training while still allowing search indexing.
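The preset produces rules along these lines (a sketch; the exact preset output may differ, and the user-agent tokens shown are the ones these vendors publish):

```txt
# Block AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: ClaudeBot
Disallow: /

# All other crawlers (including Googlebot) keep full access
User-agent: *
Allow: /
```

Because Google-Extended is a separate token from Googlebot, blocking it affects AI training use only, not regular search indexing.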
Ready?
Book a free 30-minute assessment. We'll map exactly which AI tools will save you time and money — with a clear timeline and pricing.