Build a robots.txt file with a visual rule editor. Choose preset templates, add custom rules for specific bots, set crawl delays, and link your sitemap.
Preset Templates
Quick Add Bot
Rules
robots.txt Preview
User-agent: *
Disallow: /admin
Disallow: /private
Allow: /
Generate meta, OG, and Twitter Card tags with live previews.
Create XML sitemaps from URL lists with bulk options.
Analyze keyword density and frequency in your content.
Format, beautify, minify, and validate JSON with syntax highlighting.
Convert text to camelCase, snake_case, kebab-case, and 7 more.
A robots.txt file tells search engine crawlers which pages or sections of your site they should or should not visit. It is placed in the root directory of your website.
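As a quick way to check how a rule set will be interpreted, Python's standard-library `urllib.robotparser` can parse rules and answer can-fetch queries. This is only a sketch; the domain and rules below are placeholders matching the preview above:

```python
from urllib.robotparser import RobotFileParser

# Placeholder rules, mirroring the generated preview
rules = """
User-agent: *
Disallow: /admin
Disallow: /private
Allow: /
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(rules)

# First matching rule wins: /admin is disallowed, /about falls through to Allow: /
print(parser.can_fetch("*", "https://example.com/admin"))  # → False
print(parser.can_fetch("*", "https://example.com/about"))  # → True
```

Note that `robotparser` implements the original robots.txt convention; some crawlers apply extensions (wildcards, longest-match precedence) that it does not model.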
Not entirely. Robots.txt prevents crawling but does not prevent indexing. If other sites link to a blocked page, it may still appear in search results. Use a noindex meta tag for complete removal — and make sure that page is not also blocked in robots.txt, or crawlers will never see the tag.
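For reference, a noindex directive goes in the page's HTML head (the page must remain crawlable for this to take effect):

```html
<!-- Tells crawlers not to include this page in search results -->
<meta name="robots" content="noindex">
```

For non-HTML resources such as PDFs, the same directive can be sent as an `X-Robots-Tag: noindex` HTTP response header instead.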
Crawl-delay tells bots to wait a specified number of seconds between requests. This can help reduce server load. Note that Googlebot does not support crawl-delay — use Google Search Console instead.
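A minimal sketch of the directive, using Bingbot as an example of a crawler that documents support for it:

```
# Ask Bingbot to wait at least 10 seconds between requests
User-agent: Bingbot
Crawl-delay: 10
```

Support and interpretation vary by crawler, so treat the delay as a hint rather than a guarantee.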
Yes. Adding a Sitemap directive in your robots.txt helps search engines discover your sitemap automatically without needing to submit it through each search engine console.
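A Sitemap directive is independent of any User-agent group and can appear anywhere in the file; the URL must be absolute. The domain below is a placeholder:

```
Sitemap: https://example.com/sitemap.xml
```

You can list multiple Sitemap lines if your site has more than one sitemap or a sitemap index.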