
WordPress Robots.txt Generator

Free

Generate the optimal robots.txt file for WordPress. Block wp-admin, allow Googlebot, include your sitemap URL — ready to upload in one click.


Settings guide

Always disallow:

  • /wp-admin/ — admin interface, not for indexing
  • /?s= — search result pages (thin, duplicate-like content)
  • /trackback/ — legacy WordPress feature, no SEO value

Always allow (even in wp-admin block):

  • /wp-admin/admin-ajax.php — required by many plugins and themes for front-end functionality

Conditionally disallow:

  • /tag/ — if your tags are not keyword-focused, blocking prevents thin tag archives from consuming crawl budget
  • /author/ — on single-author blogs, author archives duplicate post content
  • /page/ — paginated archives; Google can crawl these, but low-traffic sites may benefit from restricting them

Sitemap line:

End your robots.txt with a Sitemap: line, for example Sitemap: https://yourdomain.com/sitemap.xml. The exact path depends on your setup: WordPress's built-in sitemap feature serves /wp-sitemap.xml, while Yoast SEO and RankMath generate an index at /sitemap_index.xml (Yoast redirects /sitemap.xml there). Use the URL your site actually serves.
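Putting the rules above together, a typical WordPress robots.txt with the conditional blocks enabled might look like the sketch below. Adjust the conditional lines and the sitemap path to match your own site:

```
# Block the admin area, but keep admin-ajax.php reachable for plugins
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Thin / duplicate content
Disallow: /?s=
Disallow: /trackback/

# Conditional rules: enable only if these archives add no SEO value
Disallow: /tag/
Disallow: /author/
Disallow: /page/

Sitemap: https://yourdomain.com/sitemap.xml
```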

Format comparison

Robots.txt vs WordPress SEO plugin noindex settings:

Robots.txt controls crawling — it tells bots which URLs to visit. A noindex meta tag controls indexing — it tells bots not to include the page in search results. For low-value pages like tag archives, you have two options: disallow in robots.txt (saves crawl budget, but the page can still be indexed if other pages link to it) or noindex in meta tags (the page is crawled but not indexed). Do not combine both for the same URL: if robots.txt blocks crawling, Google never fetches the page and so never sees the noindex tag. For most WordPress sites, noindex on tag and author archives is the better choice.
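As an illustration, using /tag/ as the example path, the two mechanisms look like this. The robots.txt rule lives in the file at your site root; the meta tag goes in the head of each archive page (SEO plugins add it for you):

```
# robots.txt — stops crawling; the URL can still appear in results if linked
User-agent: *
Disallow: /tag/
```

```
<!-- Meta tag — page is crawled but dropped from the index -->
<meta name="robots" content="noindex, follow">
```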

WordPress's virtual robots.txt vs a real file:

WordPress generates a virtual robots.txt file via PHP if no physical file exists at the root. An SEO plugin like Yoast overrides this virtual file with its own output. If you upload a physical robots.txt file to the server root, it takes precedence over both — giving you full control.
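For reference, the virtual file that recent WordPress versions serve looks roughly like this (the Sitemap line was added along with the built-in sitemaps in WordPress 5.5):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourdomain.com/wp-sitemap.xml
```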

How it works

1

Enter your WordPress URL

Add your domain so the sitemap URL and canonical paths generate correctly.

2

Select WordPress-specific blocks

Choose which WordPress paths to block: wp-admin, search pages, tag archives, author pages.

3

Review and download

Get the complete robots.txt file with all WordPress-optimized rules and your sitemap URL.

4

Upload to your server

Upload the file to your WordPress root directory via FTP or your host's file manager, replacing any existing robots.txt.

About this format

WordPress generates a default virtual robots.txt that is functional but not optimized for SEO. Out of the box it only disallows /wp-admin/ (with an Allow rule for admin-ajax.php) and, since WordPress 5.5, points to the built-in /wp-sitemap.xml. It does not restrict low-value content like tag pages, author archives, or search results pages from being crawled and potentially indexed.

This generator produces a WordPress-optimized robots.txt that follows best practices: blocks all crawlers from `/wp-admin/` (while allowing admin-ajax.php which some plugins require), disallows low-value URL patterns like `/?s=` (search results), `/tag/`, and `/page/`, includes your sitemap URL for crawler discovery, and keeps all important content paths open to Googlebot.

WordPress sites have specific crawl budget concerns that generic robots.txt templates do not address. Thin content pages — pagination, tag archives, author pages on single-author sites, and search result pages — consume crawl budget without adding SEO value. Blocking these with robots.txt frees up Googlebot to spend more time on your actual content pages.

Frequently asked questions

Does WordPress have a robots.txt file by default?
WordPress generates a virtual robots.txt file via PHP if no physical file exists in the root directory. This default output is minimal: it disallows /wp-admin/ (while allowing admin-ajax.php) and, since WordPress 5.5, lists the built-in sitemap, but it leaves search pages, tag archives, and other low-value paths crawlable. Installing an SEO plugin replaces the virtual file with its own output. Uploading a physical robots.txt file to the server root overrides both.
Should I block wp-admin in robots.txt?
Yes. The wp-admin directory is the backend of your WordPress site and has no value for search engine crawling. Adding Disallow: /wp-admin/ prevents bots from wasting crawl budget on admin pages. Keep admin-ajax.php accessible — some plugins require it for front-end functionality.
Where is the robots.txt file in WordPress?
A physical robots.txt file lives at the root of your server, in the same directory as wp-config.php and the wp-content folder. Access it via FTP or your hosting control panel's file manager. If no physical file exists, WordPress serves a virtual one at yourdomain.com/robots.txt.
Will blocking tag pages in robots.txt hurt my SEO?
For most WordPress blogs, tag archive pages have very low SEO value — they aggregate posts with thin connecting content. Blocking them with robots.txt or adding noindex tags directs crawl budget to your actual posts. If your tags are keyword-focused and have original descriptive content, they can rank — in that case, leave them accessible.
Do I need to update robots.txt if I install a new WordPress plugin?
Only if the plugin adds new URL patterns you want to control. For example, WooCommerce adds /cart/ and /checkout/ paths that should typically be blocked from indexing. Review your robots.txt when installing major plugins that add significant new URL structures to your site.
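For example, with WooCommerce's default page slugs the added rules might look like the sketch below — adjust the paths if you have renamed these pages, and note that noindex meta tags are an equally valid choice here:

```
# WooCommerce checkout flow — unique per visitor, no search value
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
```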
