robots.txt Not Updating

Normally, if a requested file physically exists on disk, it will be served directly by Apache or Nginx before WordPress gets involved.

This is handled in your virtual host config. In Nginx, for example, you’ll typically find something like the following, which tells the server to try real files first before letting index.php handle the URL and generate a page on demand.

location / {
    index index.php index.html;
    try_files $uri $uri/ /index.php?$args;
}

So if your physical robots.txt file is being ignored, there may be something wrong with your webserver configuration.

I just tried the three plugins you mentioned with the twentyfifteen theme and everything worked OK. Incidentally, Yoast SEO lets you edit robots.txt from the admin pages (go to SEO > Tools > File Editor).

If no physical robots.txt file is found and the request is passed through to WordPress, the default output is:

User-agent: *
Disallow: /wp-admin/

See do_robots() in wp-includes/functions.php to see how this works, but don’t ever edit core files.
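
For reference, the core function looks roughly like this (a simplified sketch; the details vary between WordPress versions, so check your own copy rather than relying on this):

function do_robots() {
    header( 'Content-Type: text/plain; charset=utf-8' );

    // Plugins can hook in here to print extra lines.
    do_action( 'do_robotstxt' );

    $output = "User-agent: *\n";
    $public = get_option( 'blog_public' );

    if ( '0' == $public ) {
        // "Discourage search engines" is ticked in Settings > Reading.
        $output .= "Disallow: /\n";
    } else {
        $site_url = parse_url( site_url() );
        $path     = ! empty( $site_url['path'] ) ? $site_url['path'] : '';
        $output  .= "Disallow: $path/wp-admin/\n";
    }

    // Everything above can be rewritten by filters before it's sent.
    echo apply_filters( 'robots_txt', $output, $public );
}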

This output can be customised with actions and filters; for example, the BWP Sitemaps plugin adds a Sitemap: line.
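
If you want to do something similar yourself, a minimal sketch of that approach looks like this (the function name and sitemap URL are placeholders for illustration, not what BWP Sitemaps actually uses):

/**
 * Append a Sitemap: line to WordPress's generated robots.txt.
 *
 * @param string $output Generated robots.txt content.
 * @return string Modified content.
 */
function wpse_append_sitemap_line( $output ) {
    // home_url() keeps the sitemap URL in sync with the site address.
    return $output . "\nSitemap: " . home_url( '/sitemap.xml' ) . "\n";
}

add_filter( 'robots_txt', 'wpse_append_sitemap_line' );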

If you can’t find a plugin (or anything in your theme) hooking into do_robots or do_robotstxt, here’s a hack for your theme’s functions.php file that will probably do the job:

/**
 * Remove an unwanted 'themes' Disallow rule from robots.txt with a find/replace.
 *
 * @param string $input Original robots.txt content.
 * @return string Filtered robots.txt content.
 */
function patch_robots_file( $input ) {
    // Strip out the rule that some other plugin or theme is adding.
    return str_replace( 'Disallow: /wp-content/themes/', '', $input );
}

add_filter( 'robots_txt', 'patch_robots_file' );
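
Bear in mind that the robots_txt filter only runs when WordPress is generating robots.txt itself; if a physical robots.txt file exists on disk, the webserver will serve that directly, as described above, and nothing you hook into WordPress will be applied.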