The WP robots.txt file is a virtual file, created on demand. You can see the generated file by entering a URL like https://www.example.com/?robots=1 .
The virtual file is only generated and served if there is no physical robots.txt file in the site root. If you have a physical one, it is served instead and the virtual file is never generated.
The default directives in the virtual robots.txt file are:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://example.com/sitemap_index.xml
The Sitemap directive is only added if a sitemap is available (for example, one generated by core or by an SEO plugin).
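For context, WordPress assembles that output on the fly and runs it through the 'robots_txt' filter just before printing it. Simplified (this is a paraphrase of the core logic, not the exact source), it looks roughly like this:

$output  = "User-agent: *\n";
$output .= "Disallow: /wp-admin/\n";
$output .= "Allow: /wp-admin/admin-ajax.php\n";
$public  = get_option( 'blog_public' ); // whether search engines are allowed to index the site
echo apply_filters( 'robots_txt', $output, $public ); // themes/plugins can modify the output here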
You can add statements to the generated virtual robots.txt file with that same 'robots_txt' filter:
add_filter( 'robots_txt', 'my_robots_commands', 99, 2 ); // filter the virtual robots.txt output
function my_robots_commands( $output, $public ) {
    $output .= "\nUser-agent: ExampleBot\nDisallow: /\n"; // append your additional directives as needed (ExampleBot is a placeholder)
    return $output;
}
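Note that the callback receives the generated output as its first parameter and appends to it rather than replacing it, so the default directives (and anything other plugins add, such as the Sitemap line) are preserved.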
I use that filter to add extra directives that block the various AI crawlers from scanning my sites for AI training use.
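As a rough sketch of that idea (the function name and the list of user agents below are just examples; check each crawler's documentation for its current user-agent string):

add_filter( 'robots_txt', 'my_block_ai_crawlers', 99, 2 );
function my_block_ai_crawlers( $output, $public ) {
    // Example list of AI crawler user agents; adjust to taste.
    $ai_bots = array( 'GPTBot', 'CCBot', 'Google-Extended', 'anthropic-ai', 'ClaudeBot' );
    foreach ( $ai_bots as $bot ) {
        $output .= "\nUser-agent: $bot\nDisallow: /\n";
    }
    return $output;
}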
I wrote a plugin that does that chatbot blocking, but it has been sitting in the plugin approval queue for 2 months now (along with 1200+ other plugins awaiting review), so I ended up adding the code to another existing plugin of mine. You can also place the code in your child theme’s functions.php file.
Added:
If the URL https://www.example.com/?robots=1 does not show any content, it’s possible that your theme (or a plugin) has disabled or emptied the ‘robots_txt’ filter output. A search for ‘robots_txt’ in your theme files (or even your plugin files) might reveal that scenario.
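For example, code along these lines in a theme or plugin (a hypothetical illustration of what such a search might turn up) would suppress or blank out the virtual file:

remove_action( 'do_robots', 'do_robots' );           // stops WordPress from printing the virtual file
add_filter( 'robots_txt', '__return_empty_string' ); // or: replaces the generated output with nothing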