Unwanted “Crawl-delay: 10” line added to my robots.txt

For those using WordPress as the CMS for their site, you can bypass your web host's injected rule by simply deleting the physical robots.txt file and modifying the virtual one that WordPress generates in its place. WordPress only serves this virtual file when no physical robots.txt exists at the site root, and you can change its contents through the robots_txt filter. You just have to add a filter callback to your theme's functions.php file.

Here’s the code snippet:

//* Append directives to the virtual robots.txt
add_filter( 'robots_txt', 'robots_mod', 10, 2 );
function robots_mod( $output, $public ) {
    // $output holds the rules WordPress generated (ending in a newline),
    // so extra directives can be appended directly, one per line
    $output .= "Disallow: /wp-content/plugins/\n";
    $output .= "Sitemap: http://www.example.com/sitemap_index.xml\n";
    return $output;
}

All you have to do is replace the directives appended to $output with your own.
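For reference, with this filter active the virtual file served at http://www.example.com/robots.txt should look roughly like this. The first few lines are WordPress's own defaults and vary by WordPress version and privacy settings; the sitemap URL is just the placeholder from the snippet above:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-content/plugins/
Sitemap: http://www.example.com/sitemap_index.xml

Since the file is generated on the fly, nothing is written to disk. If you still see the old Crawl-delay line after this change, make sure the physical robots.txt has actually been deleted and clear any server-side cache.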
