Want to block all posts of a specific author via robots.txt file

You should use the robots_txt filter to exclude all of the URLs of posts written by a specified author (untested): add_filter( 'robots_txt', static function ( $output ) { $query = new WP_Query( array( 'author' => 3, 'posts_per_page' => -1, 'no_found_rows' => true, 'fields' => 'ids', ) ); if ( ! $query->have_posts() ) { return $output; } … Read more
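The excerpt is cut off before the loop that emits the Disallow rules. A hedged completion of the same approach (the loop body is my assumption of how the truncated answer continues; author ID 3 comes from the excerpt, so adjust it for your site), untested:

```php
<?php
// Append a Disallow rule for every post by author ID 3 to the
// virtual robots.txt that WordPress generates.
add_filter( 'robots_txt', static function ( $output ) {
    $query = new WP_Query( array(
        'author'         => 3,   // the author ID from the excerpt above
        'posts_per_page' => -1,
        'no_found_rows'  => true,
        'fields'         => 'ids',
    ) );

    if ( ! $query->have_posts() ) {
        return $output;
    }

    foreach ( $query->posts as $post_id ) {
        // wp_parse_url() extracts just the path portion of the permalink,
        // which is what robots.txt rules expect.
        $path = wp_parse_url( get_permalink( $post_id ), PHP_URL_PATH );
        if ( $path ) {
            $output .= "Disallow: {$path}\n";
        }
    }

    return $output;
} );
```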

WordPress does not generate robots.txt

The WordPress robots.txt file is a virtual file, created on demand. You can see the generated content by requesting a URL like https://www.example.com?robots=1 . The virtual file is generated and used only if there is no actual robots.txt file on disk. If you have an actual one, the virtual file is not generated or supplied on … Read more
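Because the file is virtual, its contents can be modified with the robots_txt filter instead of editing a file on disk. A minimal sketch (the sitemap path is a placeholder, not part of the answer above):

```php
<?php
// Append a Sitemap line to the virtual robots.txt. This only takes
// effect when no physical robots.txt file exists, since a physical
// file is served directly and bypasses WordPress entirely.
add_filter( 'robots_txt', function ( $output, $public ) {
    $output .= 'Sitemap: ' . home_url( '/sitemap.xml' ) . "\n"; // placeholder path
    return $output;
}, 10, 2 );
```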

Can I remove these lines from my WordPress robots.txt

You can disable the robots.txt file by putting this code in an mu-plugin: remove_action( 'do_robots', 'do_robots' ); However, I don't recommend that. It will make it harder for search engines to index your content, and it won't really do anything to hide the fact that you're using WordPress. There are a dozen ways that someone … Read more
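A minimal mu-plugin carrying that one line might look like this (the file name is arbitrary; mu-plugins live in wp-content/mu-plugins/ and load automatically, after core has registered its default hooks):

```php
<?php
/**
 * Plugin Name: Disable virtual robots.txt
 * (Save as e.g. wp-content/mu-plugins/disable-robots.php — name is arbitrary.)
 */

// Unhook the core handler so requests for /robots.txt get no content
// from WordPress. Note the caveats above: this does not hide WordPress.
remove_action( 'do_robots', 'do_robots' );
```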

Displaying the Virtual robots.txt file

The current/computed directives that WP generates for a virtual robots.txt file are not available via any function call or filter. The directives are only available with a page request similar to https://www.example.com?robots=1 . That page request will result in WP generating and returning the contents of the virtual robots.txt file, along with a header. The … Read more
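Since there is no getter, one workaround (my sketch, not from the answer above) is to capture the output of core's do_robots() with PHP output buffering:

```php
<?php
// Capture the virtual robots.txt text inside WordPress. do_robots()
// echoes the content (and sends a Content-Type header, which may warn
// if output has already been sent), so buffer the echo and return it.
function my_get_virtual_robots_txt() { // hypothetical helper name
    ob_start();
    do_robots();
    return ob_get_clean();
}
```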

Is this robots.txt preventing Google from showing some of my content?

If you changed your site path, the URLs previously indexed by search engines are no longer valid. Google Webmaster Tools has a change of address tool: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=83106 In general, you need to:
- set up 301 redirects for your old URLs
- create a new sitemap
- be patient
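One way to set up such 301 redirects inside WordPress (a sketch only; '/oldblog/' is a placeholder for your previous site path, not a value from the answer):

```php
<?php
// Redirect requests for an old path prefix to the same path on the
// new site root, with a permanent (301) status.
add_action( 'template_redirect', function () {
    $request = $_SERVER['REQUEST_URI'];
    if ( 0 === strpos( $request, '/oldblog/' ) ) { // placeholder old prefix
        $new = home_url( substr( $request, strlen( '/oldblog' ) ) );
        wp_safe_redirect( $new, 301 );
        exit;
    }
} );
```

For large sites a server-level redirect rule is usually preferable, since it avoids loading WordPress for every legacy URL.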

Error!: SQLSTATE[HY000] [1045] Access denied for user 'divattrend_liink'@'localhost' (using password: YES)