Questions about Robots.txt

You can just make a file in your root named robots.txt if you don’t already have one. User-agent: * Disallow: /myPage is the format for that one page; there is no allow, only disallow. For disallowing pages you follow the above format. WP may not have generated a robots.txt; I believe it only would have if … Read more
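As a rough sketch of what the excerpt describes, and assuming /myPage is the path you want to block, the whole file could be as short as:

User-agent: *
Disallow: /myPage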

Unwanted “crawl delay: 10” line added to my robots.txt

For those who are using WordPress as the CMS for their site, you can bypass your web hosting server’s rules by simply removing your robots.txt file and instead modifying the virtual one generated by WordPress. You just have to add a filter to the functions.php file of your theme. Here’s the code snippet: //* Append directives … Read more
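Since the excerpt cuts the snippet off, here is a minimal sketch of that kind of filter, using the robots_txt filter WordPress applies to its virtual robots.txt; the appended directive below is only an illustrative placeholder, not the original author’s exact code:

add_filter( 'robots_txt', function ( $output, $public ) {
    // Append extra directives to the virtual robots.txt served by WordPress.
    $output .= "Disallow: /example-page\n";
    return $output;
}, 10, 2 );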

Stuck on my server root folder, robots.txt file not deleting

WordPress generates a dynamic robots.txt which does not physically exist. To remove/disable it you have two options: Option 1: Remove the do_robots action in your theme’s functions.php or a plugin: remove_action('do_robots', 'do_robots'); The do_robots action is still available to be added again by other plugins. Option 2: Create a real robots.txt file and put it in the root folder … Read more
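A minimal sketch of Option 1, assuming it lives in the theme’s functions.php (hooking it on init is one reasonable choice, not the only one):

add_action( 'init', function () {
    // Unhook the core do_robots() handler so WordPress no longer generates the virtual robots.txt.
    remove_action( 'do_robots', 'do_robots' );
} );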

Error!: SQLSTATE[HY000] [1045] Access denied for user 'divattrend_liink'@'localhost' (using password: YES)