Google is unable to crawl the robots.txt file of my website, please help me

Today I got an email from Google Webmasters saying that Google's bots are unable to find the robots.txt file of my website.

This is not actually a problem (from a crawling perspective at least; it might be if you rely on excluding things from crawling, which is what the file is primarily used for). A robots.txt file is neither mandatory nor necessary for your site to be crawled.

I also tried to locate the file at http://www.mydomain.com/robots.txt but the file isn't there. Instead, the message below is displayed.

That actually is robots.txt. 🙂 That is what the rules in it look like.
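For reference, on a recent stock WordPress install with no physical file, the virtual robots.txt typically contains something close to this (the exact rules depend on your WordPress version and privacy settings):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```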

If your installation doesn't have a physical robots.txt file, then WordPress will serve a virtual one, using do_robots() to generate it.
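If you want to tweak that virtual output rather than create a physical file, WordPress passes the generated content through the robots_txt filter before serving it. A minimal sketch (the rule appended here is just an example path, not anything from your site):

```php
<?php
// Append an extra rule to the virtual robots.txt that do_robots() outputs.
// The 'robots_txt' filter receives the generated content and the blog's
// "public" setting; whatever the callback returns is what gets served.
add_filter( 'robots_txt', function ( $output, $public ) {
	// Example rule only; adjust to whatever you actually want to exclude.
	$output .= "Disallow: /some-private-path/\n";
	return $output;
}, 10, 2 );
```

This would go in a small plugin or your theme's functions.php; note that it has no effect once a physical robots.txt file exists, since the server then answers the request before WordPress ever loads.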

It is hard to say why Google wasn't able to access the file (since you seemingly are able to). It might have been a temporary issue that will clear itself; otherwise you might want to contact your hosting provider and ask whether something might be interfering with Googlebot accessing your site.