Disallow: /wp-* in robots.txt?

It seems to be a WordPress default, since many webmasters have received this warning without ever editing their robots.txt. Removing all of the Disallow lines is the easiest fix, but I assume you still want some or all of those directories blocked.

Google's warning is only about blocked .js and .css files, so in theory you could edit the robots.txt to include:

User-Agent: Googlebot
Allow: /*.js
Allow: /*.css

However, singling out one user agent like that could require future edits to the file if more search crawlers follow Google's example.
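For reference, a common pattern (a sketch, not necessarily what your install ships with) keeps the sensitive admin directory blocked for all crawlers while leaving scripts and stylesheets crawlable; the admin-ajax.php exception matters because themes and plugins call it from the public front end:

```
User-Agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Allow: /*.js
Allow: /*.css
```

Note that wildcard patterns like /*.js are an extension honored by Google and other major crawlers, not part of the original robots.txt specification.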

Make sure you understand how robots.txt works so you don't accidentally block your entire site or important sections of it. Here is a good reference for more details about robots.txt:
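One way to sanity-check rules before deploying them is Python's standard urllib.robotparser. A caveat: it implements the original prefix-matching spec and does not understand wildcards like /*.js, so this sketch uses plain path-prefix rules only (the domain is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Typical WordPress-style rules: block the admin directory but keep
# admin-ajax.php reachable (themes/plugins call it from the front end).
rules = """\
User-Agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The explicit Allow line matches first, so this URL stays crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/admin-ajax.php"))  # → True

# Everything else under /wp-admin/ is blocked.
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/options.php"))  # → False
```

Running a handful of your site's real URLs through a check like this is a cheap way to catch an accidental site-wide block before search engines do.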