Why does do_robots() Allow: /wp-admin/admin-ajax.php by default?

The truth is that probably nothing should be blocked in robots.txt by core (IIRC this was Joost's position on the matter), as WordPress is an open platform and front-end content and styling can be, and is, generated in all kinds of directories that might not make much sense to you and me. WordPress is not in the business of preventing site owners from installing badly written plugins.
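For reference, the virtual robots.txt that core's do_robots() generates by default looks roughly like this (recent versions may also append a Sitemap line):

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

The Allow line exists because admin-ajax.php, despite living under /wp-admin/, is routinely used to serve front-end requests, and blanket-blocking it can make crawlers treat front-end resources as blocked.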

Why do you have admin pages indexed by a search engine at all? WordPress sends a kind of "don't index" directive for all admin pages, so most likely you have some badly written code that prevents that header from being sent. (This assumes there is no bug in Bing, which is the search engine that powers DDG.)
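If you want to check whether that signal is actually reaching crawlers, here is a small sketch (the `has_noindex` helper is hypothetical, not part of WordPress core): it looks for the two common forms of the directive, an `X-Robots-Tag` HTTP header and a `<meta name="robots">` tag in the page body.

```python
import re

def has_noindex(headers, html=""):
    """Return True if the response headers or HTML carry a noindex directive."""
    # HTTP header names are case-insensitive, so compare lowercased.
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    # Fall back to scanning for <meta name="robots" content="...noindex...">.
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None
```

You would feed it the headers and body of an admin URL (e.g. fetched with your HTTP client of choice); if it returns False for your admin pages, something on the site is stripping the directive.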

It may be worth remembering that robots.txt is just an advisory file; it is up to each search engine to decide if and how to respect it. IIRC, Google will not respect it fully if there is a link to a page that is supposed to be excluded by robots.txt.
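The "advisory" point cuts both ways: even well-behaved crawlers can interpret the same rules differently. As a sketch, Python's stdlib robot parser applies rules in file order (first match wins, per the original 1994 draft), so the Allow line is listed first here to get the intended result; engines like Google instead use longest-path-match semantics. The example.com URLs are placeholders.

```python
from urllib.robotparser import RobotFileParser

# Rules equivalent to WordPress's defaults, with Allow first because
# this parser stops at the first matching rule.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# admin-ajax.php is fetchable, the rest of /wp-admin/ is not.
print(parser.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(parser.can_fetch("*", "https://example.com/wp-admin/options.php"))     # False
```

So a compliant crawler skips /wp-admin/ but can still hit admin-ajax.php; a non-compliant one simply ignores the file altogether.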