Well, it depends on how and where they are hosted. Whether they are on different servers or all on the same one, I would still use the command line and a script file. If they all have different usernames/passwords you're going to have to write a lot. Pre-planning will save you a ton of time in maintenance.
First, it would be best to gather some data on all the sites, for instance which WordPress version each one is running.
Here is an example of a script file for multiple servers,
http://www.cyberciti.biz/tips/execute-commands-on-multiple-linux-or-unix-servers.html
but in your case you would do something like

    grep wp_version wp-includes/version.php

to get all the version info.
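As a rough sketch (the servers.txt host list, key-based SSH logins, and the /var/www/html install path are all assumptions you would adapt to your own setup):

    #!/bin/bash
    # Collect the WordPress version string from each server.
    # servers.txt: one hostname per line; adjust the install path as needed.
    while read -r host; do
        echo -n "$host: "
        ssh "$host" "grep wp_version /var/www/html/wp-includes/version.php"
    done < servers.txt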
Of course, if they all have different admin passwords you would have to include each one in the script.
An alternative would be to write a script that crawls the sites and grabs the meta generator tags (though they are sometimes disabled).
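A quick sketch of that approach (sites.txt with one URL per line is an assumption, and the tag will simply come back blank on sites that strip it):

    #!/bin/bash
    # Pull the generator meta tag from each site's front page.
    while read -r url; do
        echo -n "$url: "
        curl -s "$url" | grep -io '<meta name="generator"[^>]*>'
    done < sites.txt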
I would also do a mass backup using the command line and a similar script, possibly automated into a date-based directory and using an SQL dump.
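Something along these lines, run on each server or over SSH (the database name, credentials, and paths are placeholders; in practice you could read them out of each site's wp-config.php):

    #!/bin/bash
    # Dump the database and archive the files into a dated backup directory.
    BACKUP_DIR="/backups/$(date +%Y-%m-%d)"
    mkdir -p "$BACKUP_DIR"
    mysqldump -u dbuser -p'dbpass' wordpress_db > "$BACKUP_DIR/site-db.sql"
    tar -czf "$BACKUP_DIR/site-files.tar.gz" /var/www/html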
You can then also update all 100+ servers to the latest version by running another script file from the command line.
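If WP-CLI happens to be available on the servers (that's an assumption, not something required by the approach above), the update step could look something like this:

    #!/bin/bash
    # Run a core update on every server over SSH.
    # Assumes WP-CLI is installed and each install lives at /var/www/html.
    while read -r host; do
        echo "Updating $host..."
        ssh "$host" "cd /var/www/html && wp core update"
    done < servers.txt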
The big drawback would be incompatible plugins/themes, or any core code that was edited. This is why I would try to gather as much data as possible up front, to avoid any issues with keeping them up to date.
I do not know of any way to automate getting a list of active plugins though, so you might have to gather some info manually, and hopefully you have command-line access to the sites.
It might take a while to set this up and perfect it, but the benefit is that you can tell your boss it takes 2 weeks, and that will leave you with 1.9 weeks of free time.