I have seen EpiGoogleSiteMaps and am going to look at it further after posting this.
However, in the meantime I was wondering if anyone has any neat solutions for generating the multiple robots.txt files one needs in an Enterprise multi-site setup.
One idea (not tested, but it should work in theory at least... :-) is to create a "special" VirtualPathProvider for this purpose and register it only for the virtual path "~/robots.txt". In the implementation of that provider you can decide how to return the file. One option is to read it from common storage accessible to all sites in the enterprise scenario, such as a database, or from the properties of an EPiServer page called e.g. "Robot".
Then there is no need for a physical robots.txt file at all.
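A minimal sketch of that idea (untested; the class names are mine, and the hard-coded content is a placeholder for whatever you would load from the database or a "Robot" page):

```csharp
using System;
using System.IO;
using System.Text;
using System.Web;
using System.Web.Hosting;

// Hypothetical provider that serves ~/robots.txt virtually,
// delegating every other path to the previously registered provider.
public class RobotsTxtPathProvider : VirtualPathProvider
{
    private static bool IsRobotsPath(string virtualPath)
    {
        return string.Equals(
            VirtualPathUtility.ToAppRelative(virtualPath),
            "~/robots.txt",
            StringComparison.OrdinalIgnoreCase);
    }

    public override bool FileExists(string virtualPath)
    {
        return IsRobotsPath(virtualPath) || Previous.FileExists(virtualPath);
    }

    public override VirtualFile GetFile(string virtualPath)
    {
        return IsRobotsPath(virtualPath)
            ? new RobotsTxtFile(virtualPath)
            : Previous.GetFile(virtualPath);
    }

    private class RobotsTxtFile : VirtualFile
    {
        public RobotsTxtFile(string virtualPath) : base(virtualPath) { }

        public override Stream Open()
        {
            // Placeholder content: in the real implementation you would
            // resolve the current site (multi-site scenario) and read the
            // rules from shared storage or an EPiServer "Robot" page.
            string content = "User-agent: *\r\nDisallow: /admin/\r\n";
            return new MemoryStream(Encoding.UTF8.GetBytes(content));
        }
    }
}
```

The provider would be registered at application start, e.g. in Global.asax with HostingEnvironment.RegisterVirtualPathProvider(new RobotsTxtPathProvider()). One caveat: the request for robots.txt has to actually reach ASP.NET rather than the static file handler, which depends on how the IIS pipeline is configured for .txt requests.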