Robots.txt files in an Enterprise Multi Site implementation


I have seen EPiGoogleSiteMaps and am going to look at it further after posting this.

However, in the meantime I was wondering if anyone had any neat solutions for generating the multiple robots.txt files one needs in an Enterprise Multi Site solution.


Pat Long

Nov 25, 2008 22:06
Any luck with this? Having the same issue...
Dec 18, 2008 12:15

One idea (not tested, but it should work in theory at least... :-) is to create a "special" VirtualPathProvider for this purpose and register it only for the virtual path "~/Robots.txt". In the implementation of that provider you decide how to return the file; one option is to read it from common storage (accessible from all sites in the enterprise scenario), such as a database, or from property values on an EPiServer page called e.g. "Robot".

Then there is no need to have any physical file Robots.txt at all.
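A minimal sketch of that idea, assuming a custom provider class and virtual file (the names `RobotsPathProvider`, `RobotsVirtualFile`, and the lookup of content from a "Robot" page are all assumptions, not EPiServer APIs):

```csharp
using System;
using System.IO;
using System.Text;
using System.Web;
using System.Web.Hosting;

// Intercepts requests for ~/Robots.txt and serves generated content;
// everything else falls through to the previously registered provider.
public class RobotsPathProvider : VirtualPathProvider
{
    public override bool FileExists(string virtualPath)
    {
        return IsRobotsPath(virtualPath) || Previous.FileExists(virtualPath);
    }

    public override VirtualFile GetFile(string virtualPath)
    {
        return IsRobotsPath(virtualPath)
            ? new RobotsVirtualFile(virtualPath)
            : Previous.GetFile(virtualPath);
    }

    private static bool IsRobotsPath(string virtualPath)
    {
        return VirtualPathUtility.ToAppRelative(virtualPath)
            .Equals("~/robots.txt", StringComparison.OrdinalIgnoreCase);
    }
}

public class RobotsVirtualFile : VirtualFile
{
    public RobotsVirtualFile(string virtualPath) : base(virtualPath) { }

    public override Stream Open()
    {
        // Placeholder: here you would look up the content for the current
        // site from your common storage, e.g. a text property on the
        // per-site "Robot" page (assumed storage scheme).
        string content = "User-agent: *\r\nDisallow:";
        return new MemoryStream(Encoding.UTF8.GetBytes(content));
    }
}
```

The provider would be registered at application start, e.g. with `HostingEnvironment.RegisterVirtualPathProvider(new RobotsPathProvider());` in `Global.asax`.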

Dec 18, 2008 17:11
Ah, thanks, that's one idea, but I solved it using an HttpHandler that checks which site we're on and serves the correct file accordingly.
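The HttpHandler approach could look roughly like this; the per-host file naming convention (`robots.<host>.txt` with a `robots.default.txt` fallback) is an assumption, not necessarily what the poster used:

```csharp
using System.IO;
using System.Web;

// Mapped to robots.txt in web.config; picks a physical file
// based on the request's host name so each site in the
// enterprise setup gets its own robots.txt.
public class RobotsHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string host = context.Request.Url.Host.ToLowerInvariant();
        string path = context.Server.MapPath("~/robots." + host + ".txt");

        // Fall back to a shared default if no site-specific file exists.
        if (!File.Exists(path))
            path = context.Server.MapPath("~/robots.default.txt");

        context.Response.ContentType = "text/plain";
        context.Response.WriteFile(path);
    }
}
```

The handler is then wired up in web.config, e.g. with an `<add verb="GET" path="robots.txt" type="RobotsHandler" />` entry under the handlers section (exact section name depends on IIS version and pipeline mode).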
Dec 19, 2008 10:58