Robots.txt files in an Enterprise Multi Site implementation

I have seen EpiGoogleSiteMaps and am going to look at it further after posting this.

However, in the meantime, I was wondering if anyone had any neat solutions for generating the multiple robots.txt files one needs in an Enterprise Multi Site solution.

TIA

Pat Long

#26215
Nov 25, 2008 22:06
 
Any luck with this? Having the same issue...
#26750
Dec 18, 2008 12:15

One idea (not tested, but it should work in theory at least... :-) is to create a "special" VirtualPathProvider for that purpose and register it only for the virtual path "~/Robots.txt". In the implementation of that provider you can decide how to return the file; one option is to read it from a common storage (accessible from all sites in an enterprise scenario), like a database or some property values from an EPiServer page called e.g. "Robot".

Then there is no need to have a physical Robots.txt file at all.
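
Untested as well, but a minimal sketch of such a provider could look roughly like the code below. RobotsStore.GetContentForHost is a hypothetical helper standing in for whatever common storage you choose (a database table, or properties read from a per-site "Robot" page), and the request for robots.txt still has to reach ASP.NET (e.g. via a wildcard mapping) before any VirtualPathProvider is consulted.

```csharp
using System;
using System.IO;
using System.Text;
using System.Web;
using System.Web.Hosting;

public class RobotsVirtualPathProvider : VirtualPathProvider
{
    // True only for the single virtual path this provider is meant to serve.
    private static bool IsRobotsPath(string virtualPath)
    {
        return string.Equals(
            VirtualPathUtility.ToAppRelative(virtualPath),
            "~/robots.txt",
            StringComparison.OrdinalIgnoreCase);
    }

    public override bool FileExists(string virtualPath)
    {
        return IsRobotsPath(virtualPath) || Previous.FileExists(virtualPath);
    }

    public override VirtualFile GetFile(string virtualPath)
    {
        return IsRobotsPath(virtualPath)
            ? new RobotsVirtualFile(virtualPath)
            : Previous.GetFile(virtualPath);
    }

    private class RobotsVirtualFile : VirtualFile
    {
        public RobotsVirtualFile(string virtualPath) : base(virtualPath) { }

        public override Stream Open()
        {
            // Hypothetical lookup: fetch the robots content for the current
            // host from a shared store (database, or a per-site "Robot" page).
            string content = RobotsStore.GetContentForHost(
                HttpContext.Current.Request.Url.Host);
            return new MemoryStream(Encoding.UTF8.GetBytes(content));
        }
    }
}
```

The provider would be registered once at application start, e.g. in Global.asax: HostingEnvironment.RegisterVirtualPathProvider(new RobotsVirtualPathProvider());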

#26757
Dec 18, 2008 17:11
 
Ah, thanks, that's one idea, but I solved it using an HttpHandler that checks which site we're on and serves the correct file accordingly.
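
For anyone else landing here, a sketch of that kind of handler might look like the code below; the ~/Robots/<host>.txt naming convention and the default.txt fallback are just assumptions to make the example concrete.

```csharp
using System.IO;
using System.Web;

public class RobotsHttpHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // Pick a physical file based on the host name of the current request.
        string host = context.Request.Url.Host;
        string path = context.Server.MapPath("~/Robots/" + host + ".txt");

        if (!File.Exists(path))
        {
            // Fall back to a default file when no site-specific one exists.
            path = context.Server.MapPath("~/Robots/default.txt");
        }

        context.Response.ContentType = "text/plain";
        context.Response.WriteFile(path);
    }
}
```

The handler still needs to be mapped to robots.txt in web.config (and, on IIS 6, the request has to be routed to ASP.NET, e.g. via a wildcard mapping) so that it actually gets invoked.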
#26773
Dec 19, 2008 10:58