Robots.txt not working for multilingual sites

Hi Folks,

We are using the “POSSIBLE.RobotsTxtHandler” plugin in our application. The robots.txt file works for the main site URL, e.g. “https://****.com/robots.txt”, but not for culture-specific URLs such as “https://******.com/en-in/robots.txt” or “https://******.com/de-de/robots.txt”; those take us to a 404 page.

We have a multi-site solution and the plugin is working fine for all sites, but for each regional site we need to manage robots.txt at the regional level. Is this possible, how can we achieve it, or is there any plugin which supports this?

I followed a couple of the links below, but nothing worked:

https://world.optimizely.com/blogs/giuliano-dore/dates/2020/10/how-to-create-a-simple-robots-txt-handler-for-a-multi-site-episerver-project/

https://blog.nicolaayan.com/2018/11/a-simple-editable-robots-txt-in-episerver/

Please assist with this.

#286118
Edited, Aug 25, 2022 14:28

Not sure about the POSSIBLE.RobotsTxtHandler, but if you make no headway with that, why not have a simple editable textarea string on a page that is common to all your regions, such as the homepage? Then have your ActionResult, regardless of which language it is accessed in, iterate through all language variants of the homepage to build a complete robots.txt, as in the sketch below.
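
A minimal sketch of that idea, assuming CMS 11 (System.Web.Mvc) and a StartPage content type with a plain string property named RobotsTxtContent; the type, property, and controller names are placeholders for whatever your own content model uses:

    using System.Text;
    using System.Web.Mvc;
    using EPiServer;
    using EPiServer.Core;

    public class RobotsTxtController : Controller
    {
        private readonly IContentRepository _contentRepository;

        public RobotsTxtController(IContentRepository contentRepository)
        {
            _contentRepository = contentRepository;
        }

        // Map both "robots.txt" and "{language}/robots.txt" to this action
        // so culture-prefixed requests no longer end up on the 404 page.
        public ActionResult Index()
        {
            var builder = new StringBuilder();

            // GetLanguageBranches returns every language version of the
            // start page, so the per-region entries are merged into one
            // robots.txt regardless of which URL was requested.
            foreach (var variant in _contentRepository
                .GetLanguageBranches<StartPage>(ContentReference.StartPage))
            {
                if (!string.IsNullOrWhiteSpace(variant.RobotsTxtContent))
                {
                    builder.AppendLine(variant.RobotsTxtContent);
                }
            }

            return Content(builder.ToString(), "text/plain", Encoding.UTF8);
        }
    }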

#286120
Aug 25, 2022 15:11

I'm confused by your requirement.

Robots.txt files are not designed to be culture-specific or multilingual. They should always be located at the root of your domain. Google (and other search engines) will not read them from any other location, such as /de-de/robots.txt.

https://developers.google.com/search/docs/advanced/robots/create-robots-txt

Because of the above, POSSIBLE.RobotsTxtHandler doesn't support what you're asking for. It does handle multi-site scenarios, where you have different domains and can serve different robots.txt content per domain.

#286124
Edited, Aug 25, 2022 19:36
Aug 29, 2022 11:37

Yes, I totally agree with you; POSSIBLE.RobotsTxtHandler is working perfectly for multi-site for us as well.
I am not sure what is causing this, but when we use a crawl tool to check for broken links, the culture-prefixed robots.txt URLs come back as broken. We thought the 404s might be due to that.
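
If the aim is just to stop link checkers flagging the culture-prefixed URLs, one option is to redirect those paths back to the canonical root file. A hedged sketch assuming plain ASP.NET MVC routing (this is not a feature of POSSIBLE.RobotsTxtHandler); note that, depending on IIS configuration, .txt requests may be served by the static file handler before they ever reach MVC routing:

    using System.Web.Mvc;
    using System.Web.Routing;

    public static class RobotsRedirectRoutes
    {
        public static void RegisterRoutes(RouteCollection routes)
        {
            // "{language}" matches culture segments such as "en-in" or "de-de".
            routes.MapRoute(
                name: "CultureRobotsTxt",
                url: "{language}/robots.txt",
                defaults: new { controller = "RobotsRedirect", action = "Index" });
        }
    }

    public class RobotsRedirectController : Controller
    {
        // 301 to the root robots.txt, the only location crawlers actually read.
        public ActionResult Index()
        {
            return RedirectPermanent("/robots.txt");
        }
    }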