We are using the “POSSIBLE.RobotsTxtHandler” plugin in our application. The robots.txt file works only for the main site URL, like “https://****.com/robots.txt”, but not for culture-specific URLs like “https://******.com/en-in/robots.txt” or “https://******.com/de-de/robots.txt”, which take us to a 404 page.
We have a multi-site solution and the plugin works fine for all sites, but we need to manage robots.txt at the regional level for each regional site. Is this possible? How can we achieve it, or is there a plugin that supports it?
I followed a couple of the links below, but nothing worked:
https://world.optimizely.com/blogs/giuliano-dore/dates/2020/10/how-to-create-a-simple-robots-txt-handler-for-a-multi-site-episerver-project/
https://blog.nicolaayan.com/2018/11/a-simple-editable-robots-txt-in-episerver/
Please assist with this.
I'm not sure about the POSSIBLE.RobotsTxtHandler, but if you make no headway with it, why not have a simple editable textarea string on a page that's common to all your regions (the homepage, for example)? Then have your ActionResult, regardless of which language it's accessed in, iterate through all language variants of the homepage to build a complete robots.txt.
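To illustrate just the merge step (not Episerver-specific code): this is a minimal, language-agnostic sketch, where the per-language disallow lists are hypothetical placeholders standing in for whatever the editors type into the textarea on each language variant of the homepage.

```python
# Hypothetical per-language disallow rules, as if read from an editable
# textarea field on each language variant of the homepage.
language_rules = {
    "en-in": ["/en-in/checkout/", "/en-in/search/"],
    "de-de": ["/de-de/kasse/", "/de-de/suche/"],
}

def build_robots_txt(rules_by_language):
    """Merge every language variant's rules into one root-level robots.txt."""
    lines = ["User-agent: *"]
    for language, paths in sorted(rules_by_language.items()):
        lines.append(f"# rules from the {language} homepage variant")
        lines.extend(f"Disallow: {path}" for path in paths)
    return "\n".join(lines) + "\n"

print(build_robots_txt(language_rules))
```

The point is that there is still only one robots.txt, served from the domain root; the per-language content is merged into it rather than served from per-culture URLs.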
I'm confused by your requirement.
Robots.txt files are not designed to be culture-specific or multi-lingual. They should always be located at the root of your domain. Google (and other search engines) will not read them from any other location, such as /de-de/robots.txt.
Because of the above, the POSSIBLE.RobotsTxtHandler doesn't support what you're asking for. It does handle multi-site scenarios, where you have different domains and can serve different robots.txt content per domain.