November Happy Hour will be moved to Thursday December 5th.
Hi Sandeep,
You can add rewrite rules:

<!-- Rewrite robots file requests for sites -->
<rule name="Robots" enabled="true" stopProcessing="true" patternSyntax="Wildcard" xdt:Transform="Insert">
  <match url="robots.txt" />
  <conditions>
    <add input="{HTTP_HOST}" pattern="www.*.*" ignoreCase="true" />
  </conditions>
  <action type="Rewrite" url="/Resources/Robots/robots.{C:1}.txt" />
</rule>
You will have
/resources/Robots/robots.abc.txt
/resources/Robots/robots.def.txt
so if a user comes to www.abc.com/robots.txt the request will be rewritten to /Resources/Robots/robots.abc.txt.
This way it will resolve your physical file issue.
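For context, a rule with xdt:Transform="Insert" like the one above lives in a config transform file (e.g. web.Release.config). A minimal sketch of the surrounding structure, assuming the standard IIS URL Rewrite section layout:

```xml
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.webServer>
    <rewrite>
      <rules>
        <!-- the "Robots" rule shown above is inserted here -->
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

The xdt namespace declaration on the root element is what makes the xdt:Transform attribute take effect when the transform is applied at deploy time.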
Cheers
Ali, Murtaza
For a robots.txt file there is an option to use the plug-in POSSIBLE.RobotsTxtHandler, which lets the file be configured dynamically for each site in your solution rather than using static files. It gives the site administrators a text input field to manage the content of the file.
If the administrators of your sites have requirements beyond a simple text entry field then you would need to develop a bespoke handler, but in most circumstances the POSSIBLE.RobotsTxtHandler plug-in will suffice.
You don't need a physical file at all. You can create a controller and map it to whatever URL you want, then put logic inside the controller/action to render different content depending on the current domain/website.
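As a rough sketch of that idea, assuming plain ASP.NET MVC (the controller name, route, and hard-coded dictionary below are all illustrative; in a real solution the content would more likely come from site settings in the CMS):

```csharp
using System;
using System.Collections.Generic;
using System.Web.Mvc;

public class RobotsController : Controller
{
    // Hypothetical per-host content; replace with CMS-driven settings in practice.
    private static readonly Dictionary<string, string> RobotsByHost =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            { "www.abc.com", "User-agent: *\nDisallow: /internal/" },
            { "www.def.com", "User-agent: *\nDisallow: /" }
        };

    public ActionResult Index()
    {
        var host = Request.Url != null ? Request.Url.Host : string.Empty;

        string content;
        if (!RobotsByHost.TryGetValue(host, out content))
        {
            // Fallback for unknown hosts: allow everything.
            content = "User-agent: *\nDisallow:";
        }

        return Content(content, "text/plain");
    }
}
```

You would then map "robots.txt" to this controller in your route registration, e.g. routes.MapRoute("Robots", "robots.txt", new { controller = "Robots", action = "Index" }); note that requests ending in .txt may also need IIS configured to hand them to the managed pipeline.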
Hey Sandeep
By the sounds of it POSSIBLE.RobotsTxtHandler will do exactly what you need (as Darren has described above). Just install it and go, no dev time needed apart from the install itself.
https://nuget.episerver.com/package/?id=POSSIBLE.RobotsTxtHandler
David
We have a website we are building which is deployed to the DXC and has multiple sites using different domains -
www.abc.com
www.def.com
...
and so on
All the sites use the same source code; we have created various start/home pages in the CMS, and each site points to its own home page.
All seems to be working fine. We have, however, come across an issue with actual physical files in the code which need to be different.
We have to add the robots.txt file to the project, but this needs to be unique to each site/domain we have created.
In a normal scenario, as this is a physical file in the root folder, it is quite simple to add.
However, in this case where we need to keep multiple versions, it is not possible as we have only one common code base.
Could someone please advise the quickest way forward to add this at each domain level?
Kind Regards
Sandeep