
DXC - Different physical files for multiple sites using same source code - robots.txt


We are building a website which is deployed to the DXC and has multiple sites using different domains -


and so on 

All the sites use the same source code. We have created various start/home pages in the CMS, and each site points to its own home page.

All seems to be working fine. However, we have come across an issue with physical files in the code that need to be different per site.

We have to add a robots.txt file to the project, but this needs to be unique to each site/domain we have created.

In a normal scenario, as this is a physical file in the root folder, it is quite simple to add.

However, in this case, where we need to keep multiple versions, that is not possible as we have only one common code base.

Could someone please advise the quickest way forward to add this at each domain level?

Kind Regards


Edited, Jun 15, 2018 1:10

Hi Sandeep,

You can add rewrite rules to your web.config transform, for example:

  <!-- Rewrite robots.txt requests to a per-site file -->
  <rule name="Robots" enabled="true" stopProcessing="true" patternSyntax="Wildcard" xdt:Transform="Insert">
    <match url="robots.txt" />
    <conditions>
      <add input="{HTTP_HOST}" pattern="www.*.*" ignoreCase="true" />
    </conditions>
    <action type="Rewrite" url="/Resources/Robots/robots.{C:1}.txt" />
  </rule>
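For context, a rule like this lives inside the standard IIS URL Rewrite section of system.webServer; a sketch of the surrounding structure (assuming the usual layout of a web.config transform):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- the Robots rule goes here -->
    </rules>
  </rewrite>
</system.webServer>
```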

You will then have one robots file per site under /Resources/Robots, named after the captured part of the host name. For example (hypothetical domain), a request to www.example.com/robots.txt matches the condition with {C:1} = "example" and is rewritten to /Resources/Robots/robots.example.txt.

This way it will resolve your physical file issue.


Ali, Murtaza

Jun 15, 2018 6:34

For a robots.txt file there is an option to use the POSSIBLE.RobotsTxtHandler plug-in, which allows the content to be configured dynamically for each site in your solution rather than using static files. It presents site administrators with a text input field to manage the content of the file.

If the administrators of your sites have requirements beyond a simple text entry field then you would need to develop a bespoke handler, but in most circumstances the POSSIBLE.RobotsTxtHandler plug-in will suffice.

Edited, Jun 15, 2018 10:12
Hi Murtaza

Thanks for the reply.

Will the crawlers be able to read the file if there is a redirect in place? I am not sure. Also, we need to place some other config and feed files in the root which are read by third-party apps. We have seen issues with automated programs reading files when redirects are in place.

Is there any other option?

We could actually write code to dynamically serve the output via a controller, but we would want to avoid that as it would create a lot of work for something very simple.

Kind Regards

Sandeep
Jun 15, 2018 10:16

You don't need a physical file at all. You can create a controller and map it to whatever URL you want. Then you can put logic inside the controller/action to render different content depending on the current domain/website.
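A minimal sketch of that approach in ASP.NET MVC. The controller name, route, and per-host dictionary below are illustrative assumptions, not part of any package; in a real solution the content would more likely come from site settings or CMS content:

```csharp
using System.Collections.Generic;
using System.Web.Mvc;

// Hypothetical controller. Map it to the robots.txt URL with a route such as:
//   routes.MapRoute("Robots", "robots.txt",
//       new { controller = "Robots", action = "Index" });
public class RobotsController : Controller
{
    // Illustrative per-host content (hypothetical domains)
    private static readonly Dictionary<string, string> RobotsByHost =
        new Dictionary<string, string>
        {
            { "www.site-one.com", "User-agent: *\nDisallow: /admin/" },
            { "www.site-two.com", "User-agent: *\nDisallow: /" }
        };

    public ContentResult Index()
    {
        string host = Request.Url.Host.ToLowerInvariant();
        string robots;
        if (!RobotsByHost.TryGetValue(host, out robots))
        {
            // Safe default for unknown hosts: allow everything
            robots = "User-agent: *\nDisallow:";
        }
        // Serve as plain text, as crawlers expect
        return Content(robots, "text/plain");
    }
}
```

Because the controller responds directly at /robots.txt, no redirect is involved, which avoids the crawler concerns mentioned above.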

Jun 15, 2018 13:03
Thanks Johan

I had just mentioned the same in my comment above. I was hoping for some easy cheat which would save some dev time.

Kind Regards

Sandeep
Jun 15, 2018 22:16

Hey Sandeep

By the sounds of it, POSSIBLE.RobotsTxtHandler will do exactly what you need (as Darren has described above). Just install it and go; no dev time needed apart from the install itself.


Jun 17, 2018 21:01