
DXC - Different physical files for multiple sites using same source code - robots.txt


We are building a website which is deployed to the DXC and has multiple sites using different domains -

www.abc.com

www.def.com

...

and so on 

All the sites use the same source code. We have created various start/home pages in the CMS, and each site points to its own home page.

All seems to be working fine. However, we have come across an issue with actual physical files in the code which need to be different per site.

We have to add the robots.txt file to the project, but this needs to be unique to each site/domain we have created.

In a normal scenario, as this is a physical file in the root folder, it is quite simple to add.

However, in this case where we need to keep multiple versions, it is not possible as we have only one common code base.

Could someone please advise on the quickest way forward to add this at each domain level?

Kind Regards

Sandeep

#194174
Edited, Jun 15, 2018 1:10

Hi Sandeep,

You can add rewrite rules, for example:

  <!-- Rewrite robots file requests for sites -->
  <rule name="Robots" enabled="true" stopProcessing="true" patternSyntax="Wildcard" xdt:Transform="Insert">
    <match url="robots.txt" />
    <conditions>
      <add input="{HTTP_HOST}" pattern="www.*.*" ignoreCase="true" />
    </conditions>
    <action type="Rewrite" url="/Resources/Robots/robots.{C:1}.txt" />
  </rule>
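
Note that the xdt:Transform="Insert" attribute means this rule is intended for a config transform (e.g. web.Release.config) and should sit inside the <system.webServer><rewrite><rules> element; it also relies on the IIS URL Rewrite module, which should be available in the Azure Web Apps environment that DXC runs on.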

You will have

/Resources/Robots/robots.abc.txt

/Resources/Robots/robots.def.txt

so if a user requests www.abc.com/robots.txt, the request is rewritten server-side to /Resources/Robots/robots.abc.txt (no redirect involved).

This way it will resolve your physical file issue.

Cheers

Ali, Murtaza

#194176
Jun 15, 2018 6:34

For a robots.txt file there is the option to use the POSSIBLE.RobotsTxtHandler plug-in, which allows this to be configured dynamically for each site in your solution rather than using static files. It presents site administrators with a text input field to manage the content of the file.

If the administrators of your sites have requirements beyond a simple text entry field then you would need to develop a bespoke handler, but in most circumstances the POSSIBLE.RobotsTxtHandler plug-in will suffice.

#194183
Edited, Jun 15, 2018 10:12
Hi Murtaza

Thanks for the reply.

Will the crawlers be able to read the file if there is a redirect in place? I am not sure. Also, we need to place some other config and feed files in the root which are read by third-party apps. We have seen issues with automated programs reading the file if there are redirects in place.

Is there any other option?

We can actually write code to dynamically show output via a controller, but we would want to avoid that as it would create a lot of work for something very simple.

Kind Regards

Sandeep
#194184
Jun 15, 2018 10:16

You don't need a physical file at all. You can create a controller and map it to whatever URL you want. Then you can put logic inside the controller/action to render different content depending on the current domain/website.
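
For example, here is a minimal sketch using plain ASP.NET MVC (no Episerver-specific APIs; the route name, controller name, host checks and robots rules below are illustrative assumptions, not a prescribed implementation):

  // RobotsController.cs
  using System.Web.Mvc;

  // Returns different robots.txt content depending on which domain the request came in on.
  public class RobotsController : Controller
  {
      public ActionResult Index()
      {
          var host = Request.Url.Host.ToLowerInvariant();

          string robots;
          if (host.EndsWith("abc.com"))
              robots = "User-agent: *\r\nDisallow: /private/";   // placeholder rules
          else if (host.EndsWith("def.com"))
              robots = "User-agent: *\r\nDisallow: /";
          else
              robots = "User-agent: *\r\nDisallow:";

          return Content(robots, "text/plain");
      }
  }

  // RouteConfig.cs (inside RegisterRoutes) - map /robots.txt to the controller.
  // There must be no physical robots.txt on disk; depending on your IIS settings you may also
  // need runAllManagedModulesForAllRequests="true" so *.txt requests reach MVC routing.
  routes.MapRoute(
      name: "RobotsTxt",
      url: "robots.txt",
      defaults: new { controller = "Robots", action = "Index" });

In an Episerver solution you could of course load the per-site text from a setting or content property instead of hard-coding it.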

#194194
Jun 15, 2018 13:03
Thanks Johan

I had just mentioned the same in my comment above. I was hoping for some easy cheat which would save some dev time.

Kind Regards

Sandeep
#194218
Jun 15, 2018 22:16

Hey Sandeep

By the sounds of it, POSSIBLE.RobotsTxtHandler will do exactly what you need (as Darren has described above). Just install it and go; no dev time is needed apart from the install itself.

https://nuget.episerver.com/package/?id=POSSIBLE.RobotsTxtHandler
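
Assuming the Episerver NuGet feed is already configured as a package source in your solution, installing it is a single command in the Package Manager Console: Install-Package POSSIBLE.RobotsTxtHandler.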

David

#194244
Jun 17, 2018 21:01