
Mark Stott
Dec 15, 2021

A Robots.Txt Handler for Optimizely CMS 12

Stott.Optimizely.RobotsHandler v1.0.1 Released

Stott.Optimizely.RobotsHandler is a new robots.txt handler for Optimizely CMS 12 that fully supports multi-site builds, allowing different robots.txt content to be delivered per site. This package is inspired by the work previously delivered with POSSIBLE.RobotsTxtHandler, which was built for CMS 11. Stott.Optimizely.RobotsHandler has been built from the ground up, initially as a learning exercise, building on lessons learned in my previous post: Extending The Admin Interface in Optimizely CMS 12.

The Interface

The interface is built using a standard .NET 5.0 MVC controller with a Razor view and a small supporting JS file, compiled together as a Razor Class Library. The benefit of building this as a Razor Class Library is that NuGet only needs to deliver the DLL, keeping your solution otherwise clean of any artefacts from the package.
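For context, a Razor Class Library that ships MVC controllers and views is typically configured with a project file along these lines (a minimal sketch for illustration, not the package's actual project file):

<Project Sdk="Microsoft.NET.Sdk.Razor">
  <PropertyGroup>
    <TargetFramework>net5.0</TargetFramework>
    <!-- Allows the library to contain MVC controllers and Razor views. -->
    <AddRazorSupportForMvc>true</AddRazorSupportForMvc>
  </PropertyGroup>
  <ItemGroup>
    <!-- Shared framework reference for ASP.NET Core types. -->
    <FrameworkReference Include="Microsoft.AspNetCore.App" />
  </ItemGroup>
</Project>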

The UI is a single page using Bootstrap 5.0 and jQuery that renders a complete list of all sites configured within the CMS instance.

The content of the robots.txt for any site is shown in a modal dialog and saved via an API call that stores the robots.txt content in the Dynamic Data Store. Custom tables have not been used, as the expectation is that there will be a 1-to-1 relationship between sites and their robots.txt content.
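For illustration, a record in the Dynamic Data Store can be modelled along these lines; the type and property names here are hypothetical, not the package's actual model:

using System;
using EPiServer.Data;
using EPiServer.Data.Dynamic;

// Hypothetical DDS model for illustration only.
[EPiServerDataStore(AutomaticallyCreateStore = true, AutomaticallyRemapStore = true)]
public class RobotsContentRecord : IDynamicData
{
    public Identity Id { get; set; }

    // The site this robots.txt content belongs to (1-to-1 relationship).
    public Guid SiteId { get; set; }

    // The raw robots.txt content for the site.
    public string RobotsContent { get; set; }
}

Saving a record is then a matter of resolving the store for the type and calling Save:

var store = typeof(RobotsContentRecord).GetOrCreateStore();
store.Save(record);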

Installation & Configuration

Installation is straightforward: the Stott.Optimizely.RobotsHandler package can be installed either from the Optimizely NuGet feed or from the nuget.org feed. You will then need to add the following lines to the Startup class in your .NET 5.0 solution:

public void ConfigureServices(IServiceCollection services)
{
    services.AddRazorPages();
    services.AddRobotsHandler();
}

The call to services.AddRazorPages() is a standard .NET 5.0 call that ensures Razor Pages are included in your solution.

The call to services.AddRobotsHandler() sets up the dependency injection requirements for the RobotsHandler solution and is required for the solution to work as intended. This follows the service collection extension pattern defined by Microsoft.
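As a sketch, an extension method following that pattern looks like the following; the service names are illustrative, as the package's actual registrations are internal to Stott.Optimizely.RobotsHandler:

using Microsoft.Extensions.DependencyInjection;

public static class RobotsHandlerServiceExtensions
{
    public static IServiceCollection AddRobotsHandler(this IServiceCollection services)
    {
        // Hypothetical registrations for illustration only.
        services.AddScoped<IRobotsContentService, RobotsContentService>();
        services.AddScoped<IRobotsContentRepository, RobotsContentRepository>();

        return services;
    }
}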

Resolving robots.txt content

A standard controller is configured to respond to requests for www.example.com/robots.txt. On receiving a request, the controller interrogates the domain of the request, uses this to resolve the relevant site, and then returns the robots.txt content for that site. If no content has previously been defined for the site, then the following default content will be returned:

User-agent: *
Disallow: /episerver/
Disallow: /utils/
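A simplified version of such a controller might look like this; the names and lookup logic are illustrative, not the package's actual implementation:

using EPiServer.Web;
using Microsoft.AspNetCore.Mvc;

public class RobotsTxtController : Controller
{
    private const string DefaultRobotsContent =
        "User-agent: *\nDisallow: /episerver/\nDisallow: /utils/";

    private readonly ISiteDefinitionResolver _siteDefinitionResolver;

    public RobotsTxtController(ISiteDefinitionResolver siteDefinitionResolver)
    {
        _siteDefinitionResolver = siteDefinitionResolver;
    }

    [HttpGet]
    [Route("robots.txt")]
    public IActionResult Index()
    {
        // Resolve the site definition that matches the requesting domain.
        var site = _siteDefinitionResolver.GetByHostname(Request.Host.Host, true);

        // GetRobotsContentForSite is a hypothetical lookup against stored content.
        var content = GetRobotsContentForSite(site) ?? DefaultRobotsContent;

        return Content(content, "text/plain");
    }

    private string GetRobotsContentForSite(SiteDefinition site)
    {
        // Placeholder: the real package loads this from the Dynamic Data Store.
        return null;
    }
}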

Contributing and Licensing

Stott.Optimizely.RobotsHandler is open source and uses the MIT licence. If you find any defects with the package, then please log them as issues on the repository's issues page. If you would like to contribute changes to the solution, then feel free to clone the repository and submit a pull request against the develop branch.


Comments

Per Nergård | Jun 15, 2023 03:06 PM

This was a nice one. Worked like a charm!
