Daniel Österhof
Jun 21, 2011

Ensure correct Google indexing when using multiple domain names

It is very common today for a website to respond to multiple domain names. In fact, this is a best practice among marketing organizations as well as the leading hosting providers in the market. For marketing-focused organizations, it is about driving traffic to the site and promoting brands: many of our Everweb customers have a corporate website that responds to the corporate name, but they also market their individual brands under their own domains. Another common scenario for hosting providers is to use an internal name for monitoring, to verify that the site is available. EPiServer Everweb uses this practice itself, since we know that keeping websites and applications available to visitors is key for our customers.

These practices have implications for your search strategy, however. When you use more than one domain name, I recommend making sure that the Google indexing engine, or any other relevant search index, picks up the content from the correct domain name. You probably do not want a brand domain to get a higher (Google) ranking than the name of the corporation, and you certainly do not want an internal name used for monitoring to outrank the name of your company.
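A quick way to check which of your host names Google has actually indexed is the site: search operator, for example site:www.example.com (where www.example.com is a placeholder for your own domain).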

Luckily, Google and Microsoft have a neat set of tools to prevent incorrect ranking of your domain names and, even better, to correct it if something goes wrong.

How to prevent certain domains from getting indexed

With robots.txt you can control much of how Google indexes your website. Since robots.txt is normally a static text file, we need a different approach to make it dynamic, so that it delivers different content depending on the requested domain name.

This is what your robots.txt could look like. Change allowHost to the host name you want indexed.

<%@ WebHandler Language="C#" Class="MyNamespace.RobotsHandler" %>

using System;
using System.Web;

namespace MyNamespace
{
    public class RobotsHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            Uri requestUrl = context.Request.Url;
            // Set this to the host name you want indexed, e.g. "www.example.com"
            string allowHost = "";

            context.Response.ContentType = "text/plain";
            context.Response.Write("User-agent: *\n");

            // Compare host names case-insensitively
            if (string.Equals(requestUrl.Host, allowHost, StringComparison.OrdinalIgnoreCase))
            {
                // Allow indexing for the preferred host
                context.Response.Write("Allow: /\n");
            }
            else
            {
                // Disallow indexing for all other hosts
                context.Response.Write("Disallow: /\n");
            }
        }

        public bool IsReusable
        {
            get { return false; }
        }
    }
}
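To make the behavior concrete: with allowHost set to "www.example.com" (a placeholder), a request for robots.txt on that host returns

User-agent: *
Allow: /

while a request on any other host name returns

User-agent: *
Disallow: /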

 

In web.config, under <system.webServer> and <handlers>, add the following:

<add name="SimpleHandlerFactory" path="robots.txt" verb="*" type="System.Web.UI.SimpleHandlerFactory" resourceType="Unspecified" preCondition="integratedMode" />

In web.config, under <system.web> and <compilation>, add the following:

<buildProviders>
   <add extension=".txt" type="System.Web.Compilation.WebHandlerBuildProvider" />
</buildProviders>
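Put together, the relevant parts of web.config look roughly like this (a minimal sketch; your actual web.config will contain many more entries in each section):

<configuration>
  <system.web>
    <compilation>
      <buildProviders>
        <add extension=".txt" type="System.Web.Compilation.WebHandlerBuildProvider" />
      </buildProviders>
    </compilation>
  </system.web>
  <system.webServer>
    <handlers>
      <add name="SimpleHandlerFactory" path="robots.txt" verb="*"
           type="System.Web.UI.SimpleHandlerFactory"
           resourceType="Unspecified" preCondition="integratedMode" />
    </handlers>
  </system.webServer>
</configuration>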

This example can be customized to allow more than one domain name, as sketched below.
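For instance, here is a sketch of the same ProcessRequest method extended to a set of allowed host names (the host names below are placeholders, not recommendations):

public void ProcessRequest(HttpContext context)
{
    // Placeholder host names - replace with the domains you want indexed
    string[] allowedHosts = { "www.example.com", "www.examplebrand.com" };

    context.Response.ContentType = "text/plain";
    context.Response.Write("User-agent: *\n");

    // Check the requested host against each allowed host name
    bool allowed = false;
    foreach (string host in allowedHosts)
    {
        if (string.Equals(context.Request.Url.Host, host, StringComparison.OrdinalIgnoreCase))
        {
            allowed = true;
            break;
        }
    }

    // Allow the preferred hosts, disallow everything else
    context.Response.Write(allowed ? "Allow: /\n" : "Disallow: /\n");
}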

 

You can find more information here: http://www.kleenecode.net/2007/11/17/dynamic-robotstxt-with-aspnet-20/

 

 

Changing incorrectly indexed domain names

If you have a domain name that has been indexed and you want it removed, Google provides tools for this as well.

The tools are called Google Webmaster Tools and can be found here:

http://www.google.com/webmasters/tools/

In Webmaster Tools you will find instructions on how to connect your domains to your Google account, and on how to request removal of indexed content.
