Ensure correct Google indexing when using multiple domain names
It is common today for a website to respond to multiple domain names. In fact, this is a best practice among marketing-focused organizations as well as the leading hosting providers in the market. For marketing-focused organizations, it is about driving traffic to the site and promoting individual brands: many of our Everweb customers have a corporate website that responds to the corporate name, but they also market their individual brands under separate domains. Another common scenario is for hosting providers to use an internal name for monitoring, to verify that the site is available. EPiServer Everweb uses this practice, since keeping websites and applications available to visitors is key for our customers.
These practices have implications for your search strategy, however. When using more than one domain name, I recommend making sure that the Google indexing engine, or any other relevant search index, picks up the content from the correct domain name. You probably do not want a domain name used for a brand to get a higher (Google) ranking than the name of the corporation, and you certainly do not want an internal name used for monitoring to rank higher than the name of your company.
Luckily, Google and Microsoft provide a neat set of tools both to prevent incorrect ranking of your domain names and, even better, to correct it if something goes wrong.
How to prevent certain domains from getting indexed
With robots.txt you can control much of how Google indexes your website. Since robots.txt is normally a static text file, we need a different approach to make it dynamic, so that it delivers different content depending on the requested domain name.
This is what your robots.txt could look like. Set AllowUrl to the host name you want indexed (for example "www.example.com").
<%@ WebHandler Language="C#" Class="MyNamespace.RobotsHandler" %>
using System;
using System.Web;

namespace MyNamespace
{
    public class RobotsHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            Uri RequestUrl = context.Request.Url;

            // The one host name that is allowed to be indexed; change this to your own domain.
            string AllowUrl = "www.example.com";

            context.Response.ContentType = "text/plain";
            context.Response.Write("User-agent: *\n");

            if (string.Equals(RequestUrl.Host, AllowUrl, StringComparison.OrdinalIgnoreCase))
            {
                // Allow indexing for the approved host name
                context.Response.Write("Allow: /\n");
            }
            else
            {
                // Disallow indexing for all other host names
                context.Response.Write("Disallow: /\n");
            }
        }

        public bool IsReusable
        {
            get { return false; }
        }
    }
}
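For example, with AllowUrl set to "www.example.com", a request for robots.txt on that host would return:

User-agent: *
Allow: /

while a request on any other host name, such as an internal monitoring name, would return:

User-agent: *
Disallow: /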
In web.config, under <system.webServer> and <handlers>, add the following handler mapping, which routes requests for robots.txt through the ASP.NET pipeline:
<add name="SimpleHandlerFactory" path="robots.txt" verb="*" type="System.Web.UI.SimpleHandlerFactory" resourceType="Unspecified" preCondition="integratedMode" />
In web.config, under <system.web> and <compilation>, add the following build provider, which tells ASP.NET to compile .txt files as web handlers so that the <%@ WebHandler %> directive inside robots.txt is picked up:
<buildProviders>
<add extension=".txt" type="System.Web.Compilation.WebHandlerBuildProvider" />
</buildProviders>
This example can be customized to allow more than one domain name, as shown in the sketch below.
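A minimal sketch of that customization, assuming the allowed host names can be hard-coded in a set (the host names below are placeholders); the <%@ WebHandler %> directive and the web.config entries stay the same as above:

using System;
using System.Collections.Generic;
using System.Web;

namespace MyNamespace
{
    public class RobotsHandler : IHttpHandler
    {
        // Placeholder host names that are allowed to be indexed.
        private static readonly HashSet<string> AllowedHosts =
            new HashSet<string>(StringComparer.OrdinalIgnoreCase)
            {
                "www.example.com",
                "www.example-brand.com"
            };

        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "text/plain";
            context.Response.Write("User-agent: *\n");

            // Allow indexing only when the request arrived on an approved host name.
            if (AllowedHosts.Contains(context.Request.Url.Host))
            {
                context.Response.Write("Allow: /\n");
            }
            else
            {
                context.Response.Write("Disallow: /\n");
            }
        }

        public bool IsReusable
        {
            get { return false; }
        }
    }
}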
You can find more information here: http://www.kleenecode.net/2007/11/17/dynamic-robotstxt-with-aspnet-20/
Changing incorrectly indexed domain names
If you have a domain name that has been indexed and you want it removed, Google provides tools for this as well.
The tools are called Google Webmaster Tools and can be found here:
http://www.google.com/webmasters/tools/
In Webmaster Tools you will find instructions on how to verify ownership of your domains with your Google account and how to request removal of indexed content.