There's no out-of-the-box functionality for sitemaps. Are you sure you haven't installed the Geta Sitemaps plugin or something similar? Or are you referring to something different? If so, can you expand?
Sorry Scott, perhaps I'm confusing things by talking about sitemaps. That's a side issue really.
Our instance of EpiServer has a checkbox in the "SEO" tab, under the Meta Description field. The box is labelled "Disable Search Engine Indexing".
When checked, the page doesn't appear in search engines. But I cannot figure out how, i.e. what this checkbox is doing to the page to prevent it from being indexed. It doesn't add the page to the site's robots.txt file, and it doesn't add a meta robots directive - so I'm wondering how it's noindexing it?
Obviously there are no such fields in a standard blank site. I've just checked the Alloy demo and I can't find this field at all, so I would suggest it's been custom built or added via a package. If there's nothing in robots.txt, I'd suggest checking for meta tags in the HTML head.
I suspect you are using Alloy Demo Kit (https://github.com/episerver/AlloyDemoKit), which has a similar function. As Scott says, this is a simple implementation.
If you are interested in how it works, please see the code below, which injects meta robots values into the HTML head:
@if (Model.CurrentPage.DisableIndexing)
{
    <meta name="robots" content="NOINDEX, NOFOLLOW">
}
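For completeness, a checkbox like that is normally just a boolean property defined on a base page type, which the view above then reads. A rough sketch of how such a property might be declared is below - the property name matches the Razor snippet, but the class name, tab name, and ordering are assumptions for illustration, not the actual AlloyDemoKit code:

```csharp
using EPiServer.Core;
using System.ComponentModel.DataAnnotations;

// Hypothetical base page type; in a real site this would be the shared
// base class that all indexable pages inherit from.
public class SitePageData : PageData
{
    // Renders the checkbox in edit mode. GroupName places it on the
    // "SEO" tab; Order positions it below the Meta Description field.
    [Display(
        Name = "Disable Search Engine Indexing",
        GroupName = "SEO",
        Order = 200)]
    public virtual bool DisableIndexing { get; set; }
}
```

Because the property only ever affects the rendered meta tag, nothing is written to robots.txt - which would explain why you couldn't find it there.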
Hello. Can anyone confirm how the functionality in EpiServer to hide/remove a page from the search engines works?
I can confirm it does work - but I cannot see how. It's not in the robots.txt file, and there doesn't appear to be a meta robots tag on the page, etc.
It does still put those pages in the sitemap (which seems odd... as it's effectively playing hide-and-seek with GoogleBot etc.)
Any light shed would be appreciated!