Cache and GoogleBot
In production we faced an interesting issue. Sometimes, on some pages, users experienced “something-not-working” errors. For example, an AJAX search button acted like a plain label – you could click it, but nothing happened. These errors appeared for short, unpredictable periods of time, affected all users simultaneously, and after that everything worked fine until the next such error.
So it looked like some kind of cache-related error, but how could an invalid page end up in the cache?
After some investigation we recalled that ASP.NET is a really smart thing. It doesn’t render things a browser doesn’t support. So when, for example, Google asks for our page, it receives a “light version” of the page without things like the __doPostBack implementation. And it’s really hard to do a postback without… hm… postback.
But why do our users see pages generated for Google?
It’s simple. Google – I like to say it IS Google, but actually it could be any user with one of those rarely used browsers that supports nothing beyond plain GET requests – asks for a page. If the page isn’t in the output cache, it is generated for that downlevel client and put into the output cache. And all users then see that “light” page until it expires.
Yes, I mean it: one user can break part of the site’s functionality for all the others!
OK, next in our show: how to reproduce the issue, and ways to fix it.
But first, to understand what’s going on, I highly recommend reading the great post The EPiServer CMS Output Cache Explained by Joel Abrahamsson.
Create a simple EPiServer site from the template and add a page that uses __doPostBack, like this one:
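A minimal sketch of such a page (names and markup are illustrative, not from the original post; a real EPiServer template would inherit one of EPiServer’s template base classes): a label that shows the render time, plus a LinkButton, which ASP.NET implements on the client via __doPostBack.

```aspx
<%@ Page Language="C#" AutoEventWireup="true" %>

<script runat="server">
    void Page_Load(object sender, EventArgs e)
    {
        // shows when this HTML was generated, so a cached copy
        // is easy to tell apart from a freshly rendered one
        TimeLabel.Text = DateTime.Now.ToString("HH:mm:ss");
    }
</script>

<html>
<body>
  <form id="form1" runat="server">
    <asp:Label ID="TimeLabel" runat="server" />
    <!-- for uplevel browsers this renders as
         <a href="javascript:__doPostBack('RefreshButton','')">...</a> -->
    <asp:LinkButton ID="RefreshButton" runat="server" Text="Update time" />
  </form>
</body>
</html>
```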
Turn on caching in episerver.config, in the sites/site/siteSettings node:
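The relevant attributes look roughly like this (values are illustrative; the attribute names are the ones described in Joel’s post, other siteSettings attributes omitted):

```xml
<siteSettings
    httpCacheability="Public"
    httpCacheExpiration="0:10:0"
    httpCacheVaryByCustom="path"
    httpCacheVaryByParams="id,epslanguage" />
```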
Run the site, create the page, and open it in a new tab (caching doesn’t work in Preview mode). Log off (pages are cached only for anonymous users). Ensure the page is cached – click Refresh and see that the page generation time doesn’t change. Ensure the LinkButton works – the time updates on click.
Install the User Agent Switcher add-on for Firefox: https://addons.mozilla.org/en-us/firefox/addon/user-agent-switcher/. In Firefox, select Tools – Default User Agent – Search Robots – Googlebot.
Restart web site. We’re ready.
Open the page as “Googlebot” in Firefox (you are “Google” now). Try to click the button and see that it doesn’t work.
Open the same page in another browser, say IE (now you are a “normal user”) – and see that the button doesn’t work there either! You’ve been served the raw page generated for Google! Weird.
You can wait until the cache expires and reload the page in IE – it will work then.
Q. E. D.
How to fix
There are several ways to avoid this behavior.
- Disable the output cache.
It will work for small sites, but not in the real world.
- Use the native ASP.NET httpCacheVaryByCustom="browser" setting.
There are several side effects. First of all, the page will be cached separately for each browser and major version combination, which will dramatically increase the memory used by the server output cache. Another thing is that EPiServer may rely on the “path” value here to vary the cache by URL path, so replacing it with “browser” can interfere with that.
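If you want both behaviors, one option is to override GetVaryByCustomString and combine EPiServer’s path-based key with the browser. This is a hypothetical sketch, not tested against any particular EPiServer version; that the site’s Global inherits EPiServer.Global and that the base class handles the “path” argument are assumptions on my part:

```csharp
// Global.asax.cs – hypothetical sketch
public class Global : EPiServer.Global
{
    public override string GetVaryByCustomString(System.Web.HttpContext context, string arg)
    {
        // let EPiServer build its part of the cache key (e.g. for "path")
        string key = base.GetVaryByCustomString(context, arg) ?? string.Empty;

        // additionally vary by browser name and major version, so a downlevel
        // client (like a crawler) gets its own cache entry instead of
        // poisoning the entry served to everyone else
        return key + "|" + context.Request.Browser.Browser
                   + context.Request.Browser.MajorVersion;
    }
}
```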
- Provide default capabilities for all browsers in an App_Browsers\*.browser file, something like this one:
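For example (the capability names are standard ASP.NET browser capabilities; the exact set you need may differ – this is a sketch of the idea, not the post’s original file):

```xml
<!-- App_Browsers\Default.browser – raise the default capabilities so that
     every client, crawlers included, is treated as script-capable -->
<browsers>
  <browser refID="Default">
    <capabilities>
      <capability name="ecmascriptversion" value="3.0" />
      <capability name="javascript"        value="true" />
      <capability name="w3cdomversion"     value="1.0" />
      <capability name="supportsCallback"  value="true" />
    </capabilities>
  </browser>
</browsers>
```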
This will force ASP.NET to render the same HTML for all browsers.
Do you know a better solution?