By caching, do you mean the case where newly added content is available on the second server, but that server hits the database instead of the cache?
Hi Valdis, almost. The content is not cached, and a lot of the content on these pages is fetched with a service call (the data from the service is what the scheduled job fetches). What happens is that when a visitor navigates to a page with this data, instead of getting it from the cache, the site has to fetch it with a new call to the service.
The cache in the CMS works so that (somewhat simplified) when a request comes into ContentRepository, it checks whether the item is present in the cache. If so, it is delivered from the cache; if not, it is loaded from the database, added to the cache and then returned.
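To illustrate that read-through (cache-aside) behaviour in general terms, here is a minimal sketch using the standard .NET MemoryCache. It is not EPiServer's actual implementation; the cache key, the loader delegate and the expiration policy are all illustrative assumptions.

```csharp
using System;
using System.Runtime.Caching;

public class ReadThroughCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Returns the item from the cache if present; otherwise loads it,
    // inserts it into the cache and returns it (cache-aside pattern).
    public object Get(string cacheKey, Func<object> loadFromDatabase)
    {
        var cached = Cache.Get(cacheKey);
        if (cached != null)
        {
            return cached;               // cache hit: served without touching the db
        }

        var item = loadFromDatabase();   // cache miss: load from the underlying store
        if (item != null)
        {
            Cache.Add(cacheKey, item, new CacheItemPolicy
            {
                SlidingExpiration = TimeSpan.FromMinutes(10) // illustrative policy only
            });
        }
        return item;
    }
}
```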
Regarding load balancing, only cache-evict messages are sent between the servers. That means if an item is updated on one server, that server sends messages to the other servers saying the item should be evicted, and the other servers then evict it from their caches. The item is not automatically repopulated into the cache, however; instead, the next request for the item causes it to be loaded and inserted into the cache.
If I understand you correctly, you would like to repopulate the cache (that is, reload the item) when an item is evicted? One way to do this in theory (I have not tested it) would be to hook up an event handler to the remote event "RaisedEvent", in the same way as the class RemoteCacheSynchronization does (see how it is done there through reflection). In the event handler, parse the key to find out which item the message is about (you can see how the keys are constructed from the methods in DataFactoryCache) and then load the item, causing it to get into the cache. Note that your event handler must be registered after the one added by RemoteCacheSynchronization (which can be done by accessing that instance from the IOC container before hooking up your own handler). A rough sketch follows below.
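A minimal, untested sketch of that idea, assuming the standard EPiServer remote-events API (Event.Get(...).Raised) and IContentRepository. The event GUID is a placeholder that must be replaced with the id RemoteCacheSynchronization actually uses, and the key-parsing helper is hypothetical; check DataFactoryCache for the real key format.

```csharp
using System;
using EPiServer;
using EPiServer.Core;
using EPiServer.Events;
using EPiServer.Events.Clients;
using EPiServer.ServiceLocation;

public static class CacheRepopulation
{
    // Placeholder: the id of the remote event that RemoteCacheSynchronization
    // listens to. Find the real value by inspecting that class (reflection or
    // a decompiler) as described above.
    private static readonly Guid CacheSyncEventId =
        new Guid("00000000-0000-0000-0000-000000000000");

    // Call this once at application start-up, e.g. from an initialization module,
    // after RemoteCacheSynchronization has attached its own handler (resolve its
    // instance from the IOC container first to force that ordering).
    public static void AttachHandler()
    {
        Event.Get(CacheSyncEventId).Raised += OnCacheSyncRaised;
    }

    private static void OnCacheSyncRaised(object sender, EventNotificationEventArgs e)
    {
        // e.Param carries the cache key being evicted. The exact payload and key
        // format are defined by RemoteCacheSynchronization/DataFactoryCache, so
        // the parsing below is only a guess that must be adjusted.
        var key = e.Param as string;
        if (key == null)
        {
            return;
        }

        ContentReference contentLink;
        if (TryParseContentLinkFromCacheKey(key, out contentLink))
        {
            // Loading the item through IContentRepository inserts it back into the
            // local cache on this server. TryGet avoids throwing for deleted items.
            var repository = ServiceLocator.Current.GetInstance<IContentRepository>();
            IContent content;
            repository.TryGet(contentLink, out content);
        }
    }

    // Hypothetical helper: extract the content id from the cache key.
    // Check how DataFactoryCache builds its keys and adjust accordingly.
    private static bool TryParseContentLinkFromCacheKey(string cacheKey, out ContentReference contentLink)
    {
        contentLink = null;
        var separator = cacheKey.LastIndexOf(':');
        return separator >= 0
            && ContentReference.TryParse(cacheKey.Substring(separator + 1), out contentLink);
    }
}
```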
Thanks Johan, then we assumed correctly and now know which approach to take to achieve this.
Hello, I have a question regarding running a scheduled job in a load-balanced environment. When the job runs on one server, the cache is built on that server; when a user reaches the second web server, this content is not cached. We would like to find a way to build the cache on the server that is not running the scheduled job.
We are using the EPiServer CacheManager, not the runtime cache. What we have seen so far is that the cache is only propagated to the second server when items are deleted, not when they are created.
(In our specific example we consume data from a service, create this content in EPiServer and display it as a list with a regular block. On the server running the scheduled job this info is cached, but not on the second server.)