This was easy to fix: just override SetCacheSettings (the overload that takes IEnumerable<GetChildrenReferenceResult>) in the provider:
protected override void SetCacheSettings(ContentReference contentReference, IEnumerable<GetChildrenReferenceResult> children, CacheSettings cacheSettings)
{
    // Set a low cache setting so new items are fetched from the data source, but keep the
    // items already fetched for a long time in the cache.
    cacheSettings.SlidingExpiration = TimeSpan.FromSeconds(30);
    base.SetCacheSettings(contentReference, children, cacheSettings);
}
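For the IContent objects themselves, the complementary approach would be to give them a longer expiration via the other overload. A minimal sketch, assuming ContentProvider also exposes an overload taking the IContent itself (the exact signature may differ between versions):

protected override void SetCacheSettings(IContent content, CacheSettings cacheSettings)
{
    // Assumed overload; keeps the expensive IContent objects cached for a long time,
    // while the children listing above uses a short sliding expiration.
    cacheSettings.SlidingExpiration = TimeSpan.FromHours(12);
    base.SetCacheSettings(content, cacheSettings);
}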
Hi Johan!
This works if you are on the same domain, as far as I can see. But we use it on a multisite setup with several domains and app pools; any idea how to handle that?
That should not really be a problem. Sure, there can be some discrepancy in the cache, but in my case the list methods in IContentLoader were only used in edit mode.
Hi,
I've created a custom content provider. I want to set a low cache setting on the list methods (e.g. GetChildren) but still keep all IContent objects in the cache, since constructing these objects is very expensive. Is there an easy way to do this with the built-in cache, or do I need to add caching between my content provider and the data source? I've been looking at
But that method seems to clear the IContent objects as well, not just the ContentReferences in the children list?
The reason for all this is that I don't know when new objects are added to or removed from the data source, but once they are added, they're pretty static.
Thanks!
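For reference, the "caching between my content provider and the data source" option mentioned above could look roughly like this; a minimal sketch using System.Runtime.Caching, where the cache key and the loadFromDataSource delegate are hypothetical placeholders, not EPiServer APIs:

using System;
using System.Runtime.Caching;

public class DataSourceCache
{
    private readonly MemoryCache _cache = MemoryCache.Default;

    // Returns the cached value for the key, or loads it from the data source
    // and caches it with a sliding expiration.
    public T GetOrAdd<T>(string key, Func<T> loadFromDataSource, TimeSpan slidingExpiration) where T : class
    {
        var cached = _cache.Get(key) as T;
        if (cached != null)
        {
            return cached;
        }

        var value = loadFromDataSource();
        _cache.Set(key, value, new CacheItemPolicy { SlidingExpiration = slidingExpiration });
        return value;
    }
}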