Cache Dynamic Data Store items
When I use DDS I always use a pattern where I have a public static Items implementation that I can run my queries against. This logic is placed inside a common class that all my DDS tables inherit from:
    public class BaseData<T> : IDynamicData, ISaveMe where T : BaseData<T>, new()
    {
        public EPiServer.Data.Identity Id { get; set; }

        // The backing store. Resolving it through DynamicDataStoreFactory is an
        // assumption here; the original snippet does not show this member.
        protected static DynamicDataStore Store
        {
            get
            {
                return DynamicDataStoreFactory.Instance.GetStore(typeof(T))
                    ?? DynamicDataStoreFactory.Instance.CreateStore(typeof(T));
            }
        }

        // Every query goes straight to the Dynamic Data Store.
        public static IOrderedQueryable<T> Items
        {
            get { return Store.Items<T>(); }
        }
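As an illustration, a concrete table then just inherits from the base. The Visitor class and the query below are made-up examples, not from the original post:

    // Hypothetical DDS table; inheriting BaseData<Visitor> gives it Id and Items.
    public class Visitor : BaseData<Visitor>
    {
        public string UserName { get; set; }
        public int VisitCount { get; set; }
    }

    // Queries go through the static Items property:
    var frequent = Visitor.Items.Where(v => v.VisitCount > 10).ToList();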
So when I needed to speed things up a bit, I changed my Items implementation to instead return an in-memory list of all my items:
    public static IQueryable<T> Items
    {
        get
        {
            // Serve from the in-memory list when caching is enabled for this type.
            if (Cache())
                return ItemsMemory;
            return Store.Items<T>();
        }
    }
    public static bool Cache()
    {
        // Caching is switched on per type with an appSettings key named "Cache_" + type name.
        string tmp = System.Web.Configuration.WebConfigurationManager.AppSettings.Get("Cache_" + typeof(T).Name);
        return !string.IsNullOrEmpty(tmp);
    }
    static IQueryable<T> ItemsMemory
    {
        get
        {
            if (_itemsMemory == null)
            {
                // First access (or after a reset): fill the cache from the store.
                WriteNewCacheData();
            }
            return _itemsMemory.AsQueryable();
        }
    }

    static List<T> _itemsMemory = null;
    static object lockObject = new object();
    static DateTime _lastCleared = DateTime.MinValue;
    public static void WriteNewCacheData()
    {
        if (Cache())
        {
            WriteNewCacheData(Store.Items<T>().ToList());
        }
    }

    static void WriteNewCacheData(List<T> data)
    {
        lock (lockObject)
        {
            // Swap in the new list under a lock so readers never see a half-built cache.
            _lastCleared = DateTime.Now;
            _itemsMemory = data;
        }
    }
Since all my DDS classes inherit from the same base, this change lets me turn the memory cache on per table using appSettings.
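For example, to enable caching for the hypothetical Visitor table from above, you would add a key named after the type to web.config; as the Cache() method shows, any non-empty value turns the cache on:

    <appSettings>
      <!-- "Visitor" is an example type name; the key is "Cache_" + typeof(T).Name -->
      <add key="Cache_Visitor" value="true" />
    </appSettings>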
When I retrieve an object and want to make a change to it, I just make the change and save it using my LazyDDSSave class. Even before the item is saved, all new queries will see the changed object, since it is the same object instance. One could make a CreateWritableClone implementation that updates the cache once the save is done, but I didn't.
It is only when we create a new object or delete one that we need to change the number of elements in the memory cache.
    public virtual void Save()
    {
        LazyDDSSave.Current.AddToSave(this);
    }

    public static void Save(T item)
    {
        LazyDDSSave.Current.AddToSave(item);
    }

    public static void Delete(T item)
    {
        Store.Delete(item);
        if (Cache())
        {
            // The cached list is now stale; rebuild it from the store.
            WriteNewCacheData(Store.Items<T>().ToList());
        }
    }
    public void SaveMe()
    {
        Store.Save(this);
        if (Cache())
        {
            var list = _itemsMemory;
            // Updates are already visible in the cache (same instance); only a
            // brand-new object is missing from the list, so only then rebuild it.
            if (list != null && list.IndexOf(this as T) == -1)
            {
                WriteNewCacheData(Store.Items<T>().ToList());
            }
        }
    }
I opted for a full re-read from the data store when I delete or add an item, but this could also be changed to simply add or remove the single object in the memory list, as sketched below.
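A minimal sketch of that incremental alternative, reusing the fields from above (AddToCache and RemoveFromCache are hypothetical helper names you would call from SaveMe and Delete instead of doing the full rebuild):

    static void AddToCache(T item)
    {
        lock (lockObject)
        {
            // Only add objects the cache has not seen yet.
            if (_itemsMemory != null && !_itemsMemory.Contains(item))
                _itemsMemory.Add(item);
        }
    }

    static void RemoveFromCache(T item)
    {
        lock (lockObject)
        {
            if (_itemsMemory != null)
                _itemsMemory.Remove(item);
        }
    }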
If you are in an enterprise load-balanced server scenario, you could either implement an event-based reload or reload the memory list after a fixed amount of time.
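A minimal sketch of the time-based variant, reusing the _lastCleared field from above (the five-minute interval is an arbitrary example value):

    // Treat the cache as stale after a fixed interval so writes made on
    // other servers eventually become visible on this one.
    static readonly TimeSpan ReloadInterval = TimeSpan.FromMinutes(5);

    static IQueryable<T> ItemsMemory
    {
        get
        {
            if (_itemsMemory == null || DateTime.Now - _lastCleared > ReloadInterval)
            {
                WriteNewCacheData();
            }
            return _itemsMemory.AsQueryable();
        }
    }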
I gained a lot of performance just by using the implementation above in a project I worked on; it had many updates to existing objects, and not very many deletes or new ones.
Since I then cache all (or most) of my DDS tables, I don't need to cache the results of queries against them. So when updates are done I don't need to invalidate any aggregate cache. That saves me a lot of worries.
I have uploaded the base class in the code section here.