
Allan Thraen
Feb 27, 2015

Moving from a SqlBlobProvider to something else…

I’m currently having a lot of fun updating and moving some demo sites to Azure Websites. When you move to Azure, you obviously need a different blob storage than the default file-based one, and two popular options come to mind: Azure Blob storage, or – to simplify everything – SQL Blob storage.

In one project I decided to go with the SQL Blob storage from https://www.coderesort.com/p/epicode/browser/EPiCode.SqlBlobProvider. It’s so easy: two NuGet packages installed and one scheduled job later, all my blobs were being served straight from the database… One less dependency to worry about!
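For completeness, the provider registration ends up in web.config under episerver.framework. The snippet below is my reconstruction – the type name is an assumption based on the EPiCode.SqlBlobProvider namespace, and the NuGet package may well add this section for you:

<episerver.framework>
  <blob defaultProvider="sqlBlobProvider">
    <providers>
      <!-- Type name assumed from the EPiCode.SqlBlobProvider namespace;
           verify against the package you actually installed -->
      <add name="sqlBlobProvider"
           type="EPiCode.SqlBlobProvider.SqlBlobProvider, EPiCode.SqlBlobProvider" />
    </providers>
  </blob>
</episerver.framework>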

“Why were you worried about dependencies?” I hear you cry. Well – I’m just the kind of guy who thinks simpler is better. Also, for this specific use case, a demo site, I’d like to be able to easily ‘reset’ the site every night, and only having to worry about resetting/restoring the database sounded nice to a lazy guy like me.

Well. Functionality-wise, the SqlBlobProvider works like a charm. Mostly. At first we didn’t think much about it, but the site did seem quite slow, especially in edit mode. It’s not that I had that many blobs – maybe just around 100 images on the entire site, no videos, no sound, no big documents. But each image had 5 different defined resolutions, and some might even have had multiple versions – so we are looking at 100 × 5 × versions blobs in SQL.

Now, the SqlBlobProvider is implemented on top of the wonderful Dynamic Data Store, which is pretty awesome and in general scales well. But I’m not sure it was meant for that many byte[] payloads being passed back and forth between the web server and the database all the time. I don’t know exactly where the bottleneck was, but the site was a pain to work with. (The SqlBlobProvider does have the ability to cache locally on disk, but I never managed to get that to work on my setup – so whether that would have helped I can’t say.)
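To make the chattiness concrete, here is a minimal sketch of the DDS pattern involved. The model below is hypothetical (EPiCode’s real SqlBlobModel is shaped similarly, with a blob id and the raw bytes), but the store calls are the standard Dynamic Data Store API:

using System.Linq;
using EPiServer.Data;
using EPiServer.Data.Dynamic;

// Hypothetical model for illustration only
[EPiServerDataStore(AutomaticallyRemapStore = true)]
public class DemoBlobModel
{
    public Identity Id { get; set; }      // DDS identity
    public string BlobKey { get; set; }   // e.g. the blob Uri as a string
    public byte[] Bytes { get; set; }     // the full payload rides along on every read
}

public static class DdsBlobDemo
{
    public static long TotalBytesStored()
    {
        // CreateStore returns the existing store if the type is already mapped
        var store = DynamicDataStoreFactory.Instance.CreateStore(typeof(DemoBlobModel));

        // Items<T>() materializes whole entities, so every enumeration pulls
        // each complete byte[] back from SQL Server - fine for a handful of
        // small blobs, costly for hundreds of images in five resolutions each
        return store.Items<DemoBlobModel>()
                    .AsEnumerable()
                    .Sum(m => m.Bytes == null ? 0L : (long)m.Bytes.Length);
    }
}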

After trying many other great speed tricks with no major luck, I concluded it must be the SqlBlobProvider and decided to switch to Azure Blob storage. Installing the Azure Blob Provider and setting up the storage is super easy and takes no time at all – but then I was faced with the next challenge: how to move all my blobs from SQL to Azure Storage. Again, on EPiCode (https://nuget.episerver.com/en/OtherPages/Package/?packageId=EPiCode.blobconverter) there is a nice BlobConverter tool, but it assumes all blobs are on disk (file blobs). Somehow I didn’t feel like downloading every image one by one to my local machine, re-uploading them and re-inserting them wherever they should be. EPiServer Import & Export didn’t really seem like a smooth option for this either. So… I ended up writing this little job, which worked like a charm. If anyone else finds themselves in this situation, feel free to use it (but before you do, make sure you have configured a new blob provider and that it’s set to be the current/default – see the config sketch after the code).

using System;
using EPiServer.Core;
using EPiServer.PlugIn;
using EPiServer.BaseLibrary.Scheduling;
using EPiServer.Data.Dynamic;
using EPiCode.SqlBlobProvider;
using System.Linq;
using EPiServer.Framework.Blobs;
using System.IO;

namespace EPiServer.Demo.QJet.Business.Jobs
{
    [ScheduledPlugIn(DisplayName = "Convert Sql Blobs Job")]
    public class ConvertSqlBlobsJob : JobBase
    {
        private bool _stopSignaled;

        public ConvertSqlBlobsJob()
        {
            IsStoppable = true;
        }

        /// <summary>
        /// Called when a user clicks on Stop for a manually started job, or when ASP.NET shuts down.
        /// </summary>
        public override void Stop()
        {
            _stopSignaled = true;
        }

        /// <summary>
        /// Called when a scheduled job executes
        /// </summary>
        /// <returns>A status message to be stored in the database log and visible from admin mode</returns>
        public override string Execute()
        {
            //Call OnStatusChanged to periodically notify progress of job for manually started jobs
            OnStatusChanged(String.Format("Starting execution of {0}", this.GetType()));
            int cnt = 0;

            // Enumerate every SQL blob stored in the Dynamic Data Store and copy
            // its bytes to the currently configured (default) blob provider
            var store = DynamicDataStoreFactory.Instance.GetStore(typeof(SqlBlobModel));
            if (store == null)
            {
                return "No SqlBlobModel store found - nothing to convert";
            }

            foreach (var b in store.Items<SqlBlobModel>())
            {
                //For long running jobs periodically check if stop is signaled and if so stop execution
                if (_stopSignaled)
                {
                    return "Stop of job was called after " + cnt + " blobs were converted";
                }

                if (b.Blob != null)
                {
                    using (var ms = new MemoryStream(b.Blob))
                    {
                        // GetBlob resolves the id against the current default provider,
                        // so writing here lands the bytes in the new storage
                        var bb = BlobFactory.Instance.GetBlob(b.BlobId);
                        bb.Write(ms);
                    }
                    cnt++;
                    if ((cnt % 50) == 0) OnStatusChanged("Converted " + cnt + " blobs");
                }
            }

            return "Converted " + cnt + " blobs";
        }
    }
}
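For reference, the actual provider switch lives in web.config. With the EPiServer Azure package it looks something like the sketch below – the provider name, container and connection string name are example values, so check the package documentation for your version:

<episerver.framework>
  <blob defaultProvider="azureblobs">
    <providers>
      <!-- Example values: point connectionStringName at a connection string
           for your Azure storage account, and container at the blob container -->
      <add name="azureblobs"
           type="EPiServer.Azure.Blobs.AzureBlobProvider, EPiServer.Azure"
           connectionStringName="EPiServerAzureBlobs"
           container="mysitemedia" />
    </providers>
  </blob>
</episerver.framework>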

Comments

Henrik Fransas Feb 28, 2015 07:50 PM

How did you find out that it was the blobs that were taking time? Got any tips on what tools to use?

Allan Thraen Feb 28, 2015 10:35 PM

@Henrik: I wish I had some magic cool tools here (and I'm sure they exist). But in this case I can't positively say that the SqlBlobProvider was the cause of my problems - I can just observe that among the many things I tried to speed the site up to a usable level, changing the blob provider was the only thing that had a noticeable impact without causing other problems (like the old trick of turning off session state, which might have made the site a bit faster as well - but it also broke some functionality, so it was a no-go). The problems were most pronounced in edit mode. I noticed a similar site running on similar hosting didn't have the same problem - and the main difference between the sites was the blob storage.
Now, I'm not going to stop using the SqlBlobProvider in general - I think it is a neat and practical solution. But I will be careful when I have a lot of blobs and performance is important...

I think the SqlBlobProvider's ability to cache locally on disk might just be the solution to any performance problems I encountered - but as I stated, I wasn't able to get it to work.

Henrik Fransas Mar 1, 2015 08:12 AM

Thanks @allan.
A nice experiment would be to have the exact same site with SQL blobs, where one is in Azure and one is local, and see if there is any difference in performance.
One thing with Microsoft SQL Azure is that it is more sensitive to a very "talky" app (like you get with an ORM like NHibernate or Entity Framework), but I know the DDS is not using that.
So many fun things to try out and so little time to do it in... ;-)

K Khan Mar 2, 2015 10:45 AM

Bandwidth is another aspect that should not be ignored ... http://world.episerver.com/blogs/K-Khan-/Dates/2014/12/using-blobs-and-cdn-both-effectively/
/K

Per Magne Skuseth Oct 9, 2015 09:05 AM

FYI: The SqlBlobProvider now has both an export function and local disk cache, as well as a slight performance improvement.
