Our current CMS version is 11.20.7.0.
Does anyone have an example of a custom blob storage provider? We're looking at putting blobs in Cloudflare R2 buckets, and I can't seem to find an example of overriding BlobProvider.
My specific issue is: how do I get access to the file stream of the uploaded file so that I can store it in the R2 bucket? When overriding CreateBlob I only see an id Uri, so I must be misunderstanding something.
I have been to this doc, but it doesn't seem to give any details on overriding BlobProvider:
https://docs.developers.optimizely.com/content-management-system/v11.0.0-cms/docs/blob-storage-and-providers

Hi Todd,
You can take a look at the SqlBlobProvider project; tag v1.5.2 is the last version that supports CMS 11:
https://github.com/BVNetwork/SqlBlobProvider/tree/v1.5.2
You will need to override the Blob class, which has OpenRead, OpenWrite, and some other methods and properties. This is where you need to return streams.
The BlobProvider also needs to be overridden. Here you will interact with the blob container to create, delete, and get blobs (returning your own blob class).
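To make that concrete, here is a rough skeleton of the two overrides. This is only a sketch: R2Blob, R2BlobProvider, and IObjectStore are placeholder names I made up (only the Blob/BlobProvider members come from EPiServer.Framework.Blobs), and provider registration/initialization plumbing is omitted.

    using System;
    using System.IO;
    using EPiServer.Framework.Blobs;

    // Placeholder abstraction over whatever bucket client you end up using (S3, R2, ...).
    public interface IObjectStore
    {
        Stream Download(Uri blobId);
        Stream UploadStream(Uri blobId);
        void Delete(Uri blobId);
    }

    public class R2Blob : Blob
    {
        private readonly IObjectStore _store;

        public R2Blob(Uri id, IObjectStore store) : base(id)
        {
            _store = store;
        }

        // The CMS calls this when it needs to read the binary data back.
        public override Stream OpenRead() => _store.Download(ID);

        // The CMS calls this to write the uploaded file's bytes into the blob.
        public override Stream OpenWrite() => _store.UploadStream(ID);
    }

    public class R2BlobProvider : BlobProvider
    {
        private readonly IObjectStore _store;

        public R2BlobProvider(IObjectStore store)
        {
            _store = store;
        }

        public override Blob CreateBlob(Uri id, string extension)
            => GetBlob(Blob.NewBlobIdentifier(id, extension));

        public override Blob GetBlob(Uri id) => new R2Blob(id, _store);

        public override void Delete(Uri id) => _store.Delete(id);
    }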
Thank you both very much; this is very helpful and seems pretty straightforward. Working through it now.
So I've finally gotten back to working through this. It's pretty straightforward, but I have a fundamental question: I'm still struggling to see where I get a reference to the byte array of the file that has just been uploaded from the client. Our storage for these files will ultimately be S3 buckets, but on upload the file must land on the server in some temp directory, and from there I'm expecting to have a file stream to upload to S3. In SqlBlobProvider I see the following for CreateBlob(), where on my end the Save() implementation is swapped out for an S3 bucket upsert. I'm failing to see where SqlBlobProvider actually gets the file stream of the blob and stores it in SQL.
public override Blob CreateBlob(Uri id, string extension)
{
    var sqlBlobModel = new SqlBlobModel
    {
        BlobId = Blob.NewBlobIdentifier(id, extension)
    };
    SqlBlobModelRepository.Save(sqlBlobModel);
    return GetBlob(sqlBlobModel.BlobId);
}

public static void Save(SqlBlobModel blob)
{
    SqlBlobStore.Save(blob, blob.Id);
}
I'm definitely missing something here, but I'd expect something like the following
public override Blob CreateBlob(Uri id, string extension)
{
    var blobId = Blob.NewBlobIdentifier(id, extension);
    var blob = GetBlob(blobId);
    var customBlobModel = new CustomBlobModel
    {
        BlobId = blobId,
        Blob = blob.OpenWrite()?.ReadAllBytes()
    };
    CustomBlobModelRepository.Save(customBlobModel);
    return blob;
}
public void Save(CustomBlobModel blob)
{
    if (blob == null)
        return;

    // Wrap the byte array directly; a MemoryStream written to manually would
    // need its Position reset to 0 before the SDK reads from it.
    using (var fileStream = new MemoryStream(blob.Blob))
    {
        var putRequest = new PutObjectRequest
        {
            Key = DetermineS3Key(blob.BlobId.Segments),
            BucketName = _s3BucketName,
            InputStream = fileStream,
            DisablePayloadSigning = true
        };
        _s3Client.PutObject(putRequest);
    }
}
It looks like the initial stream coming through OpenWrite was for the folder, which is why the byte array length was 0. I was able to get read/write working.
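In case it helps anyone else: the provider never receives the bytes in CreateBlob. The CMS calls OpenWrite() on the Blob you return and writes the uploaded file into that stream. One sketch of how to capture the bytes is a stream that uploads its contents when the CMS closes it; UploadOnCloseStream and UploadToBucket are just placeholder names of mine, and this sits inside the custom Blob class:

    // Inside the custom Blob class:
    public override Stream OpenWrite()
        => new UploadOnCloseStream(bytes => UploadToBucket(ID, bytes));

    // MemoryStream that hands its full contents to a callback when disposed.
    private sealed class UploadOnCloseStream : MemoryStream
    {
        private readonly Action<byte[]> _onClose;
        private bool _done;

        public UploadOnCloseStream(Action<byte[]> onClose)
        {
            _onClose = onClose;
        }

        protected override void Dispose(bool disposing)
        {
            if (disposing && !_done)
            {
                _done = true;
                _onClose(ToArray()); // everything the CMS wrote is available here
            }
            base.Dispose(disposing);
        }
    }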
Now my next issue is figuring out why Delete is not firing. After moving an asset to the trash and then removing it from the trash, the following override does not seem to fire:
public override void Delete(Uri id)
However, I am able to bind to the DeletingContent event like so. Will this suffice?
var events = ServiceLocator.Current.GetInstance<IContentEvents>();
events.DeletingContent += DeleteSqlBlobProviderFiles;
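For reference, a sketch of what that handler might look like, assuming media content exposes its blob via MediaData.BinaryData and that _blobProvider is my provider instance:

    private void DeleteSqlBlobProviderFiles(object sender, DeleteContentEventArgs e)
    {
        var loader = ServiceLocator.Current.GetInstance<IContentLoader>();
        if (loader.TryGet<MediaData>(e.ContentLink, out var media) && media.BinaryData != null)
        {
            // Remove the underlying object from the bucket as the content is deleted.
            _blobProvider.Delete(media.BinaryData.ID);
        }
    }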
Hi Todd,
Actually, the Delete method in the blob provider fires when the "Remove Abandoned BLOBs" scheduled job runs, so you only need to run that job manually, or put it on a schedule, to clean up the binary data for deleted media content.
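For completeness, the override itself can just remove the object from the bucket. A sketch reusing the hypothetical _s3Client, _s3BucketName, and DetermineS3Key pieces from the snippet above; note the id passed in can also reference a whole container, which a real provider may need to handle:

    public override void Delete(Uri id)
    {
        _s3Client.DeleteObject(new DeleteObjectRequest
        {
            BucketName = _s3BucketName,
            Key = DetermineS3Key(id.Segments)
        });
    }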
Thank you, Binh; I would not have found that. Once an asset is deleted from the trash bin, the Delete() override in my custom blob provider fires when the Remove Abandoned BLOBs job runs.