I'm not familiar with SqlBlobModelRepository - is it your custom code?
It looks like DDS anyway, so you have more than 1 problem here.
Where is the exception thrown?
OK, so it seems your code is a modified version of this: https://github.com/BVNetwork/SqlBlobProvider/blob/master/src/EPiCode.SqlBlobProvider/SqlBlobExportJob.cs
From what I understand, Items<T> does not load the items into memory, which makes sense, but it keeps an open connection. That is likely what is causing your timeout.
What you can try is something like this:
var ids = store.Items<SqlBlobModel>().Select(x => x.Id).ToList();
foreach (Identity id in ids)
{
    var item = store.Load<SqlBlobModel>(id);
    // do your stuff here
}
You can also increase the timeout setting, which will likely help.
Thanks again Quan Mai!
Here is my final solution.
Code:
using EPiServer.PlugIn;
using System.IO;
using System.Linq;

namespace EPiCode.SqlBlobProvider
{
    [ScheduledPlugIn(DisplayName = "Export SQL Blobs", Description = "Exports all SQL Blobs to disk", SortIndex = 10001)]
    public class ExportSqlBlobsJob : EPiServer.Scheduler.ScheduledJobBase
    {
        private bool _stopSignaled;
        private string Status = string.Empty;
        private int Exported = 0;
        private string SaveDirectory = @"../../Blobs";

        public ExportSqlBlobsJob()
        {
            IsStoppable = true;
        }

        public override void Stop()
        {
            _stopSignaled = true;
        }

        public override string Execute()
        {
            OnStatusChanged("Export started.");
            ExportBlobs();
            Status += $"Job has completed. {Exported} SQL blobs have been exported to {SaveDirectory}.";
            OnStatusChanged(Status);
            return Status;
        }

        private void ExportBlobs()
        {
            var store = SqlBlobModelRepository.SqlBlobStore;

            // Select only the ids up front; each item is then loaded individually inside the loop.
            // Ordered in reverse (see notes below) so a re-run starts from the other end of the list.
            var ids = store.Items<SqlBlobModel>().OrderByDescending(x => x.Blob).Select(x => x.Id);
            int counter = 0;

            foreach (var id in ids)
            {
                if (_stopSignaled)
                {
                    OnStatusChanged("Stop signal called.");
                    break;
                }

                var item = store.Load<SqlBlobModel>(id);

                // Build the target path from the blob id URI segments.
                var blobId = item.BlobId;
                var path = SaveDirectory + blobId.Segments[0] + blobId.Segments[1] + blobId.Segments[2].TrimEnd('\\');

                // Skip blobs that have already been exported to disk.
                if (File.Exists(path))
                {
                    counter++;
                    OnStatusChanged($"Exists, skipping: {counter} Exported: {Exported}");
                    continue;
                }

                OnStatusChanged($"Exporting id: {id} to directory: {path}");
                Directory.CreateDirectory(Path.GetDirectoryName(path));

                if (item.Blob != null)
                {
                    using (var stream = new MemoryStream(item.Blob))
                    using (var fileStream = File.Create(path))
                    {
                        stream.CopyTo(fileStream);
                    }
                }

                Exported++;
            }
        }
    }
}
web.config:
<section name="episerver.dataStore" type="EPiServer.Data.Configuration.EPiServerDataStoreSection, EPiServer.Framework.AspNet" />

<episerver.dataStore>
  <dataStore defaultProvider="EPiServerSQLServerDataStoreProvider">
  </dataStore>
  <dataSettings connectionStringName="EPiServerDB"
                retries="3"
                retryDelay="0:0:0.1"
                databaseQueryTimeout="0:05:00" />
</episerver.dataStore>
I know the code isn't optimized to its full potential - hopefully this job only needs to run once now that it works.
Most of the results I found when searching said that the episerver.dataStore section's type should point to the EPiServer.Data DLL - I finally found that if you're using EPiServer 11 or higher, you should use EPiServer.Framework.AspNet instead.
The databaseQueryTimeout (databaseQueryTimeout="0:05:00") is now set to 5 minutes, which I hope is enough. You could increase it, but if something is wrong you will have to wait for the full time you set before the connection closes.
I added <episerver.dataStore> directly under the closing </episerver> tag, in case someone, like myself, doesn't have great knowledge of how everything works in configuration files. :)
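For context, this is roughly where the pieces ended up in my web.config (only the relevant parts are shown; the rest of your file will of course differ):
<configuration>
  <configSections>
    <!-- ...other section declarations... -->
    <section name="episerver.dataStore" type="EPiServer.Data.Configuration.EPiServerDataStoreSection, EPiServer.Framework.AspNet" />
  </configSections>

  <episerver>
    <!-- ...existing EPiServer configuration... -->
  </episerver>
  <episerver.dataStore>
    <dataStore defaultProvider="EPiServerSQLServerDataStoreProvider" />
    <dataSettings connectionStringName="EPiServerDB"
                  retries="3"
                  retryDelay="0:0:0.1"
                  databaseQueryTimeout="0:05:00" />
  </episerver.dataStore>
</configuration>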
It now shows Exported: 9387, so I'm pretty sure it works. :)
Hi again,
When the program ran for 2 hours yesterday I got an "OutOfMemoryException". How I solved it: I ordered the list in reverse and tried again. The code ran for 2 hours again and I got the same exception, but I saw that it had skipped about 12,000 files before it stopped, so it should be fine - the total was about 13,000-14,000 files, and now everything is exported. If you have more files than I had, you can look at the Microsoft docs article "Troubleshoot OutOfMemoryException" for the job: docs.microsoft.com/en-us/troubleshoot/aspnet/troubleshoot-outofmemoryexception
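If you run into the same OutOfMemoryException, this is roughly the variation I would try instead - just a sketch, ExportBlobsLowMemory is a made-up name and it reuses the fields and SqlBlobModelRepository from the job above. The idea is to skip the MemoryStream and write each blob straight to disk with File.WriteAllBytes, so nothing larger than one blob needs to stay in memory at a time:
private void ExportBlobsLowMemory()
{
    var store = SqlBlobModelRepository.SqlBlobStore;

    // Materialize only the ids; the blob data itself is loaded one item at a time.
    var ids = store.Items<SqlBlobModel>().Select(x => x.Id).ToList();

    foreach (var id in ids)
    {
        if (_stopSignaled)
        {
            OnStatusChanged("Stop signal called.");
            break;
        }

        var item = store.Load<SqlBlobModel>(id);
        var blobId = item.BlobId;
        var path = SaveDirectory + blobId.Segments[0] + blobId.Segments[1] + blobId.Segments[2].TrimEnd('\\');

        // Skip blobs that have no data or are already on disk.
        if (item.Blob == null || File.Exists(path))
        {
            continue;
        }

        Directory.CreateDirectory(Path.GetDirectoryName(path));

        // Write the byte[] straight to the file - no intermediate MemoryStream,
        // so the blob can be collected as soon as the loop moves on.
        File.WriteAllBytes(path, item.Blob);
        Exported++;
    }
}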
I've copied all files to our DXP and it looks like it's working.
Hopefully this helps someone! :)
Hi,
I'm trying to migrate to EPiServer DXP and my problem is that our solution uses SqlBlobProvider (blobs in the database), which EPiServer's support says is not recommended. I've tried to export the blobs to disk using the code I will post later on. The export times out with a SqlException, which I will also post later on (in Swedish - sorry!).
My questions are:
How do I set an EPiServer scheduled job not to time out when exporting big files (I'm guessing that's one problem)? This might be a SQL problem, or the SqlBlobProvider might need configuration.
Is there a better way of doing this than my code below?
Can I export the files directly to EPiServer blob storage?
How do I give EPiServer's scheduled job write access so I can write the files out to FTP, then download them and upload them to blob storage?
Will any of this solve my problem?
Code for export:
SqlException:
Hopefully someone can shed some light on this matter so we can progress and migrate to DXP. :)
Kind regards,
Robin