Good question (and research)! I always recommend using the APIs if possible, but this is one of the cases where direct database access might be justified: you can query the data directly from the database.
You can do what Optimizely does in the background to get the history of a job; it uses a stored procedure, netSchedulerListLog, to fetch the results.
public virtual Task<PagedScheduledJobLogResult> GetJobLogAsync(Guid id, long startIndex, int maxCount)
{
    IAsyncDatabaseExecutor db = _databaseHandler();
    return db.ExecuteAsync(async delegate
    {
        // Calls the built-in stored procedure with the job id and paging parameters.
        using DbCommand cmd = db.CreateStoredProcedure("netSchedulerListLog");
        cmd.Parameters.Add(db.CreateParameter("pkID", id));
        cmd.Parameters.Add(db.CreateParameter("startIndex", startIndex));
        cmd.Parameters.Add(db.CreateParameter("maxCount", maxCount));
        long? totalCount = null;
        using DbDataReader dbDataReader = await cmd.ExecuteReaderAsync().ConfigureAwait(continueOnCapturedContext: false);
        List<ScheduledJobLogItem> list = new List<ScheduledJobLogItem>();
        while (dbDataReader.Read())
        {
            // Column 6 carries the total row count used for paging.
            if (!totalCount.HasValue)
            {
                totalCount = dbDataReader.GetInt64(6);
            }
            object obj = dbDataReader[3];  // duration (ticks)
            object obj2 = dbDataReader[4]; // trigger
            object obj3 = dbDataReader[5]; // server
            object obj4 = dbDataReader[1]; // status
            list.Add(new ScheduledJobLogItem
            {
                CompletedUtc = dbDataReader.GetDateTime(0),
                Message = dbDataReader.GetStringSafe(2),
                Status = ((obj4 != DBNull.Value) ? ((ScheduledJobExecutionStatus)Convert.ToInt32(obj4)) : ScheduledJobExecutionStatus.Unknown),
                Duration = ((obj != DBNull.Value) ? new TimeSpan?(TimeSpan.FromTicks(Convert.ToInt64(obj))) : null),
                Trigger = ((obj2 != DBNull.Value) ? ((ScheduledJobTrigger)Convert.ToInt32(obj2)) : ScheduledJobTrigger.Unknown),
                Server = ((obj3 != DBNull.Value) ? Convert.ToString(obj3) : null)
            });
        }
        return new PagedScheduledJobLogResult(list, totalCount.GetValueOrDefault());
    });
}
You can do something similar: add your own stored procedure with a date filter as a migration step. It would be almost the same procedure, with one change to include the date range.
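Roughly, the migration-step idea could look like the sketch below. Everything in it is an assumption rather than an Optimizely API: the procedure name netSchedulerListLogDateRange is made up, the SQL body is only a placeholder you would replace with the actual query from netSchedulerListLog plus a date filter, and how you run the script (startup code, deployment pipeline, or a migration step) is up to you.

using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

public static class SchedulerLogProcedureInstaller
{
    // Hypothetical procedure name. The body is a placeholder: copy the query from the
    // existing netSchedulerListLog procedure in your database and add a filter on the
    // completed-date column, e.g. "... BETWEEN @fromUtc AND @toUtc".
    private const string CreateProcedureSql = @"
CREATE OR ALTER PROCEDURE dbo.netSchedulerListLogDateRange
    @pkID       UNIQUEIDENTIFIER,
    @startIndex BIGINT,
    @maxCount   INT,
    @fromUtc    DATETIME2,
    @toUtc      DATETIME2
AS
BEGIN
    -- TODO: paste the SELECT from netSchedulerListLog here and add
    --       AND <completed-date column> BETWEEN @fromUtc AND @toUtc
    SELECT 0; -- placeholder so the script stays syntactically valid
END";

    // Run once at startup or from a deployment/migration step.
    public static async Task EnsureProcedureAsync(string connectionString)
    {
        await using var connection = new SqlConnection(connectionString);
        await connection.OpenAsync();
        await using var command = new SqlCommand(CreateProcedureSql, connection);
        await command.ExecuteNonQueryAsync();
    }
}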
I haven't tested it, but it should work. If nothing else works, you can do what Quan suggested here.
That is what I believe IScheduledJobLogRepository does under the hood. It does not solve the problem the OP had, however: you virtually have to load all logs to filter by time frame, which is a heavy task (on both the app and the database layer) and infeasible on a big database with a lot of log entries.
How about doing it with a new stored procedure that has almost the same logic as netSchedulerListLog but takes a date range as an additional parameter? Then extend IScheduledJobLogRepository with a new method (similar to the code snippet in my previous comment) that calls the new stored procedure with the date range included.
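If that route is taken, the repository side could look roughly like the sketch below. It is modeled on the decompiled method above; the class name, the netSchedulerListLogDateRange procedure, and the Func<IAsyncDatabaseExecutor> constructor parameter are assumptions (the real repository may use a service accessor instead), and the column order must match whatever your new procedure returns.

using System;
using System.Collections.Generic;
using System.Data.Common;
using System.Threading.Tasks;
// Plus the EPiServer namespaces used by the snippet above, i.e. the ones containing
// IAsyncDatabaseExecutor and ScheduledJobLogItem (verify for your CMS version).

public class DateFilteredScheduledJobLogService
{
    private readonly Func<IAsyncDatabaseExecutor> _databaseHandler;

    public DateFilteredScheduledJobLogService(Func<IAsyncDatabaseExecutor> databaseHandler)
    {
        _databaseHandler = databaseHandler;
    }

    public Task<List<ScheduledJobLogItem>> GetJobLogAsync(Guid id, DateTime fromUtc, DateTime toUtc, long startIndex, int maxCount)
    {
        IAsyncDatabaseExecutor db = _databaseHandler();
        return db.ExecuteAsync(async () =>
        {
            // Hypothetical procedure from the comment above: netSchedulerListLog plus a date range.
            using DbCommand cmd = db.CreateStoredProcedure("netSchedulerListLogDateRange");
            cmd.Parameters.Add(db.CreateParameter("pkID", id));
            cmd.Parameters.Add(db.CreateParameter("startIndex", startIndex));
            cmd.Parameters.Add(db.CreateParameter("maxCount", maxCount));
            cmd.Parameters.Add(db.CreateParameter("fromUtc", fromUtc));
            cmd.Parameters.Add(db.CreateParameter("toUtc", toUtc));

            var items = new List<ScheduledJobLogItem>();
            using DbDataReader reader = await cmd.ExecuteReaderAsync().ConfigureAwait(false);
            while (await reader.ReadAsync().ConfigureAwait(false))
            {
                // Assumes the same column order as netSchedulerListLog (see the snippet above).
                object status = reader[1];
                object duration = reader[3];
                object trigger = reader[4];
                object server = reader[5];
                items.Add(new ScheduledJobLogItem
                {
                    CompletedUtc = reader.GetDateTime(0),
                    Message = reader.IsDBNull(2) ? null : reader.GetString(2),
                    Status = status != DBNull.Value ? (ScheduledJobExecutionStatus)Convert.ToInt32(status) : ScheduledJobExecutionStatus.Unknown,
                    Duration = duration != DBNull.Value ? TimeSpan.FromTicks(Convert.ToInt64(duration)) : (TimeSpan?)null,
                    Trigger = trigger != DBNull.Value ? (ScheduledJobTrigger)Convert.ToInt32(trigger) : ScheduledJobTrigger.Unknown,
                    Server = server != DBNull.Value ? Convert.ToString(server) : null
                });
            }
            return items;
        });
    }
}

You would then register this service in the container and call it from your API endpoint; the signature mirrors GetJobLogAsync, so the paging behaviour stays the same while the database does the date filtering.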
Hi Eeta,
If this were me, I would go with Quan's approach. As he mentioned, suggesting direct database access is not the norm; however, if you want to complete this anytime soon, it would be an option.
I read your question again, and I see you are asking for a recommended way.
To answer that: there is no recommended way; it is up to us, the team developing the solution, to find an approach that fits.
As you can see above, Quan and Manoj have approached this in two different ways, and I myself would also have approached it differently if I had all the requirements.
Therefore, choose the solution that meets your requirements and follow the recommended way as much as possible; sometimes, however, we need to break with the recommendation in order to meet our requirements.
Paul
Hello,
We are creating an API for an external health check service that we use. For that, we need a custom API that receives a start datetime and an end datetime. The API then returns JSON with all of the Episerver jobs that ran during the timeframe, including whether they succeeded or failed and, for example, their runtime.
The issue is that the required logs don't seem to be easily available. For example, IScheduledJobRepository only has the last job run available. There is also IScheduledJobLogRepository, but it only seems to offer GetAsync, which requires a start index and a max count. I could seemingly use it to fetch everything from the database and then filter the fetched logs to match the given timeframe. That, however, doesn't seem a smart way to do it, and it will probably be slow (and get even slower over time).
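For reference, the brute-force variant I mean would look roughly like the sketch below. It assumes IScheduledJobLogRepository.GetAsync(Guid, long, int) returning a PagedScheduledJobLogResult; the PagedResult property name is an assumption to verify against the installed CMS version.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
// IScheduledJobLogRepository and ScheduledJobLogItem live in EPiServer.DataAbstraction
// (verify for your CMS version).

public class ScheduledJobHistoryReader
{
    private readonly IScheduledJobLogRepository _logRepository;

    public ScheduledJobHistoryReader(IScheduledJobLogRepository logRepository)
    {
        _logRepository = logRepository;
    }

    // Pages through the whole log of a single job and keeps only the entries that fall
    // inside the requested window. Every page is read even for a narrow time frame,
    // which is exactly why this approach gets slower as the log grows.
    public async Task<List<ScheduledJobLogItem>> GetLogsInRangeAsync(Guid jobId, DateTime fromUtc, DateTime toUtc)
    {
        const int pageSize = 100;
        var result = new List<ScheduledJobLogItem>();
        long startIndex = 0;

        while (true)
        {
            var page = await _logRepository.GetAsync(jobId, startIndex, pageSize);
            var items = new List<ScheduledJobLogItem>(page.PagedResult); // property name assumed

            if (items.Count == 0)
            {
                break;
            }

            foreach (var item in items)
            {
                if (item.CompletedUtc >= fromUtc && item.CompletedUtc <= toUtc)
                {
                    result.Add(item);
                }
            }

            startIndex += items.Count;
        }

        return result;
    }
}

And as far as I can tell, covering all jobs would additionally mean looping over IScheduledJobRepository.List() and running this per job, which multiplies the cost.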
How would you recommend this be implemented?