Is there a way to have all the servers in a multi-server environment (DXC Production) run a scheduled job? At the moment, one server at random appears to run the job.
https://world.episerver.com/documentation/developer-guides/CMS/scheduled-jobs/
I have some jobs that generate local XML files and application caches, and I want that work to happen on all servers when the job runs, not just on the one that happened to run it.
As an example, you might have sitemap.xml regenerated once a day. The problem is that it will only be updated on the one server that ran the job.
Hi Dave,
As far as I'm aware, you can't set an Episerver scheduled job to run on all servers, but you do have other options available to you.
In my view, the better option would be to use a centralised resource such as blob storage to store your generated files, then have each server read from blob storage (possibly cached locally if response time is a concern; the inbuilt Episerver cache will handle remote cache invalidation). This approach also avoids duplicating the work on each of the servers.
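A minimal sketch of that approach using Episerver's blob APIs is below; the "sitemaps" container and blob URI are assumptions for illustration. The scheduled job calls Save() once, and every server calls Load() when it needs the file:

```csharp
using System;
using System.IO;
using EPiServer.Framework.Blobs;
using EPiServer.ServiceLocation;

public static class SharedSitemapStore
{
    // Example blob identifier: scheme://provider/container/name.
    private static readonly Uri BlobId =
        new Uri("epi.fx.blob://default/sitemaps/sitemap.xml");

    public static void Save(string xml)
    {
        var blobFactory = ServiceLocator.Current.GetInstance<IBlobFactory>();

        // GetBlob returns a handle whether or not the blob exists yet;
        // OpenWrite creates or overwrites the underlying blob.
        var blob = blobFactory.GetBlob(BlobId);
        using (var stream = blob.OpenWrite())
        using (var writer = new StreamWriter(stream))
        {
            writer.Write(xml);
        }
    }

    public static string Load()
    {
        var blobFactory = ServiceLocator.Current.GetInstance<IBlobFactory>();
        var blob = blobFactory.GetBlob(BlobId);
        using (var stream = blob.OpenRead())
        using (var reader = new StreamReader(stream))
        {
            return reader.ReadToEnd();
        }
    }
}
```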
If you want to run a task on all servers, I think you'd need to take a look at third-party libraries such as Quartz.NET or Hangfire.
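For example, because Quartz.NET schedules jobs in-process, a job registered at application startup fires independently on every server. A rough sketch (class names and the schedule are hypothetical):

```csharp
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

// Runs in-process on each server, so every instance regenerates
// its own local files.
public class RegenerateLocalFilesJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        // Rebuild local XML files / warm application caches here.
        return Task.CompletedTask;
    }
}

public static class LocalJobScheduler
{
    // Call once at application startup (e.g. from an initialization module).
    public static async Task StartAsync()
    {
        var scheduler = await new StdSchedulerFactory().GetScheduler();
        await scheduler.Start();

        var job = JobBuilder.Create<RegenerateLocalFilesJob>().Build();
        var trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 3 * * ?") // daily at 03:00, server-local time
            .Build();

        await scheduler.ScheduleJob(job, trigger);
    }
}
```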
OK, thanks for your answer. I might try writing a file into the CMS media area that the servers can use if need be; that might be easier than blob storage, but that's an option too. My main plan was to use the cache first, then a fallback file, and the web service last, so that if the web service stops or the server hosting it goes down, I still have a good file I can use until the web service is back up.
If budget allows, it may be worth taking a look at the Azure Redis Cache service, as that gives you a persistent cache accessible from all servers, so you wouldn't necessarily need the fallback file.
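If you did try that, a minimal sketch using the StackExchange.Redis client (the connection string and key name are placeholders) could look like this:

```csharp
using StackExchange.Redis;

public static class RedisSitemapCache
{
    // Placeholder connection string: in practice you'd pull this from config.
    private static readonly ConnectionMultiplexer Redis =
        ConnectionMultiplexer.Connect(
            "your-cache.redis.cache.windows.net:6380,ssl=true,password=<key>,abortConnect=false");

    // The scheduled job writes the generated file once...
    public static void Save(string xml) =>
        Redis.GetDatabase().StringSet("sitemap:xml", xml);

    // ...and any server reads it back on demand.
    public static string Load() =>
        Redis.GetDatabase().StringGet("sitemap:xml");
}
```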
Try server-to-server storage replication:
https://docs.microsoft.com/en-us/windows-server/storage/dfs-namespaces/dfs-overview
In DXC (or in cloud infrastructure in general) I would try to avoid any server-specific local file features, simply because you might be relocated to a different physical server (due to a failover in Azure or whatever other reason), and then your app is in an invalid state until your job has run and the local files have been produced. The easiest way (I've used this myself to "sync" up multiple servers) is to use Azure Storage tables and/or blob storage. They're much cheaper than Redis cache and have VERY similar performance characteristics.
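For the table storage option, a rough sketch using the Microsoft.WindowsAzure.Storage client (the table, key, and class names are illustrations, not a DXC convention):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Each job run writes its output/state as a table entity that
// every server can then read.
public class JobStateEntity : TableEntity
{
    public JobStateEntity() { }

    public JobStateEntity(string jobName)
    {
        PartitionKey = "jobstate";
        RowKey = jobName;
    }

    public string Payload { get; set; }
}

public static class JobStateStore
{
    public static void Save(string connectionString, string jobName, string payload)
    {
        var table = CloudStorageAccount.Parse(connectionString)
            .CreateCloudTableClient()
            .GetTableReference("jobstate");
        table.CreateIfNotExists();
        table.Execute(TableOperation.InsertOrReplace(
            new JobStateEntity(jobName) { Payload = payload }));
    }

    public static string Load(string connectionString, string jobName)
    {
        var table = CloudStorageAccount.Parse(connectionString)
            .CreateCloudTableClient()
            .GetTableReference("jobstate");
        var result = table.Execute(
            TableOperation.Retrieve<JobStateEntity>("jobstate", jobName));
        return (result.Result as JobStateEntity)?.Payload;
    }
}
```

One caveat: a single string property on a table entity is capped at 64 KB, so large generated files such as sitemaps are better suited to blob storage.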
I'll look into the DXC Azure blob storage. Episerver limits our access to everything but the Integration environment, so it's hard to see whether what I'm doing would work.
Well, if they provide you with a connection to a particular storage account (it's needed for blobs anyway) and you can access that storage (even via code), it should work.
Episerver refuses to let us have access to the Preproduction and Production Azure blob storage areas. There must be some way to use them, though, otherwise it's pointless having them.
As Valdis mentioned, if you want to access the blob storage directly, the connection string will be in your web.config (probably as "EPiServerAzureBlobs"), as the Episerver blob provider needs access to it. As I recall, though, Episerver doesn't recommend using the DXC blob storage account directly, as they may periodically update/migrate/move/recreate the storage and they won't preserve anything which diverges from their standard structure (though I can't find any documentation to back that up). I think their suggestion was to create your own blob storage account outside of DXC for any customised uses of blob storage which go beyond what the Episerver APIs make available.
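If you do go the direct route, a minimal sketch (assuming the Microsoft.WindowsAzure.Storage client and a connection string named "MyBlobStorage" pointing at your own storage account; both names and the container name are placeholders) might look like this:

```csharp
using System.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class DirectBlobAccess
{
    public static void Upload(string blobName, string content)
    {
        // Connection string read from web.config; point this at your own
        // storage account rather than the DXC one, per the note above.
        var connectionString =
            ConfigurationManager.ConnectionStrings["MyBlobStorage"].ConnectionString;

        var container = CloudStorageAccount.Parse(connectionString)
            .CreateCloudBlobClient()
            .GetContainerReference("generated-files");
        container.CreateIfNotExists();

        container.GetBlockBlobReference(blobName).UploadText(content);
    }
}
```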
The alternative to accessing the blob storage directly is to use the Episerver blob storage provider, which wraps the underlying blob storage, as described here:
https://world.episerver.com/documentation/developer-guides/CMS/blob-storage-and-providers/