Hi Scott.
You should avoid uploading it to the App_Data folder of your webroot. Since it's in a scalable environment, you are in theory not in control of which instance handles the upload versus the processing.
Instead, I would recommend pushing it to Episerver's own assets (requires you to support that extension) or directly to Azure Blob Storage. It is possible to run an Azure Blob Storage emulator on your local machine for development experiments.
https://azure.microsoft.com/en-us/resources/samples/storage-blob-dotnet-getting-started/
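A minimal sketch of the direct upload, assuming the WindowsAzure.Storage client library that the linked sample uses; the container name "catalog-uploads" and the connection string are placeholders for whatever your environment defines:

```csharp
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class CatalogFileUploader
{
    public static void UploadFile(string localPath, string connectionString)
    {
        // Parse the connection string; "UseDevelopmentStorage=true" targets
        // the local storage emulator during development.
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient client = account.CreateCloudBlobClient();

        // Create (or reuse) a container dedicated to catalog uploads.
        // The name "catalog-uploads" is an arbitrary example.
        CloudBlobContainer container = client.GetContainerReference("catalog-uploads");
        container.CreateIfNotExists();

        // Upload the file as a block blob named after the local file.
        CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(localPath));
        using (FileStream stream = File.OpenRead(localPath))
        {
            blob.UploadFromStream(stream);
        }
    }
}
```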
Did you consider accommodating your catalog push via the built-in XML import in Episerver Commerce? It may be easier to comply with the XML schemas, since you'll then get all the import functionality for "free".
Hope that helps you decide on immediate next steps.
/Casper Aagaard Rasmussen
Thanks, that's what I thought. If I push it via the Episerver blob factory or use the asset manager, I can just make use of the already configured storage and read the data back programmatically.
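For reference, here's a rough sketch of what I mean via the blob factory, which resolves to Azure Blob Storage on the DXC and local disk in development. The container GUID is just an illustration; in practice I'd persist the blob's ID so whichever instance does the processing can find it:

```csharp
using System;
using System.IO;
using EPiServer.Framework.Blobs;
using EPiServer.ServiceLocation;

public class CatalogUploadStore
{
    private readonly IBlobFactory _blobFactory =
        ServiceLocator.Current.GetInstance<IBlobFactory>();

    public Uri SaveUpload(Stream uploadedFile)
    {
        // Create a blob in a container of our own; the configured provider
        // decides where it physically lives (Azure blobs on the DXC).
        var container = Blob.GetContainerIdentifier(Guid.NewGuid());
        var blob = _blobFactory.CreateBlob(container, ".xml");

        using (var target = blob.OpenWrite())
        {
            uploadedFile.CopyTo(target);
        }

        // Return the blob's ID so it can be stored and resolved later.
        return blob.ID;
    }

    public Stream OpenUpload(Uri blobId)
    {
        // Any instance can read the blob back via its ID.
        return _blobFactory.GetBlob(blobId).OpenRead();
    }
}
```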
I investigated the possibility of the catalog import, but the structure of the commerce catalog is too complex for the tool to handle, so we've had to go down the API route.
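For anyone curious, the API route looks roughly like this; a minimal sketch assuming the typed catalog content model, where in a real solution the node type would usually be a class derived from NodeContent:

```csharp
using EPiServer;
using EPiServer.Commerce.Catalog.ContentTypes;
using EPiServer.Core;
using EPiServer.DataAccess;
using EPiServer.Security;
using EPiServer.ServiceLocation;

public class CatalogBuilder
{
    private readonly IContentRepository _contentRepository =
        ServiceLocator.Current.GetInstance<IContentRepository>();

    public ContentReference CreateNode(ContentReference parentLink, string code, string name)
    {
        // Create a new catalog node under the given parent node or catalog.
        var node = _contentRepository.GetDefault<NodeContent>(parentLink);
        node.Code = code;
        node.Name = name;

        // Publish it; AccessLevel.NoAccess skips the ACL check, which is
        // common in import code running without a user context.
        return _contentRepository.Save(node, SaveAction.Publish, AccessLevel.NoAccess);
    }
}
```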
Thanks for the advice
Hi guys, I'm creating a plugin for the admin area of Episerver where I need to process information from an uploaded file to build part of a commerce catalog. Normally that would be fine, but as this sits on the DXC, which can be auto-scaled as needed, I was wondering if there are any issues with this.
I was looking at saving it locally to the app_data folder, but in a traditional load-balancing scenario that would have to be a replicated or shared path. Obviously Episerver's own content is held as Azure blobs when on the DXC, so I was considering having it uploaded into an assets folder instead and then programmatically loading it.
Any advice?