November Happy Hour will be moved to Thursday December 5th.
var content = CreateMultipartFormDataContent("Catalog.zip.001");
var uploadFile = JsonConvert.DeserializeObject<UploadFile>(
    Put("episerverapi/commerce/import/upload/chunk", content)
        .Content.ReadAsStringAsync().Result);
uploadId = uploadFile.UploadId;

content = CreateMultipartFormDataContent("Catalog.zip.002");
var file2 = Put(String.Format("episerverapi/commerce/import/upload/chunk/{0}/{1}", uploadId, (1024 * 2) * 1), content);

content = CreateMultipartFormDataContent("Catalog.zip.003");
var file3 = Put(String.Format("episerverapi/commerce/import/upload/chunk/{0}/{1}", uploadId, (1024 * 2) * 2), content);

content = CreateMultipartFormDataContent("Catalog.zip.004");
var file4 = Put(String.Format("episerverapi/commerce/import/upload/chunk/{0}/{1}", uploadId, (1024 * 2) * 3), content);

content = CreateMultipartFormDataContent("Catalog.zip.005");
var file5 = Put(String.Format("episerverapi/commerce/import/upload/chunk/{0}/{1}", uploadId, (1024 * 2) * 4), content);

var complete = Post(String.Format("episerverapi/commerce/import/upload/commit/{0}", uploadId),
    new FormUrlEncodedContent(new List<KeyValuePair<String, String>>()));

var state = GetBackgroundTaskState(String.Format("episerverapi/commerce/import/catalog/{0}", uploadId),
    new FormUrlEncodedContent(new List<KeyValuePair<String, String>>())).Result;

protected static MultipartFormDataContent CreateMultipartFormDataContent(string fileName)
{
    var content = new MultipartFormDataContent();
    var filestream = new FileStream(Path.Combine(TestDataDirectory, fileName), FileMode.Open);
    content.Add(new StreamContent(filestream), "file", fileName);
    return content;
}

protected static async Task<JobStatus> GetBackgroundTaskState(string url, FormUrlEncodedContent content)
{
    var msg = await TestServer.HttpClient.PostAsync(url, content);
    var returnString = await msg.Content.ReadAsStringAsync();
    returnString = returnString.Replace("\"", "");
    Guid taskId;
    if (Guid.TryParse(returnString, out taskId))
    {
        return WaitForCompletion(taskId);
    }
    return null;
}

protected static JobStatus WaitForCompletion(Guid taskId)
{
    const int delayIncrement = 5;
    const int maxDelay = 100;
    JobStatus status;
    var delay = 5;
    bool done = false;
    var deadline = DateTime.Now + TimeSpan.FromMinutes(30);
    var taskManager = GetInstance<JobManager>();
    do
    {
        status = taskManager.GetJob(taskId);
        if (status != null)
        {
            done = status.State == JobMessageType.Success || status.State == JobMessageType.Error;
        }
        if (!done)
        {
            Thread.Sleep(delay);
            delay += delayIncrement;
            if (delay > maxDelay)
            {
                delay = maxDelay;
            }
        }
        if (DateTime.Now > deadline)
        {
            throw new Exception("The requested task timed out.");
        }
    } while (!done);
    return status;
}
Thanks, this is very detailed. I can't try it out just now. Based on your code, it seems that all the PUT requests for the chunks could be issued in parallel?
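To make the question concrete, something like the sketch below is what I have in mind. This is untested and assumes a hypothetical async `PutAsync` variant of the `Put` helper from your sample, plus the same `CreateMultipartFormDataContent` helper; it would also need to run inside an async method.

```csharp
// Sketch only: once the first chunk has returned the uploadId,
// upload the remaining chunks (files .002 through .005) concurrently.
const long chunkSize = 1024 * 2; // must match the size used to split the archive
var uploadTasks = new List<Task<HttpResponseMessage>>();
for (var i = 1; i <= 4; i++)
{
    var chunkContent = CreateMultipartFormDataContent(String.Format("Catalog.zip.{0:D3}", i + 1));
    var url = String.Format("episerverapi/commerce/import/upload/chunk/{0}/{1}", uploadId, chunkSize * i);
    uploadTasks.Add(PutAsync(url, chunkContent)); // PutAsync is hypothetical, not part of your sample
}
await Task.WhenAll(uploadTasks);
```

Whether the server side actually accepts chunks out of order is the part I can't tell from the sample.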
I am trying to figure out the chunk upload endpoint for uploading a big zip file to Commerce. Does that endpoint upload directly to the Commerce site?
There is a nice example that shows the usage of many other endpoints here.
There is also an explanation of how it should be done, and an example of the model used, here.
What I can't wrap my head around is: what holds the data in the model proposed in the last link? If that model is not complete, would it be my responsibility to split the file into an array of byte arrays and then pass them in a loop to the endpoint?
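If the splitting does turn out to be the caller's job, the mechanical part would look roughly like this. This is my own illustration, not something from the Service API docs, and the chunk size is a placeholder:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Sketch: cut a file into fixed-size byte-array chunks.
// chunkSize is an assumption; the real value would depend on the API's expectations.
static List<byte[]> SplitFile(string path, int chunkSize)
{
    var chunks = new List<byte[]>();
    using (var stream = File.OpenRead(path))
    {
        var buffer = new byte[chunkSize];
        int read;
        while ((read = stream.Read(buffer, 0, chunkSize)) > 0)
        {
            // The last chunk is usually shorter, so copy only the bytes read.
            var chunk = new byte[read];
            Array.Copy(buffer, chunk, read);
            chunks.Add(chunk);
        }
    }
    return chunks;
}
```

Is that the kind of loop the endpoint expects on the client side?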
episerverapi/commerce/import/upload/chunk/{uploadId:guid=00000000-0000-0000-0000-000000000000}/{offset:int=0}
If it is my responsibility, am I correct to assume that the offset would be the chunk size in bytes multiplied by the number of chunks uploaded so far, and that the GUID would change based on each response?
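Reading the sample above, my current assumption is that the offset for chunk n is simply chunkSize * n, and that the uploadId stays fixed for the whole upload (it comes back once, from the first chunk request). A tiny sketch of the arithmetic I mean, with placeholder values:

```csharp
using System;

// Illustration of the offset scheme I am assuming: each chunk's offset is
// the chunk size in bytes times the number of chunks already sent.
// In the sample, the very first chunk goes to the plain upload/chunk URL
// with no offset at all.
const long chunkSize = 1024 * 2;      // placeholder; depends on how the file was split
var uploadId = Guid.Empty;            // placeholder; returned by the first chunk request
for (var chunkIndex = 1; chunkIndex <= 4; chunkIndex++)
{
    var offset = chunkSize * chunkIndex; // 2048, 4096, 6144, 8192
    Console.WriteLine("episerverapi/commerce/import/upload/chunk/{0}/{1}", uploadId, offset);
}
```

Is that reading correct, or does each response hand back a new GUID that the next request has to use?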