
Bulk/batch insertion of a large number of prices - running out of memory/CPU


Hi everybody.

I need some input on the best way of updating a very large number of prices.

We have a solution which contains around 200.000 variants, and several times a year our customer updates all prices in the solution. Seeing as we have around 50 markets, we end up having to update between 10.000.000 and 20.000.000 prices during such an update.

Currently we are updating the prices in bulk by calling PriceServiceExtensions.SetCatalogEntryPricesIfNotReadOnly(catalogKeys, priceValuesSubset), which we supply with a batch of variants and prices. This works fine when we only update a few million prices, but when we run it over our entire catalog, the commerce database is stressed to its limits and memory usage climbs - often to the point where the server is restarted automatically (we are using the Episerver DXP hosted in Azure). As we are using the DXP we don't have the luxury of simply adding more CPU/memory, so we would like to optimize how we update the prices in bulk.
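Roughly, the loop looks like the sketch below. Note this is a simplified illustration: the BatchSize value, the UpdatePrices signature and the lazily streamed input are assumptions for the example, not a verified reference of the helper we use.

using System.Collections.Generic;
using Mediachase.Commerce.Catalog;
using Mediachase.Commerce.Pricing;

public class BulkPriceUpdater
{
    // Assumption: a few thousand rows per call keeps the working set small
    // and gives the database shorter transactions to work through.
    private const int BatchSize = 5000;

    // "prices" should be a lazily evaluated stream (yield return while reading
    // the source data), so the full 10-20 million prices never sit in memory.
    public void UpdatePrices(IEnumerable<KeyValuePair<CatalogKey, IPriceValue>> prices)
    {
        var catalogKeys = new List<CatalogKey>(BatchSize);
        var priceValues = new List<IPriceValue>(BatchSize);

        foreach (var pair in prices)
        {
            catalogKeys.Add(pair.Key);
            priceValues.Add(pair.Value);

            if (priceValues.Count >= BatchSize)
            {
                Flush(catalogKeys, priceValues);
            }
        }

        if (priceValues.Count > 0)
        {
            Flush(catalogKeys, priceValues);
        }
    }

    private static void Flush(List<CatalogKey> catalogKeys, List<IPriceValue> priceValues)
    {
        // Same helper as mentioned above; only the current batch is kept alive,
        // and the lists are cleared rather than re-created to reduce GC churn.
        PriceServiceExtensions.SetCatalogEntryPricesIfNotReadOnly(catalogKeys, priceValues);
        catalogKeys.Clear();
        priceValues.Clear();
    }
}

The point of the sketch is only that a single batch is ever materialized at a time; finding the right batch size is exactly the trade-off described further down.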

Does anyone have a more optimal way of updating prices in bulk?

Small note: we have been experimenting with different batch sizes - smaller batches of around 5.000 lower the stress on the database but take more time, and we still run out of memory. Our default value is 50.000 prices at a time.

#229330
Oct 12, 2020 19:34

Are you able to look into the queries to see which ones actually use the most database resources? It's natural to assume the price update query is the one to blame, but there might be more to it than that.

#229415
Oct 14, 2020 7:43