Large Content Export stalling


Hey, 
For a client I need to do a large content export of 100k+ content items, and it stalls after ~80k items (the exact same count) every time. 
There are no log entries about this at all, and I already have the following set: 
maxAllowedContentLength="154857600"  
maxRequestLength="302400" 
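For context, these sit in web.config roughly like this (the standard IIS request-filtering and httpRuntime locations, assuming IIS-hosted .NET Framework); note that maxAllowedContentLength is in bytes while maxRequestLength is in kilobytes:

  <system.webServer>
    <security>
      <requestFiltering>
        <!-- limit in bytes -->
        <requestLimits maxAllowedContentLength="154857600" />
      </requestFiltering>
    </security>
  </system.webServer>
  <system.web>
    <!-- limit in kilobytes -->
    <httpRuntime maxRequestLength="302400" />
  </system.web>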

Any ideas on how to do this? And where is the package stored on the server while it is being generated? I see a massive amount of disk space being used, but I can't seem to find the temp files. 

Regards, 
Klaus

#308770
Sep 20, 2023 13:06

If you can export locally, running a profiler like dotTrace would reveal a lot of information, including why it stalls.

I'd be happy to take a look at the trace 

#308775
Sep 20, 2023 13:44

Unfortunately I cannot do it locally, as we don't have much content there. 

#308821
Sep 21, 2023 7:17

In theory you can grab a production db and do the test locally.

But the stall is likely caused by a garbage collection, maybe gen 2. Do you have something like a profiler trace enabled, or can you capture a memory dump?
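For example, assuming IIS hosting, you could capture a full memory dump of the worker process while it stalls with Sysinternals ProcDump (w3wp.exe and the dump file name here are just placeholders, adjust to your setup):

  procdump -ma w3wp.exe export-stall.dmp

The ".NET CLR Memory" counters in perfmon (% Time in GC, # Gen 2 Collections) would also show fairly quickly whether gen 2 collections are really the cause.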

All in all, this should be handled via our support channel. 

#308822
Sep 21, 2023 7:20