You can write code to export the content one item at a time, to JSON files for example, and then import it item by item on the target. However, IMO the built-in export/import is the simplest and most performant way to move a large amount of data. Allowing a big upload is a one-time thing, so I'd have no problem with that, especially if you run the import only on the scheduler instance, which further limits the security risk.
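To make the "export one item at a time" idea a bit more concrete, here is a minimal sketch of a scheduled job that walks a site's content tree and writes each item to its own JSON file. This is an assumption-laden illustration, not a drop-in solution: the root reference, output folder, and job name are placeholders, and the flat name/value serialization would need richer handling for content areas, blocks, and media. The import counterpart would read these files back and create content item by item on the target.

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using EPiServer;
using EPiServer.Core;
using EPiServer.PlugIn;
using EPiServer.Scheduler;

// Hypothetical scheduled job: walks the source site's content tree and writes
// each item's properties to its own JSON file. Names and paths are illustrative.
[ScheduledPlugIn(DisplayName = "Export site content to JSON")]
public class ExportSiteToJsonJob : ScheduledJobBase
{
    private readonly IContentLoader _contentLoader;

    // Assumed site root and output folder -- adjust for your environment.
    private static readonly ContentReference SourceSiteRoot = new ContentReference(123);
    private const string OutputFolder = @"C:\temp\site-export";

    public ExportSiteToJsonJob(IContentLoader contentLoader)
    {
        _contentLoader = contentLoader;
    }

    public override string Execute()
    {
        Directory.CreateDirectory(OutputFolder);
        var exported = 0;

        // GetDescendents returns every content reference below the root.
        foreach (ContentReference reference in _contentLoader.GetDescendents(SourceSiteRoot))
        {
            var content = _contentLoader.Get<IContent>(reference);

            // Flatten the property collection into simple name/value pairs.
            var dto = new Dictionary<string, object>
            {
                ["ContentGuid"] = content.ContentGuid,
                ["Name"] = content.Name,
                ["ContentTypeId"] = content.ContentTypeID,
                ["ParentLink"] = content.ParentLink.ToString()
            };
            foreach (var property in content.Property)
            {
                dto[property.Name] = property.Value?.ToString();
            }

            File.WriteAllText(
                Path.Combine(OutputFolder, $"{content.ContentGuid}.json"),
                JsonSerializer.Serialize(dto, new JsonSerializerOptions { WriteIndented = true }));

            exported++;
        }

        return $"Exported {exported} content items to {OutputFolder}";
    }
}
```

Running this as a scheduled job means the work happens on the scheduler instance, in line with the security point above.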
Hello,
I am seeking advice on the export/import of a site.
We have a multisite application hosted in Episerver. One of the sites is undergoing a revamp of its look and feel (no code changes), which is why the business wants to replicate the site completely. We are considering two approaches.
However, we are facing a problem with the export/import process. The site is quite large, with more than 20,000 content items. While exporting such a large site is manageable, the issue arises during import, where we encounter a "Request is too Large" error.
We understand that there is an upload limit on import, which in our case is 200 MB, while the exported file is close to 900 MB. Although Optimizely has informed us that the upload limit can be extended, doing so would introduce significant risks for this one-time activity, which we would like to avoid.
So my question is: What approach should we take to duplicate the site without relying on export/import? Is there a better method to handle this situation?
I would greatly appreciate any guidance on the best way to proceed.
Regards.
In our application, we have multiple themes for different websites. Changing the theme property alters the entire look and feel of the site. This is how the site is built. For example, if Site A currently uses Theme A, we want to revamp it by changing Site A to Theme B.
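For readers unfamiliar with this kind of setup, a rough sketch of what such a theme switch might look like follows. Every type, property, and value here is hypothetical and will differ from the real implementation; the idea is simply that the shared layout reads a single Theme property and loads the matching assets.

```csharp
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using EPiServer.Core;
using EPiServer.DataAnnotations;
using EPiServer.Shell.ObjectEditing;

// Hypothetical selection factory listing the available themes.
public class ThemeSelectionFactory : ISelectionFactory
{
    public IEnumerable<ISelectItem> GetSelections(ExtendedMetadata metadata)
    {
        yield return new SelectItem { Text = "Theme A", Value = "theme-a" };
        yield return new SelectItem { Text = "Theme B", Value = "theme-b" };
    }
}

// Hypothetical start page: switching Theme re-skins the whole site,
// because the layout reads this value and loads the matching CSS/JS bundle.
[ContentType(DisplayName = "Site Start Page", GUID = "0f3c2a1b-5d6e-4f7a-8b9c-0d1e2f3a4b5c")]
public class SiteStartPage : PageData
{
    [Display(Name = "Theme", Order = 10)]
    [SelectOne(SelectionFactoryType = typeof(ThemeSelectionFactory))]
    public virtual string Theme { get; set; }
}
```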
I’m inclined to duplicate the site directly in the PROD environment. This approach would involve only one export/import process within the PROD environment rather than performing the same task twice. The reason for duplicating the site is to avoid tampering with the live site. Instead, we would work on the duplicated site until it is finalized.
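Neither post above mentions it, but if the duplicate is going to live inside the same PROD environment anyway, the content branch can also be copied programmatically, which avoids pushing a 900 MB package through the import endpoint at all. Below is a minimal sketch using IContentRepository.Copy; the content references are placeholders, and the copy is left unpublished so the revamp can happen on the duplicate without touching the live site.

```csharp
using EPiServer;
using EPiServer.Core;
using EPiServer.Security;
using EPiServer.ServiceLocation;

// Hypothetical one-off helper: duplicates an entire site branch inside the
// same environment. The content references below are placeholders.
public static class SiteDuplicator
{
    public static ContentReference DuplicateSiteBranch()
    {
        var repository = ServiceLocator.Current.GetInstance<IContentRepository>();

        var sourceSiteRoot = new ContentReference(123); // start page of the existing site
        var targetParent = ContentReference.RootPage;   // where the duplicate should be created

        // Copies the whole branch. The final 'false' keeps the copied content
        // unpublished on the destination, so the revamp can be done on the
        // duplicate before anything goes live.
        return repository.Copy(
            sourceSiteRoot,
            targetParent,
            AccessLevel.Read,
            AccessLevel.Create,
            false);
    }
}
```

For 20,000+ items this copy will take a while, so wrapping it in a scheduled job and running it on the scheduler instance, mirroring the advice in the first reply, would be a sensible way to execute it.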