
Best solution for accessing external databases in DxC hosted site?


We're in the process of moving our site into the DxC environment hosted by Episerver; however, we're not allowed to host any of our custom databases alongside our Episerver CMS database. I'd be curious to hear what solutions anyone else here has used to overcome this limitation.


  1. Custom tables - this provides the closest proximity to the Episerver CMS database and reduces latency; however, we would need to create an entire CRUD process to keep this data in sync with the actual database it comes from. Using custom replication software (as we do now) isn't allowed on the DxC either. I've read the documentation around FluentMigrator and using initialization modules, or EF Code First Migrations, but those seem to only assist with the schema aspect of the data. Also, I don't believe Episerver would allow our stored procedures to be placed in the same database as these custom tables (though I'm not totally sure there).

  2. Azure Database - we could spin up our own Azure instance and upload our custom databases there. Obviously there are some additional expenses here, but our bigger concern in this scenario is that network latency could become an issue, as we cannot place these custom databases in the same instance that our Episerver CMS database lives in.

  3. API Rewrite - instead of using a database at all, we could call an API to retrieve the data directly and store it, then update the data at some reasonable interval. Of course, writing all of this isn't a small task, and we'd also need to run the numbers to ensure we don't get too close to our data provider's service limits with all the API calls.

  4. Don't use the DxC - continue with our hosted version of Episerver on a VM, connecting to our database servers in that same VM cluster, and hope Episerver eventually finds a better option for custom databases in their DxC. Of course, here we lose out on all the features of the DxC, which is less than preferable as well.
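For what it's worth, the schema side of option 1 is small with FluentMigrator; the hard part remains keeping the data in sync. Below is only a sketch under assumptions: the table and column names (Custom_Dealer, SyncedAtUtc) are made up for illustration, and a migration like this handles schema only, not replication:

```csharp
using FluentMigrator;

// Hypothetical migration adding one custom table to the CMS database.
// The version number and all names are placeholders.
[Migration(20200122001)]
public class AddDealerTable : Migration
{
    public override void Up()
    {
        Create.Table("Custom_Dealer")
            .WithColumn("Id").AsInt32().PrimaryKey().Identity()
            .WithColumn("Name").AsString(200).NotNullable()
            .WithColumn("SyncedAtUtc").AsDateTime().NotNullable();
    }

    public override void Down()
    {
        Delete.Table("Custom_Dealer");
    }
}
```

The runner could be wired up from an Episerver initialization module so migrations execute on startup, which is the pattern the FluentMigrator documentation mentioned above describes.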

Anything else we should be considering? 

Jan 21, 2020 20:49

Hi Brian

It is true that you don't get additional databases when hosting websites in DXC. But you can always add custom tables to your Episerver database, using whatever ORM you are confident with, and you can create the stored procedures you like. If you go with SPs, you just need to send the create scripts to Episerver Service Desk before deploying your code.

However, I do not recommend that you go down that route, since you will be storing duplicates of internal data in the website's database. Besides having to implement continuous replication of data from your custom database, you also need to consider whether you would be storing secrets or Personally Identifiable Information in those tables.

In your case, I would definitely recommend wrapping your custom database in an API and using it as a single source of truth. As you mention, developing such new APIs will require a lot of effort. But building a good replication mechanism will also require a lot of effort, and so it is often better and safer not to replicate at all.

If you use the API approach, you can cache the loaded data for some time in Episerver. This will speed up load times a lot and conserve your API call allowance.
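To illustrate that caching pattern, here is a minimal sketch. All names (ApiCache, GetOrLoad) are made up for this example; inside the CMS, Episerver's own ISynchronizedObjectInstanceCache would be the more idiomatic choice, but the idea is the same:

```csharp
using System;
using System.Collections.Concurrent;

// Minimal in-memory cache for API responses, so most page loads never
// hit the external API. This is a sketch, not production code: no
// eviction of stale entries, no per-key locking to prevent a thundering
// herd on expiry.
public static class ApiCache
{
    private static readonly ConcurrentDictionary<string, (DateTimeOffset ExpiresAt, object Value)> Entries
        = new ConcurrentDictionary<string, (DateTimeOffset, object)>();

    // Returns the cached value if it hasn't expired; otherwise invokes the
    // loader (e.g. the HTTP call to the external API) and caches the result.
    public static T GetOrLoad<T>(string key, Func<T> loader, TimeSpan ttl) where T : class
    {
        if (Entries.TryGetValue(key, out var entry)
            && entry.ExpiresAt > DateTimeOffset.UtcNow
            && entry.Value is T cached)
        {
            return cached;
        }

        var value = loader();
        Entries[key] = (DateTimeOffset.UtcNow.Add(ttl), value);
        return value;
    }
}
```

With a ten-minute TTL, a page rendering the same data a thousand times an hour makes roughly six API calls instead of a thousand, which is what keeps you under the provider's service limits.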

Jan 22, 2020 5:58

Personally, I've worked with solutions that keep databases in separate Azure instances. As long as you keep the database in the same region, there shouldn't be any latency issues in my experience; the DXP (formerly DXC) is, after all, built on top of standard Azure resources. So IMO that's the easiest solution. You can then assess the impact of migrating into the Epi database later, but that adds a dependency layer on that DB, so depending on your commitment to Episerver as a platform, I wouldn't put the tables in the Epi database unless you have to. I'd also press the DXP team, as they will sometimes allow you to add custom elements at a cost, although this may be higher than the standard Azure fee.
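If you go that route, the site just points at the separate database through an ordinary connection string in web.config, with the Azure SQL server provisioned in the same region as your DXP environment. Everything below (server, database, user names) is a placeholder, not a real endpoint:

```xml
<connectionStrings>
  <add name="CustomDataDb"
       connectionString="Server=tcp:yourserver.database.windows.net,1433;Database=CustomData;User ID=app_user;Password=...;Encrypt=True;Connection Timeout=30;"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```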

Edited, Jan 22, 2020 9:25
Will Wiebalck - Mar 04, 2020 21:36
Hello Scott,

Thank you for the reply to our issue. I work with Brian, and unfortunately we won't be able to provision a Microsoft Azure environment and migrate our third-party database resources there in time for our launch into the DXP, so we can't get resources into the same zone. Episerver also won't let us use Azure ExpressRoute between our data centers. Our only two options at this point are to 1) use the public internet for the database connection or 2) not launch into the DXP at all, and extend our on-premises license that has the website and third-party database in the same data center. Given your experience with Azure databases, I was wondering what your thoughts would be on going the public-internet route?

Thank you so much!

Will Wiebalck

Agree with @stefan. I'd go with your option 3 and get the data via API, as it will give you more possibilities in the future. It might take a lot of effort to add the APIs and fetch the data, but replicating or mirroring databases will require additional infrastructure to maintain and a lot of effort too.

Jan 22, 2020 21:47

Thank you everyone for your thoughtful feedback and ideas.

Based on our timeline and other constraints, we're planning to go the Azure route for the database hosting and use our same third-party service to handle the replication and syncing with the cloud source. The API piece is certainly on our roadmap though, as that would be the ideal solution.

Scott, thanks for sharing that - we had overlooked the region setting, and it seems that if we match the region our Episerver site and database are hosted in, latency should be negligible. Great tip.

Jan 24, 2020 16:11