Elias.Lundmark
Mar 21, 2024

Keeping local environments in sync with your Cloud environments

We recently announced that we are improving the scalability of SQL databases in DXP Cloud Services. The new architecture also enhances the overall security of SQL databases, hardening technical controls to maintain the confidentiality and integrity of our customers' data. This change has an unintended consequence, though: it prevents developers from connecting local development environments directly to SQL databases in DXP Cloud Services. We strongly advise against this practice anyway. While the ease of use and flexibility are appealing, manually managing and storing connection strings and credentials for service users greatly increases the risk of those credentials falling into the wrong hands, allowing potential attackers to access or modify data.

To avoid these risks, our new architecture disallows direct connections from third-party sources to SQL Servers running in DXP Cloud Services. Instead, use the paasportal or the API to export your databases and content for use in your local development environments; both are more secure and reliable methods.

How to export content

Via the paasportal

  1. Navigate to https://paasportal.episerver.net and select the project you wish to export a database from
  2. Navigate to the Troubleshoot tab
  3. In the ‘Export Database’ section, select the environment you wish to export the database from, and how long the paasportal should retain this copy.
  4. Once the export is done, click the database file to download it as a .bacpac file. These files can then be used to import your database into a local SQL Server or an Azure SQL server.
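Once you have the .bacpac file, a common way to import it into a local SQL Server is the SqlPackage command-line tool. A minimal sketch, assuming a local instance reachable as "localhost" and a downloaded file named epicms.bacpac (substitute your own server name, credentials, and file path):

```powershell
# Import the downloaded .bacpac into a local SQL Server instance.
# Server name, database name, and file path below are placeholders.
SqlPackage /Action:Import `
  /SourceFile:"epicms.bacpac" `
  /TargetServerName:"localhost" `
  /TargetDatabaseName:"epicms"
```

Depending on your local setup you may also need authentication switches such as /TargetUser and /TargetPassword, or trust settings for self-signed certificates.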

Via API with Powershell

  1. Navigate to https://paasportal.episerver.net and generate credentials as described here https://docs.developers.optimizely.com/digital-experience-platform/docs/authentication.
  2. Authenticate with Connect-EpiCloud, for example Connect-EpiCloud -ClientKey <ClientKey> -ClientSecret <ClientSecret> -ProjectId <ProjectId>
  3. Start a database export with Start-EpiDatabaseExport, for example Start-EpiDatabaseExport -Environment Integration -DatabaseName epicms -Wait
  4. Fetch the download link for the .bacpac with Get-EpiDatabaseExport
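Put together, the steps above look roughly like the sketch below. The credentials are placeholders from step 1, and the assumption that Start-EpiDatabaseExport returns an object exposing an Id you can pass to Get-EpiDatabaseExport is mine; check the cmdlet output in your own session:

```powershell
# Authenticate with the credentials generated in the paasportal (placeholders)
Connect-EpiCloud -ClientKey "<ClientKey>" -ClientSecret "<ClientSecret>" -ProjectId "<ProjectId>"

# Start the export; -Wait blocks until the export has completed
$export = Start-EpiDatabaseExport -Environment Integration -DatabaseName epicms -Wait

# Fetch the export status, including the .bacpac download link
Get-EpiDatabaseExport -Id $export.Id -Environment Integration -DatabaseName epicms
```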

Via the API you can also download BLOBs from the storage account: Get-EpiStorageContainer lists all storage containers, and Get-EpiStorageContainerSasLink creates a SAS URI that can be used to download BLOBs. For example,

 Get-EpiStorageContainerSasLink -ProjectId "2372b396-6fd2-40ca-a955-57871fc497c9" `
  -Environment "Integration" `
  -StorageContainer "mysitemedia" `
  -RetentionHours 2
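The SAS link returned this way expires after the retention period (the signed-expiry is carried in the URI's se query parameter). Before scripting downloads against it, it can be handy to inspect the link. The helper below is my own illustration, not part of the EpiCloud module, and uses only the Python standard library:

```python
from datetime import datetime, timezone
from urllib.parse import urlparse, parse_qs

def sas_link_info(sas_uri: str) -> dict:
    """Extract the container name and expiry time from an Azure Storage SAS URI."""
    parsed = urlparse(sas_uri)
    query = parse_qs(parsed.query)
    # 'se' is the signed-expiry parameter of a SAS token (ISO 8601, UTC)
    expiry = datetime.fromisoformat(query["se"][0].replace("Z", "+00:00"))
    # The container is the first path segment of the blob endpoint URL
    container = parsed.path.lstrip("/").split("/")[0]
    return {
        "container": container,
        "expiry": expiry,
        "expired": expiry < datetime.now(timezone.utc),
    }

info = sas_link_info(
    "https://example.blob.core.windows.net/mysitemedia"
    "?sv=2021-08-06&se=2024-03-21T12%3A00%3A00Z&sp=rl&sig=abc"
)
print(info["container"], info["expired"])
```

A quick expiry check like this avoids confusing authorization errors when a script tries to reuse a SAS link after its retention window has passed.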


Comments

Drew Douglas
Drew Douglas Mar 21, 2024 07:55 PM

This change to the accessibility of the SQL instances in the Integration environment is disappointing. Prior to us joining the project, Opti Expert Services set up one of our customers with a development model that strongly prefers connecting to the Integration database when running the solution locally. We've run successfully with local databases, but this change to DXP will require us to change messaging and other systems to keep local databases in sync with backend systems.

Eric
Eric Apr 3, 2024 09:48 PM

Never thought you should connect to DXP databases at any point at all, actually. Using client data should not be needed for development purposes. BUT if you do use this and download a database, a disclaimer could be handy: if you download a client database you will most likely be dealing with PII, and in case of a breach you might be distributing information as a developer that your company most likely would not want you to.

It's crucial to stay informed and understand the ins and outs of personal data before downloading a client database, at least in my opinion, and if you do, have scripts ready to remove that information or have a Data Processing Agreement in place. :)
