Keeping local environments in sync with your Cloud environments
We recently announced that we are improving the scalability of SQL databases in DXP Cloud Services. This new architecture also enhances the overall security of SQL databases, hardening technical controls to maintain the confidentiality and integrity of our customers' data. The change has one notable consequence: developers can no longer connect local development environments directly to SQL databases in DXP Cloud Services. We strongly advise against that practice anyway. While the ease of use and flexibility are appealing, manually managing and storing connection strings and credentials for service users greatly increases the risk of those credentials falling into the wrong hands, allowing potential attackers to access or modify data.
To avoid these risks, the new architecture disallows direct connections from third-party sources to SQL Servers running in DXP Cloud Services. Instead, use the paasportal or the API to export your databases and content for use in your local development environments; both are more secure and reliable methods.
How to export content
Via the paasportal
- Navigate to https://paasportal.episerver.net and select the project you wish to export a database from
- Navigate to the Troubleshoot tab
- In the ‘Export Database’ section, select the environment you wish to export the database from, and how long the paasportal should retain this copy.
- Once the export is done, click the database file to download it as a .bacpac file. These files can then be used to import your database into a local SQL Server instance or an Azure SQL Database.
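As a sketch, a downloaded .bacpac can be imported into a local SQL Server instance with the SqlPackage CLI. The file path, server name, and database name below are placeholders; adjust them for your setup.

```powershell
# Import the exported .bacpac into a local SQL Server instance.
# File path, server name, and database name are placeholders.
SqlPackage /Action:Import `
    /SourceFile:"C:\exports\epicms-Integration.bacpac" `
    /TargetServerName:"localhost" `
    /TargetDatabaseName:"epicms-local" `
    /TargetTrustServerCertificate:True
```

SqlPackage ships with recent versions of SQL Server Management Studio and Visual Studio, and can also be installed as a standalone dotnet tool.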
Via the API with PowerShell
- Navigate to https://paasportal.episerver.net and generate credentials as described here https://docs.developers.optimizely.com/digital-experience-platform/docs/authentication.
- Authenticate with Connect-EpiCloud, for example Connect-EpiCloud -ClientKey <ClientKey> -ClientSecret <ClientSecret> -ProjectId <ProjectId>
- Start a database export with Start-EpiDatabaseExport, for example Start-EpiDatabaseExport -Environment Integration -DatabaseName epicms -Wait
- Fetch the download link for the .bacpac with Get-EpiDatabaseExport
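Put together, the steps above can be scripted end to end. This is a sketch that assumes the EpiCloud module is installed and that the export object exposes an Id and a download link; the exact property names (such as DownloadLink) are assumptions, so inspect the cmdlet output in your environment.

```powershell
# Sketch: export a DXP database and download the .bacpac locally.
# ClientKey, ClientSecret, and ProjectId values are placeholders.
Connect-EpiCloud -ClientKey "<ClientKey>" -ClientSecret "<ClientSecret>" -ProjectId "<ProjectId>"

# -Wait blocks until the export has finished
$export = Start-EpiDatabaseExport -Environment Integration -DatabaseName epicms -Wait

# Fetch the export details, including the download link
$details = Get-EpiDatabaseExport -Id $export.Id -Environment Integration -DatabaseName epicms

# Download the .bacpac (the DownloadLink property name is an assumption - inspect $details)
Invoke-WebRequest -Uri $details.DownloadLink -OutFile "epicms-Integration.bacpac"
```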
Via the API you can also download BLOBs from the storage account: Get-EpiStorageContainer lists all storage containers, and Get-EpiStorageContainerSasLink creates a SAS URI that can be used to download BLOBs. For example,
Get-EpiStorageContainerSasLink -ProjectId "2372b396-6fd2-40ca-a955-57871fc497c9" `
-Environment "Integration" `
-StorageContainer "mysitemedia" `
-RetentionHours 2
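The SAS link returned by the cmdlet can then be used to pull the BLOBs down locally, for example with AzCopy. This is a sketch; the property name holding the URI (sasLink below) is an assumption, so inspect the cmdlet's output to confirm it.

```powershell
# Sketch: get a SAS link for a container and copy its BLOBs locally with AzCopy.
$link = Get-EpiStorageContainerSasLink -Environment "Integration" `
    -StorageContainer "mysitemedia" `
    -RetentionHours 2

# The property holding the URI may differ - inspect $link to confirm
azcopy copy $link.sasLink ".\mysitemedia" --recursive
```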
This change to the accessibility of the SQL instances in the Integration environment is disappointing. Prior to us joining the project, Opti Expert Services set up one of our customers with a development model that strongly prefers connecting to the Integration database when running the solution locally. We've run successfully with local databases, but this change to DXP will require us to change messaging and other systems to keep local databases in sync with backend systems.
I never thought you should connect to DXP DBs at any point at all, actually. Using client data should not be needed for development purposes. BUT if you do use this and download a database, a disclaimer could be handy: if you download a client DB, you will most likely be dealing with PII data, and therefore might be distributing information that, in case of a breach, your company most likely would not like you to be distributing as a developer.
It's crucial to stay informed and understand the ins and outs of personal data before downloading a client database, at least in my opinion. And if you do, have scripts ready to remove that information, or have a Data Processing Agreement in place. :)