
Elias Lundmark
Mar 21, 2024

Keeping local environments in sync with your Cloud environments

We recently announced that we are improving the scalability of SQL databases in DXP Cloud Services. The new architecture also enhances overall security for SQL databases: we are hardening technical controls to maintain the confidentiality and integrity of our customers' data. This change has an unintended consequence, though — it prevents developers from connecting local development environments directly to SQL databases in DXP Cloud Services. We strongly advise against that practice in any case. While it is easy and flexible, manually managing and storing connection strings and credentials for service users greatly increases the risk of those credentials falling into the wrong hands, allowing potential attackers to access or modify data.

To avoid these risks, the new architecture disallows direct connections from third-party sources to SQL Servers running in DXP Cloud Services. Instead, export your databases and content through the paasportal or the API for use in local development environments; both are more secure and reliable methods.

How to export content

Via the paasportal

  1. Navigate to https://paasportal.episerver.net and select the project you wish to export a database from
  2. Navigate to the Troubleshoot tab
  3. In the ‘Export Database’ section, select the environment you wish to export the database from, and how long the paasportal should retain this copy.
  4. Once the export is done, click the database file to download it as .bacpac. These files can then be used to import your database to a local SQL server, or an Azure SQL Server.
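The downloaded .bacpac can be imported into a local SQL Server instance with Microsoft's SqlPackage tool. A minimal sketch, assuming SqlPackage is installed and on your path — the file path, server name, and database name are placeholders to adjust for your setup:

```powershell
# Import the exported .bacpac into a local SQL Server instance.
# Paths and names below are placeholders.
SqlPackage /Action:Import `
  /SourceFile:"C:\exports\epicms.bacpac" `
  /TargetServerName:"localhost" `
  /TargetDatabaseName:"epicms-local" `
  /TargetTrustServerCertificate:True
```

The same command works against an Azure SQL Server by pointing /TargetServerName at the Azure SQL endpoint and adding the appropriate authentication parameters.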

Via the API with PowerShell

  1. Navigate to https://paasportal.episerver.net and generate API credentials as described at https://docs.developers.optimizely.com/digital-experience-platform/docs/authentication.
  2. Authenticate with Connect-EpiCloud: Connect-EpiCloud -ClientKey <ClientKey> -ClientSecret <ClientSecret> -ProjectId <ProjectId>
  3. Start a database export with Start-EpiDatabaseExport, for example: Start-EpiDatabaseExport -Environment Integration -DatabaseName epicms -Wait
  4. Fetch the download link for the .bacpac with Get-EpiDatabaseExport.
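Put together, the steps above can be sketched as one small script. This is a sketch based on the EpiCloud module's cmdlets; the key, secret, and project ID are placeholders, and the exact properties returned by Start-EpiDatabaseExport may differ per module version:

```powershell
# Requires the EpiCloud module: Install-Module EpiCloud -Scope CurrentUser
Connect-EpiCloud -ClientKey "<ClientKey>" -ClientSecret "<ClientSecret>" -ProjectId "<ProjectId>"

# Start the export and wait for it to complete
$export = Start-EpiDatabaseExport -Environment Integration -DatabaseName epicms -Wait

# Fetch the export status, including the .bacpac download link
Get-EpiDatabaseExport -Id $export.id -Environment Integration -DatabaseName epicms
```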

Via the API you can also download BLOBs from the storage account: Get-EpiStorageContainer lists all storage containers, and Get-EpiStorageContainerSasLink creates a SAS URI that can be used to download BLOBs. For example:

Get-EpiStorageContainerSasLink -ProjectId "2372b396-6fd2-40ca-a955-57871fc497c9" `
  -Environment "Integration" `
  -StorageContainer "mysitemedia" `
  -RetentionHours 2
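The returned SAS link can then be used to copy the container's blobs locally, for example with AzCopy. A sketch, assuming AzCopy is installed; the sasLink property name on the returned object is an assumption and may differ per EpiCloud module version:

```powershell
# Request a SAS link for the container, then pull its blobs down with AzCopy
$link = Get-EpiStorageContainerSasLink -ProjectId "2372b396-6fd2-40ca-a955-57871fc497c9" `
  -Environment "Integration" -StorageContainer "mysitemedia" -RetentionHours 2

# Recursively copy every blob in the container into a local folder
azcopy copy $link.sasLink ".\mysitemedia" --recursive
```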


Comments

Drew Douglas
Drew Douglas Mar 21, 2024 07:55 PM

This change to the accessibility of the SQL instances in the Integration environment is disappointing. Prior to us joining the project, Opti Expert Services set up one of our customers with a development model that strongly prefers connecting to the Integration database when running the solution locally. We've run successfully with local databases, but this change to DXP will require us to change messaging and other systems to keep local databases in sync with backend systems.

Eric
Eric Apr 3, 2024 09:48 PM

Never thought you should connect to DXP databases at any point at all, actually. Using client data should not be needed for development purposes. BUT if you use this and download a database, a disclaimer could be handy: if you download a client db, you will most likely be dealing with personal data, and as a developer you might therefore be distributing information in a way your company most likely does not want in case of a breach.

It's crucial to stay informed and understand the ins and outs of personal data before downloading a client database, at least in my opinion — and if you do, have scripts ready to remove that information, or have a Data Processing Agreement in place. :)
