
Henrik Fransas
May 27, 2015

Automatic schema updates with backup for SQL Azure

As of version 8.7 of EPiServer, you can allow EPiServer to apply schema changes to the database automatically whenever a new version includes any. You can read more about it here:

http://world.episerver.com/documentation/Items/Developers-Guide/EPiServer-CMS/8/Deployment/automatic-schema-updates/

This is great news, because schema changes have always been a problem when deploying to a production environment where you might not be allowed to run SQL scripts against the database, and it has also been hard to know whether a release contains schema changes at all. Now EPiServer can handle all of that for us automatically, which is great, but it comes with a price. The things I think you need to consider are these:

  • The user MUST now be dbo of the database, otherwise the update will fail. Until now it was possible to limit database access to datawriter and datareader with execute rights on all objects in the dbo schema. This will probably not be a problem for most sites, since most already connect with a user that is dbo.
  • If it all runs automatically, we do not know when it is extra important to make a backup of the database before deploying (I recommend that you do that for every deployment; otherwise it is very hard to roll back to the version before the deployment, since EPiServer may have made a lot of data changes to the database on the first run of the site).

Luckily for us, EPiServer has thought of the second problem and made it possible to implement our own logic that runs before the update happens. There are some great examples in the documentation showing how to automatically create a backup file of the database before updating it. The problem, when you have your database in Azure, is that Azure SQL Database does not support backups the way a regular SQL Server does.

Azure SQL Database automatically creates backups of every active database: a full backup is taken every hour, and transaction log backups are taken every 5 minutes. This gives you the opportunity to do point-in-time restores from the Azure portal, but it does not let you trigger a full backup whenever you want. You can read more about it here:
https://msdn.microsoft.com/en-us/library/azure/jj650016.aspx

There are ways to work around not being able to do a normal backup, but they are a little more tricky. The way I solve it is to run a database copy query, which gives me a copy of the database, so I get my backup, just not as a backup file. This works, but there is a cost aspect to it, because another database is much more expensive than plain storage. Luckily this is Azure, so we only pay for what we use while we use it. Therefore this automatic step comes with a manual step: create a "backup file" of the copied database by exporting it to a bacpac file (https://msdn.microsoft.com/en-us/hh213241.aspx).

So let's show how to do this in code.

Start by upgrading your development environment to the latest version of EPiServer. Run it locally and you will first see this error:

The database 'EPiServerDB' has not been updated to the version '7019.0', current database version is ....

This is expected behavior, so do not worry.

Then change your episerver.framework element to look like this:

<episerver.framework updateDatabaseSchema="true">

Then, somewhere in your solution (preferably in the folder where you keep your other initialization code), create a new class that implements IDatabaseSchemaValidator. On first creation it should look something like this:

using System;
using System.Configuration;
using EPiServer.Data.SchemaUpdates;

namespace AutomaticSchemaUpdateWithAzureDBBackup.Business.Initialization
{
    public class BackupDatabaseValidator : IDatabaseSchemaValidator
    {
        public bool IsDatabaseUpdateAllowed(ConnectionStringSettings connectionStringSettings)
        {
            throw new NotImplementedException();
        }

        public void BeforeUpdating(ConnectionStringSettings connectionStringSettings)
        {
            throw new NotImplementedException();
        }
    }
}

Important!
You also need to tell StructureMap to use your class. Do that by adding something like this to your dependency setup:

container.For<IDatabaseSchemaValidator>().Use<BackupDatabaseValidator>();
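
If your project is based on the Alloy templates, this registration typically lives in the existing dependency resolver initialization module. Below is a minimal sketch of such a module; the class name DependencyResolverInitialization and the overall structure are assumptions borrowed from the Alloy template, so adapt it to however your solution configures StructureMap:

using EPiServer.Data.SchemaUpdates;
using EPiServer.Framework;
using EPiServer.Framework.Initialization;
using EPiServer.ServiceLocation;
using StructureMap;

namespace AutomaticSchemaUpdateWithAzureDBBackup.Business.Initialization
{
    // Sketch based on the Alloy template's dependency resolver module; adapt to your own setup.
    [InitializableModule]
    public class DependencyResolverInitialization : IConfigurableModule
    {
        public void ConfigureContainer(ServiceConfigurationContext context)
        {
            context.Container.Configure(ConfigureContainer);
        }

        private static void ConfigureContainer(ConfigurationExpression container)
        {
            // Tell StructureMap to hand EPiServer our validator before schema updates run.
            container.For<IDatabaseSchemaValidator>().Use<BackupDatabaseValidator>();
        }

        public void Initialize(InitializationEngine context) { }

        public void Uninitialize(InitializationEngine context) { }

        public void Preload(string[] parameters) { }
    }
}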

As you can see there are two methods that need to be implemented: IsDatabaseUpdateAllowed, which tells EPiServer whether it is allowed to update the schema, and BeforeUpdating, which can do whatever you want and runs before the schema update occurs, so this is where we will put our logic to back up the database.

Start by allowing the schema to be automatically updated by changing IsDatabaseUpdateAllowed to always return true.

Then, in BeforeUpdating, we start by checking whether the database is in Azure, and we do that by looking at the address of the database server like this:

var sqlConStrBuilder = new SqlConnectionStringBuilder(connectionStringSettings.ConnectionString);
if (sqlConStrBuilder.DataSource.ToLowerInvariant().Contains(".database.windows.net,"))
{
    //Do action here
}

This is a pretty simple check and might not be bulletproof, but so far all connection strings I have used for Azure SQL Database have contained that string in the server address.

At this point you should be able to get your development environment working: if you compile and run the site, the error message should be gone.

Now to the problem with Azure and backups. To be able to copy the database you need to be dbo in the database you are copying from, and you also need the right to create databases. Hopefully the user your site runs with does not have those rights (otherwise you are pretty much running your site with the sa account, and that is a big no-no!), so we need to add another connection string to the site. The important thing here is that you do not put any sensitive information in your web.config, because you will not need to. Instead you add this connection string in the configuration settings of your site in Azure; that way the information is protected and not visible to anyone (at runtime, Azure replaces the connection strings defined in web.config with the ones from the portal configuration).

After doing that we update the class so it looks like this:

using System;
using System.Configuration;
using System.Data.SqlClient;
using EPiServer.Data.SchemaUpdates;

namespace AutomaticSchemaUpdateWithAzureDBBackup.Business.Initialization
{
    public class BackupDatabaseValidator : IDatabaseSchemaValidator
    {
        public bool IsDatabaseUpdateAllowed(ConnectionStringSettings connectionStringSettings)
        {
            return true;
        }

        public void BeforeUpdating(ConnectionStringSettings connectionStringSettings)
        {
            // Only take a copy when the site database is hosted in Azure SQL Database.
            var sqlConStrBuilder = new SqlConnectionStringBuilder(connectionStringSettings.ConnectionString);
            if (!sqlConStrBuilder.DataSource.ToLowerInvariant().Contains(".database.windows.net,")) return;

            // Connection string with CREATE DATABASE rights, added in the Azure portal rather than in web.config.
            var azureConnectionWithCreateDbAccessRights = ConfigurationManager.ConnectionStrings["azureConnectionWithCreateDbAccessRights"];
            var episerverConnectionString = ConfigurationManager.ConnectionStrings["EPiServerDB"];
            if (azureConnectionWithCreateDbAccessRights != null)
            {
                var builder = new SqlConnectionStringBuilder(episerverConnectionString.ConnectionString);
                using (var connection = new SqlConnection(azureConnectionWithCreateDbAccessRights.ConnectionString))
                {
                    connection.Open();

                    // Copy the site database to a new database named <database>_<timestamp>.
                    var queryString = string.Format("CREATE DATABASE [{0}_{1}] AS COPY OF [{0}]", builder.InitialCatalog, DateTime.UtcNow.ToString("yyyyMMdd_HHmmss"));
                    var command = new SqlCommand(queryString, connection)
                    {
                        // The copy command can take a couple of minutes to return, so use a generous timeout.
                        CommandTimeout = 600
                    };
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}

As you can see, I set the timeout pretty high because it can take a while to get the response back. The copy command is said to be asynchronous and supposed to return fairly quickly, but in my tests with an Alloy site it has taken between 1 and 2 minutes to respond.

Then all you have to do is wait... So doing this in a staging environment first is a good idea, since the upgrade will take longer to complete because it also needs to copy a database.
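
If you would rather check that the copy has finished than just wait, you can poll the state of the new database on the server. The helper below is only a rough sketch and not part of the original setup: the class, the method and the connection string name azureMasterConnection are assumptions, and it simply queries sys.databases until the copy reports ONLINE (while the copy is running, the state is COPYING).

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Threading;

namespace AutomaticSchemaUpdateWithAzureDBBackup.Business.Initialization
{
    // Hypothetical helper: polls the server until the copied database is ONLINE.
    public static class DatabaseCopyStatus
    {
        // Assumes a connection string named "azureMasterConnection" pointing to the master database on the same server.
        public static bool WaitForCopyToComplete(string copyDatabaseName, TimeSpan timeout)
        {
            var masterConnectionString = ConfigurationManager.ConnectionStrings["azureMasterConnection"].ConnectionString;
            var stopAt = DateTime.UtcNow.Add(timeout);

            while (DateTime.UtcNow < stopAt)
            {
                using (var connection = new SqlConnection(masterConnectionString))
                {
                    connection.Open();
                    var command = new SqlCommand("SELECT state_desc FROM sys.databases WHERE name = @name", connection);
                    command.Parameters.AddWithValue("@name", copyDatabaseName);

                    // While the copy is still running the state is COPYING; when it is done it becomes ONLINE.
                    if ((command.ExecuteScalar() as string) == "ONLINE")
                    {
                        return true;
                    }
                }

                Thread.Sleep(TimeSpan.FromSeconds(10));
            }

            return false;
        }
    }
}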

When all this is done and you have seen that it works, create a bacpac file of your copy, save it, and then delete the copied database so you do not have to pay for it any more.
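
If you want to script that clean-up step as well, dropping the copy is a single DROP DATABASE statement over the elevated connection. This is just a sketch: the helper class is hypothetical, and it assumes the azureConnectionWithCreateDbAccessRights connection string can be used for dropping databases on the server (only run it after the bacpac export has completed).

using System.Configuration;
using System.Data.SqlClient;

namespace AutomaticSchemaUpdateWithAzureDBBackup.Business.Initialization
{
    // Hypothetical clean-up helper: removes the database copy once the bacpac file has been exported and saved.
    public static class DatabaseCopyCleanup
    {
        public static void DropCopy(string copyDatabaseName)
        {
            var connectionString = ConfigurationManager.ConnectionStrings["azureConnectionWithCreateDbAccessRights"].ConnectionString;

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();

                // DROP DATABASE does not accept a parameter, so the name is escaped and inlined.
                var queryString = string.Format("DROP DATABASE [{0}]", copyDatabaseName.Replace("]", "]]"));
                var command = new SqlCommand(queryString, connection);
                command.ExecuteNonQuery();
            }
        }
    }
}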


Comments

K Khan May 28, 2015 09:21 AM

Thanks for Sharing Henrik!
I am wondering, can we schedule auto-builds? E.g. I would like this process to run only on working days and at set times, to deal with unforeseen issues.
Will the site remain available during these changes?

Regards
/K

Henrik Fransas May 28, 2015 10:16 AM

This will only happen when you are doing a deployment after updating EPiServer's NuGet packages, so it will happen as part of your deployment to the site.

The site will not be available during this time.

I think Karoline Klever is better suited than I am to explain how best to schedule deployments.
