Launch your database into Azure SQL Database – bacpac edition

You have decided to migrate your on-premises SQL Server database into Azure SQL Database. How do you do that? The first thought that comes to mind is a simple backup from on-premises and restore to Azure SQL Database. Unfortunately, that is not supported by Microsoft. Instead, there are three supported methods of migrating into Microsoft's Platform as a Service (PaaS) offering.

  1. Export and restore from a bacpac file (data and schema).
  2. Export and deploy a dacpac file (schema only) and then ETL your data.
  3. Transactional Replication.

This post will cover method #1, exporting and restoring from a bacpac file.

Migrating to Azure SQL Database

The result of this method is that a bacpac file will live on Azure Blob Storage and be imported into a logical server which will house your new Azure SQL Database. There are three paths to follow, however. All three will be covered in this post. I have sorted them from most GUI to least GUI.

Tools and versions

This method focuses on a feature of SQL Server Management Studio (SSMS) and the DAC framework. For the demonstration, SSMS v17.0 and the DAC framework packaged with SQL Server vNext (2017) will be used, and the database being exported is also a SQL Server vNext database. These versions, however, are not the minimum requirements.

The DAC framework versions prior to 2016 can export databases going back to SQL Server 2000. The 2016+ versions of the DAC framework support SQL Server 2005 and later. I expect this type of rolling deprecation of support to continue as the framework evolves.

The first version of SSMS to support exporting bacpacs was packaged with SQL Server 2008 R2. While that version technically supports some of these methods, there are few reasons not to use the latest version of SSMS. Even when one of those reasons applies, my recommendation is to use the latest version for every function it can handle and fall back to the legacy version only for the specific need that requires it.

Deploy Database to Microsoft Azure SQL Database

If only a few databases need to migrate to Azure SQL Database, this method is the most convenient. It is the "do it all in one go" method.

Building the SQL Server

With Azure SQL Database, you do not have a server like you do with Infrastructure as a Service (IaaS). However, you do need a logical server to connect to and to attach your database(s) to.

Log into the Azure portal. Click the New button on the left navigation bar. Search for SQL Server and select SQL server (logical server). Click Create one last time to bring up the creation blade.

The create server blade will now be open. Fill out the sections as follows.

  1. The server name must be unique and is the name that you will specify in all Azure SQL Database connections.
  2. A SQL authentication sysadmin user with a strong password is required for all Azure SQL Databases. Windows authentication is not an option.
    1. Azure Active Directory access can be configured but that is out of scope for this post.
  3. Select the necessary subscription. Typically, organizations will have more than one and you may have access to more than one. Make sure to pick the correct one because it will likely affect how your organization deals with accounting.
  4. A resource group is a container used to group related Azure resources. Create new works if you do not have an existing group to assign this server to. I had already created a resource group, so I selected Use existing.
  5. Pay careful attention to the Location. There are a number of ways that Microsoft makes money from Azure, and one of them is data egress (data leaving a particular Azure data center). Network traffic internal to an Azure data center is free, so if you spread related resources across multiple Azure regions, you might end up paying more than necessary.
  6. Click Create.
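If you prefer to script this step instead, the same logical server can be created with the AzureRM PowerShell module. Below is a minimal sketch; the resource group, location, and server name are the demonstration values used later in this post.

# Create the resource group and logical SQL Server (sketch; demonstration names)
Import-Module AzureRM
Login-AzureRmAccount # Opens a login prompt

$resourcegroupname = "LaunchDB-bacpac-RG"
$location = "East US"
$servername = "launchdb-bacpac"

New-AzureRmResourceGroup -Name $resourcegroupname -Location $location

# Prompts for the SQL authentication admin login and strong password
$credential = Get-Credential

New-AzureRmSqlServer -ResourceGroupName $resourcegroupname `
-ServerName $servername `
-Location $location `
-SqlAdministratorCredentials $credential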

It takes about five minutes for the server to build. Next, the client IP address will need to be whitelisted in order to connect to it.

  1. Select the firewall page when scoped to your SQL Server.
  2. I typically set Allow access to Azure services to OFF by default and only turn it back on if needed.
  3. The firewall page displays the public IP address of the network you are connecting from.
  4. Create a firewall rule which allows access to the IP address or range that you need.
  5. Do not forget to click Save.
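For scripted environments, the same firewall rule can be created with PowerShell. A short sketch, reusing the variables from the server creation sketch above; the IP address is a placeholder to substitute with your own.

# Whitelist a client IP address (placeholder address)
New-AzureRmSqlServerFirewallRule -ResourceGroupName $resourcegroupname `
-ServerName $servername `
-FirewallRuleName "ClientIP" `
-StartIpAddress "203.0.113.10" `
-EndIpAddress "203.0.113.10"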

Exporting the bacpac via SSMS wizard

Right-click on the database you want to export and then select Tasks, followed by Deploy Database to Microsoft Azure SQL Database.

The wizard has three major areas to populate.

  1. The connection string for the logical SQL Server is <servername>.database.windows.net on port 1433. This can also be found on the properties page of the logical SQL Server in the Azure portal.
  2. Select your pricing tier. What tier to choose is out of scope for this post, but it is important to know that the pricing tier will impact the speed of the bacpac import. If you have a large database, it would be wise to choose a high pricing tier for the import and then reduce the tier once the import is complete to match your throughput needs.
  3. Specify a temporary location to write the bacpac file. Make sure there is enough disk space to hold the entire database uncompressed.

Completing the process just requires clicking Next, Finish, and waiting.
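When the wizard completes, connectivity can be verified with sqlcmd. A quick sketch, assuming the demonstration server name, database, and admin credentials used in this post:

# Verify connectivity to the new Azure SQL Database (demonstration names)
sqlcmd -S "launchdb-bacpac.database.windows.net,1433" -d "AdventureWorks2014" -U "SQLHammer" -P "superSECRETpassword" -Q "SELECT DB_NAME();"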

Export Data-tier Application

It is time to move on to the second method of exporting bacpac files. It is similar to the first: a bacpac file is exported to a temporary local directory and then uploaded directly to Azure.

This method is more work than the first, but it is useful for organizations with separation of duties within their Azure subscriptions. For example, a non-privileged user might have access to export and upload a bacpac file but need a DBA to import it into the correct SQL Server and set the pricing tier.

Create a storage account

Similar to creating a SQL Server, use the New button to open the new resource blade and then search for storage. Select Storage account – blob, file, table, queue and then click Create.

The create storage account blade will now be open. Fill out the sections as follows.

  1. The name of the storage account is globally unique and will be part of your URLs later in this post.
  2. Always select Resource manager. Microsoft is moving away from the classic deployment model, and Resource Manager should always be preferred.
  3. Select the General purpose storage account type. It is the only supported option for this process.
  4. Select Standard. Premium storage is not supported for this method.
  5. For a bacpac export, you should not need any disaster recovery options, so I recommend LRS.
  6. Encrypting the company’s data is always good. Enable both.
  7. Choose the same subscription, resource group, and Azure region that were selected for the SQL Server.
  8. Click Create.

After a few minutes, the storage account will be ready for use.
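As with the logical server, the storage account can be created with PowerShell. A minimal sketch, assuming a recent AzureRM module and the demonstration names used later in this post:

# Create a general purpose, locally redundant (LRS) storage account (sketch)
$StorageAccount = "bacpacrepository"

New-AzureRmStorageAccount -ResourceGroupName $resourcegroupname `
-Name $StorageAccount `
-Location $location `
-SkuName "Standard_LRS" `
-Kind "Storage"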

The storage account is analogous to a SAN or storage array. A container is analogous to a LUN or drive, and folders can be created within a container. A container must be created to accept the bacpac export.

Navigate to the storage account. This can be accomplished in a number of ways. I prefer to search for its name in the top menu bar or navigate to it via the resource groups button on the left navigation bar. Once there, go to the container page and click +Container.

Name your container; it must be unique within the storage account. Set the access type to Private so that only users with the proper access key will be able to read and/or write. Then, click OK.
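The container can be created from PowerShell as well, using the Azure.Storage cmdlets. A sketch which assumes the storage account key is already in $StorageKey; retrieving the key is covered in the next section.

# Create a private container to hold the bacpac (sketch)
$ContainerName = "bacpacs"
$context = New-AzureStorageContext -StorageAccountName $StorageAccount -StorageAccountKey $StorageKey
New-AzureStorageContainer -Name $ContainerName -Permission Off -Context $context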

Access key

Before jumping into the Export Data-tier Application wizard, copy down the storage account key. In the Azure portal, navigate to your storage account and view the Access keys page. Copy down either of the two keys.
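If you would rather not copy the key out of the portal, it can be retrieved with PowerShell, which also avoids storing the key in plain text:

# Retrieve the first access key for the storage account
$StorageKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourcegroupname -StorageAccountName $StorageAccount).Value[0]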

SSMS wizard

Swap back to SSMS to export a data-tier application. Verify that there is enough disk space to copy the entire uncompressed database and then right-click on the database and select Tasks > Export Data-Tier Application.

Populate the Settings tab of the wizard as follows.

  1. Connect to the storage account.
    1. Paste in the key that was copied above.
  2. Select the container from the drop-down box and choose a name for the bacpac file.
  3. Choose the local directory for the temporary file.

It is possible to export the bacpac file locally without transmitting it to Azure. It would then have to be copied to an Azure storage account by the user. Copying to Azure storage won’t be covered in this section but will be covered later in the command line interface method.

If the database is to be uploaded only in part, the Advanced tab allows for schemas and/or objects to be deselected.

After clicking Next > Finish, the bacpac file will be transmitted into Azure.

Import bacpac into Azure SQL logical server

In the Azure portal, navigate to the logical SQL Server and select Import database from the Overview page.

  1. Select the uploaded bacpac from the storage account and container where it resides.
  2. Choose the necessary pricing tier.
  3. Name the database.
  4. Populate the logical SQL Server admin user and password.
  5. Click OK.

When the import is complete, the database will show up in the SQL Server’s database list.
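After the import completes, the new database and its pricing tier can also be confirmed from PowerShell. A sketch using the demonstration names from this post:

# Confirm the imported database and its current service objective
Get-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname `
-ServerName $servername `
-DatabaseName "AdventureWorks2014"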

Command line interface (CLI)

The recommended method for working with Azure is always PowerShell. The Azure portal and SSMS are there for your convenience, but they do not scale well. If you have multiple databases to migrate, potentially from multiple servers, using PowerShell will be much more efficient. Scripting your Azure work makes it repeatable and works towards the Infrastructure as Code concept.

In this demonstration, the steps below will be used.

  1. Export the bacpac file to a local directory with sqlpackage.exe.
  2. Copy the bacpac to Azure Blob Storage with AzCopy.exe.
  3. Use the PowerShell AzureRM module and cmdlets to create an Azure SQL Database from the bacpac file.

SqlPackage.exe

Exporting your bacpac is quite simple. With the DAC framework installed, the sqlpackage.exe utility can be called with the Export action type. The below script will export the bacpac file to a local directory. The console output is fairly verbose and shows the order of operations: the schema is exported first, followed by the data.

# Export bacpac file to local directory

$DACversion = "140"
$DACpath = "C:\Program Files (x86)\Microsoft SQL Server\$DACversion\DAC\bin"
$ServerInstance = "SQLHAMMERLAPTOP\SQL2017CTP20"
$Database = "AdventureWorks2014"
$OutputFileName = "AdventureWorks2014.bacpac"
$OutputDir = "C:\Backups\"
$OutputPath = Join-Path $OutputDir $OutputFileName

Set-Location $DACpath
.\sqlpackage.exe /Action:Export /SourceDatabaseName:$Database /SourceServerName:$ServerInstance /TargetFile:$OutputPath /OverwriteFiles:True

NOTE: I will be reusing the variables in my next couple of scripts without re-instantiating them. The full script is available for download at the end of the article.

AzCopy.exe

AzCopy.exe is a utility designed to copy objects to, from, and between Azure storage accounts. The syntax is straightforward and the documentation is full of examples for various use cases. The below script uses this utility to copy the bacpac file from the local directory to Azure Blob Storage.

# Copy bacpac file to Azure storage account

# Download and install Microsoft Azure Storage Tools
# http://aka.ms/downloadazcopy
$AzCopyDir = "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"

$resourcegroupname = "LaunchDB-bacpac-RG"
$StorageAccount = "bacpacrepository"
$ContainerName = "bacpacs"
$StorageURI = "https://$StorageAccount.blob.core.windows.net/$ContainerName/"

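# Placeholder: substitute one of your storage account access keys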
$StorageKey = "mykey"

Set-Location $AzCopyDir
.\AzCopy.exe /Source:$OutputDir /Dest:$StorageURI /DestKey:$StorageKey /Pattern:$OutputFileName

AzureRM

The last step in this process is to import the bacpac into the logical SQL Server which was previously created. Azure is designed for PowerShell management and the AzureRM PowerShell module has a lot of convenient cmdlets to use.

The below script will:

  1. Import the bacpac file to a Standard S3 Azure SQL Database.
    1. Standard S3 is being used to improve the performance of the import process.
  2. Monitor the progress in a loop.
  3. Downgrade the price tier to Basic.
    1. Basic is the long-term intended price tier for this particular application.

NOTE: In this part of the script, the Get-AzureRmStorageAccountKey cmdlet is used so that the real account key does not need to be stored in the script in plain text. The same method can be used above with the AzCopy step, but then the Login-AzureRmAccount cmdlet will need to precede AzCopy. This method is shown in the full script download linked below.

# Import bacpac into an Azure SQL Database

# Requires the AzureRM module
# https://docs.microsoft.com/en-us/powershell/azure/install-azurerm-ps?view=azurermps-4.1.0

Import-Module AzureRM

# This pops open a GUI for login.
# Making the login process silent and unattended is out of scope for this article.
Login-AzureRmAccount

$location = "East US"
$adminlogin = "SQLHammer"
$password = "superSECRETpassword"
$servername = "launchdb-bacpac"

# Import bacpac to database with an S3 performance level
$importRequest = New-AzureRmSqlDatabaseImport -ResourceGroupName $resourcegroupname `
-ServerName $servername `
-DatabaseName $Database `
-DatabaseMaxSizeBytes "262144000" `
-StorageKeyType "StorageAccessKey" `
-StorageKey $(Get-AzureRmStorageAccountKey -ResourceGroupName $resourcegroupname -StorageAccountName $StorageAccount).Value[0] `
-StorageUri "$StorageURI$OutputFileName" `
-Edition "Standard" `
-ServiceObjectiveName "S3" `
-AdministratorLogin $adminlogin `
-AdministratorLoginPassword $(ConvertTo-SecureString -String $password -AsPlainText -Force)

# Check import status and wait for the import to complete
$importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
Write-Host "Importing" -NoNewline
while ($importStatus.Status -eq "InProgress")
{
    $importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    Write-Host "." -NoNewline
    Start-Sleep -s 10
}
Write-Host ""
$importStatus

# Scale down to Basic after import is complete
Set-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname `
-ServerName $servername `
-DatabaseName $Database  `
-Edition "Basic" `
-RequestedServiceObjectiveName "Basic"

Download full script here.

Takeaways

Covering the spectrum from graphical user interface to command line interface, there are multiple methods of exporting and importing bacpac files into an Azure SQL Database. The bacpac file method is simple and convenient, but it assumes that the database can be exported to an on-premises disk before being imported. It also assumes that the database can be quiesced; otherwise, changes made after the export begins will be lost when cutting over to the Azure SQL Database.

The dacpac method with ETL solves the problem of exporting large quantities of data to on-premises shares, and the transactional replication method can solve the downtime issue. Look into those methods next to understand all the options available.

  • Export and deploy a dacpac file (schema only) and then ETL your data.
  • Transactional Replication.
