As a small business, you might rely on an off-the-shelf application or custom software to run your operations. As part of your disaster recovery (DR) process, that data should be backed up frequently, with copies sent offsite so they survive a disaster.
What if I told you that we can automate this process by sending the backups to Microsoft's Azure cloud storage service?
In the first post in this series, Sofia looked at setting up and automating backups to Amazon's AWS cloud storage. In Part 2, I show how I would automate sending backups to Microsoft's Azure Blob Storage and provide a breakdown of the estimated ongoing costs.
Prerequisites
- An Azure subscription
- The ability to set up and run PowerShell scripts, with the Azure Az PowerShell module installed (see the snippet after this list)
- The ability to run Task Scheduler
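If you don't already have the Az module, installing it from the PowerShell Gallery is a one-liner (this assumes PowerShell 5.1 or later with the PowerShellGet module available):

# Install the Az PowerShell module for the current user from the PowerShell Gallery
Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force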
Set up
For the purposes of this demonstration, I've downloaded a copy of Microsoft's SQL Server Express, restored a copy of the OLTP AdventureWorks2019 database backup from Microsoft, and set up a Task Scheduler job to take a full backup of the database once a day, storing the backup in C:\Uploads on my machine.
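The backup step itself can be as simple as the sketch below. I'm assuming the Backup-SqlDatabase cmdlet from the SqlServer module here (a sqlcmd call running BACKUP DATABASE would work just as well); the instance name and the timestamped file name are purely illustrative:

# Sketch: full backup of AdventureWorks2019 into C:\Uploads with a timestamped name
# (requires the SqlServer module: Install-Module -Name SqlServer)
$timestamp = Get-Date -Format 'yyyyMMdd_HHmmss'
Backup-SqlDatabase -ServerInstance '.\SQLEXPRESS' `
    -Database 'AdventureWorks2019' `
    -BackupFile "C:\Uploads\AdventureWorks2019_$timestamp.bak"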
I've also signed up to Azure and have set up a new storage account called backupdemosqlexpress with a new container called backups. Here's what the storage account looks like:
And here's the container where we are going to land our backup files:
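I created the account and container through the Azure portal, but the same setup can be scripted with the Az cmdlets. A rough equivalent is below; the resource group name matches the one used later in the script, while the region and SKU are assumptions on my part:

# Sketch: create the resource group, storage account and container used in this post
New-AzResourceGroup -Name 'backup-demo' -Location 'uksouth'
$storage_account = New-AzStorageAccount -ResourceGroupName 'backup-demo' `
    -Name 'backupdemosqlexpress' `
    -Location 'uksouth' `
    -SkuName 'Standard_LRS'
New-AzStorageContainer -Name 'backups' -Context $storage_account.Context -Permission Off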
Now for the fun part. This is my PowerShell script to compress my backups and send them to Azure. It signs in to Azure, then loops through C:\Uploads looking for any file ending in .bak. Each file is zipped, the original .bak is moved to C:\Uploads\Raw, and the zipped copy is pushed to my storage container:
# Folder containing the .bak files, plus a subfolder for the processed originals
$path = 'C:\Uploads'
$raw_path = Join-Path $path 'Raw'
if (-not (Test-Path $raw_path)) { New-Item -ItemType Directory -Path $raw_path | Out-Null }

# Sign in and grab the storage context once, rather than per file
Connect-AzAccount
$storage_account = Get-AzStorageAccount -ResourceGroupName "backup-demo" -Name "backupdemosqlexpress"
$container_name = 'backups'
$context = $storage_account.Context

Get-ChildItem $path -Filter *.bak |
ForEach-Object {
    $file_name = $_.Name
    $latest_filename_fullpath = $_.FullName
    $zipped_filename = $latest_filename_fullpath + '.zip'

    # Zip the backup alongside the original
    Write-Host "$(Get-Date -Format 'u'): Compressing file $latest_filename_fullpath"
    $compress = @{
        LiteralPath      = $latest_filename_fullpath
        CompressionLevel = 'Fastest'
        DestinationPath  = $zipped_filename
    }
    Compress-Archive @compress

    # Move the original .bak into the Raw subfolder
    $destination_file_path = Join-Path $raw_path $file_name
    Write-Host "$(Get-Date -Format 'u'): Moving file $latest_filename_fullpath to $destination_file_path"
    Move-Item -Path $latest_filename_fullpath -Destination $destination_file_path

    # Upload the zip to the storage container, using just the file name as the blob name
    Write-Host "$(Get-Date -Format 'u'): Sending file to blob $zipped_filename"
    $Blob = @{
        File             = $zipped_filename
        Container        = $container_name
        Blob             = $file_name + '.zip'
        Context          = $context
        StandardBlobTier = 'Hot'
    }
    Set-AzStorageBlobContent @Blob
}
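One thing to be aware of: Connect-AzAccount as written pops an interactive login, which is fine when you run the script by hand but not from a scheduled task. For unattended runs you would typically sign in with a service principal instead, along these lines (the application ID, tenant ID and secret below are placeholders you would create yourself as an app registration):

# Sketch: non-interactive sign-in with a service principal for scheduled runs
# (the IDs and secret are placeholders - create your own app registration first)
$app_id     = '00000000-0000-0000-0000-000000000000'
$tenant_id  = '00000000-0000-0000-0000-000000000000'
$secret     = ConvertTo-SecureString 'your-client-secret' -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($app_id, $secret)
Connect-AzAccount -ServicePrincipal -Credential $credential -Tenant $tenant_id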
Here's the script in action:
And here are my backups landed in Azure:
Like Sofia, I've set up a daily job via Task Scheduler to run my PowerShell script so this happens automatically for me.
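If you'd rather script the scheduler side too, the ScheduledTasks cmdlets built into Windows can register the job. Here's a sketch; the script path, task name and 2am trigger are my assumptions:

# Sketch: register a daily task that runs the upload script at 02:00
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\Send-BackupsToAzure.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'Send SQL backups to Azure' -Action $action -Trigger $trigger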
Cost estimate
Microsoft divides its Blob Storage pricing into three access tiers:
- hot - for data that's accessed frequently (most expensive tier)
- cool - for data that's accessed infrequently
- archive - for data that's rarely accessed (cheapest tier)
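Side note: you don't have to move blobs between tiers by hand. A lifecycle management policy on the storage account can handle the transitions automatically. Here's a rough sketch using the Az.Storage management policy cmdlets, following the schedule I assume in the cost estimate below (the day thresholds and the 7-year deletion are illustrative):

# Sketch: tier blobs to cool after 7 days, archive after 37 days, delete after ~7 years (2555 days)
$action = Add-AzStorageAccountManagementPolicyAction -BaseBlobAction TierToCool -DaysAfterModificationGreaterThan 7
$action = Add-AzStorageAccountManagementPolicyAction -InputObject $action -BaseBlobAction TierToArchive -DaysAfterModificationGreaterThan 37
$action = Add-AzStorageAccountManagementPolicyAction -InputObject $action -BaseBlobAction Delete -DaysAfterModificationGreaterThan 2555
$filter = New-AzStorageAccountManagementPolicyFilter -PrefixMatch 'backups/' -BlobType blockBlob
$rule   = New-AzStorageAccountManagementPolicyRule -Name 'tier-backups' -Action $action -Filter $filter
Set-AzStorageAccountManagementPolicy -ResourceGroupName 'backup-demo' -StorageAccountName 'backupdemosqlexpress' -Rule $rule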
To make a direct pricing comparison with Sofia's AWS estimate, I've assumed we push a 10 GB backup file to the cloud each day. Assuming we land our backups in the hot tier, move them to cool after 7 days, move them to archive 30 days after that, and keep them for a total of 7 years, the price breakdown looks as follows:
Results
As you can see, the cost of maintaining daily 10 GB backups in Azure comes to around $1,150 - almost double the AWS figure. If you're looking to minimise cost while automating your backups, AWS wins in this instance.