Cathy and I thought we would do something a little different this time around: two blog posts about the same problem, with two different solutions. These days you can easily get the impression that most companies use SaaS (Software as a Service), rather than on-premises applications with a database back-end that is also on premises. Companies deal with this differently; some have mature backup solutions, while some simply back up the database, put it on something akin to a memory stick, and take it home.
It happens, and we can understand why. These two blog posts attempt to show how we can easily, and on a limited budget, implement better backup processes. Cathy's post will show how to do this using Microsoft's cloud, Azure, whereas this post will use Amazon's cloud, AWS.
AWS Setup
We will start here: putting your backups with a cloud provider means you need to make sure the data is secure. I recommend configuring AWS Control Tower so that your S3 buckets have some guardrails - we will not go into configuring this in this blog post - see the reading material on AWS Control Tower.
Once Control Tower has been configured and deployed, go ahead and create your S3 bucket, making sure that public access is blocked.
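If you would rather script this step, the bucket can also be created with the AWS CLI (installed further down). A minimal sketch, assuming the bucket name mycloudbackupbucket used later in this post and eu-west-2 as an example region:

aws s3api create-bucket --bucket mycloudbackupbucket --region eu-west-2 --create-bucket-configuration LocationConstraint=eu-west-2
aws s3api put-public-access-block --bucket mycloudbackupbucket --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

The second command is the CLI equivalent of ticking "Block all public access" in the console.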
To access the bucket and put our backups in it, a new account needs to be created. To create an IAM user with access to Amazon S3, you first need to log in to the AWS IAM console; type iam into the search bar to navigate there easily.
Under the Access management group, click on Users.
Next, click on Add user.
Type in a user name and tick the Access key check box, as the backup file will be moved via script and therefore needs programmatic access.
Set the permissions; we have selected the AmazonS3FullAccess managed policy to keep things simple, though a policy scoped to just this bucket would be more secure.
On the next page you will be presented with an Access Key ID and a Secret Access Key. Record these in your password safe, as they will be used in the next step.
Install the AWS CLI
AWS has some great documentation for installing the CLI; I find the simplest and quickest option is to run the below in a cmd prompt:
msiexec.exe /i https://awscli.amazonaws.com/AWSCLIV2.msi
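Once the installer finishes, confirm the CLI is available by opening a new cmd prompt and checking the version:

aws --version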
Once installed, run the below to configure access using the keys generated when creating the IAM user.
aws configure
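aws configure will prompt for four values: the two keys from the IAM step, plus a default region and output format. The region and output format below are just example choices:

AWS Access Key ID [None]: AKIA................
AWS Secret Access Key [None]: ****************
Default region name [None]: eu-west-2
Default output format [None]: json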
Now confirm access and setup by running this command to list the available S3 buckets:
aws s3 ls
Backing up our database
As we do not have SQL Server Agent available when running SQL Server Express, we will use PowerShell to back up the database. In our example we will use the AdventureWorks sample database.
$instance = "FREYA\SQLEXPRESS" #This is your computer name and the name of the instance
$db = "AdventureWorks2019" # This is your database name
$datestring = (Get-Date -Format 'yyyyMMddHHmm')
$backupfile = "C:\SQLBackup\$db-$datestring.bak" #This is where you want your backups to go
Backup-SqlDatabase -ServerInstance $instance -Database $db -BackupFile $backupfile
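One prerequisite worth noting: Backup-SqlDatabase comes from the SqlServer PowerShell module, so if the cmdlet is not recognised on your machine, install the module from the PowerShell Gallery first:

Install-Module -Name SqlServer -Scope CurrentUser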
This very simple script will back up the selected database to the specified location. As SQL Server Express does not include backup compression, we will add it ourselves to keep the files small and portable for the move to the cloud. Helpfully, PowerShell has this in hand with the Compress-Archive cmdlet:
$instance = "FREYA\SQLEXPRESS" #This is your computer name and the name of the instance
$db = "AdventureWorks2019" # This is your database name
$datestring = (Get-Date -Format 'yyyyMMddHHmm')
$backupfile = "C:\SQLBackup\$db-$datestring.bak" #This is where you want your backups to go
$zipfile = "C:\SQLBackup\$db-$datestring.zip" #Compressed copy to upload, kept alongside the .bak
Backup-SqlDatabase -ServerInstance $instance -Database $db -BackupFile $backupfile
Compress-Archive -Path $backupfile -CompressionLevel Optimal -DestinationPath $zipfile
One caveat worth knowing: in Windows PowerShell 5.1, Compress-Archive cannot compress files larger than 2GB (a limit of the underlying .NET API), so if your backups approach the Express size limit you may need a different tool such as 7-Zip.

Now that we have a backup and a compressed copy, we are ready to move the file. Go ahead and add a folder in your S3 bucket if you want some structure; I have added one called backups.
aws s3 cp $zipfile s3://mycloudbackupbucket/backups/
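As noted at the end of this post, the script has no alerting built in, so as a minimal safeguard you could check the CLI's exit code after the upload and log any failure. A sketch, where the log file location is just an assumption:

if ($LASTEXITCODE -ne 0) {
    Add-Content -Path "C:\SQLBackupScripts\backup-errors.log" -Value "$(Get-Date) - upload of $zipfile failed" #Hypothetical log location, pick somewhere you will actually check
}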
Save the script as a .ps1 file; we will assume the below location:
C:\SQLBackupScripts\backupSQLtoAWS.ps1
Scheduling the task
Search for Task Scheduler on the machine.
This is where we can set up programs and scripts to run automatically, so let's go through the process.
Select Create Basic Task on the right-hand menu.
Name the task something relevant.
The trigger is what will cause the task to run. Here I have selected a daily run at 9am; 7pm or something similar may be a better fit for your business.
Now we want to run a script using PowerShell; for this, select "Start a program".
The program should be the full path to the PowerShell executable. On 64-bit Windows, use the 64-bit host under System32 (the SysWOW64 path is the 32-bit host, which may not see the SqlServer module). Example:
%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe
The Add arguments box should include the following:
-NoLogo -NonInteractive -ExecutionPolicy Bypass -File "C:\SQLBackupScripts\backupSQLtoAWS.ps1"
Complete the wizard.
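If you would rather script the scheduling as well, the equivalent task can be created in one line with schtasks; the task name and time here are just examples:

schtasks /Create /TN "SQLBackupToAWS" /SC DAILY /ST 09:00 /TR "powershell.exe -NoLogo -NonInteractive -ExecutionPolicy Bypass -File C:\SQLBackupScripts\backupSQLtoAWS.ps1"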
You should now see the task in the Task Scheduler Library; right-click it and click Run to test the job.
This will pop up the PowerShell window and run the script. Once it has completed, check the S3 bucket and confirm there is now a fresh .zip file with today's date.
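You can run the same check from the command line:

aws s3 ls s3://mycloudbackupbucket/backups/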
Done! You now have an automated backup process taking a SQL Server Express backup and putting it in cloud storage.
This process does not have any alerting, so it is critical that there is a process to check that the backups have been successfully taken and moved to the cloud. It also does not clean up old files on premises, which can be done with another PowerShell script - see the sketch below.
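As a starting point, a minimal cleanup sketch; the 14-day retention window is an assumption, so pick whatever suits your recovery needs:

#Remove local backup files older than 14 days (assumed retention window)
Get-ChildItem -Path "C:\SQLBackup\*" -Include *.bak, *.zip |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-14) } |
    Remove-Item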
Costs
It is difficult to estimate exactly what the costs will be, so we have assumed S3 Standard storage and a backup size of 10GB a day. 10GB a day is unlikely in practice: we are applying compression, and 10GB is the maximum size an Express database can grow to.
Making a number of assumptions (no price increases, a 10GB backup each day, and so on), storing 7 years of backups on S3 Standard would cost approximately $640 a year.
We have not taken egress or ingress fees into account; given the amount of data and the frequency of transactions, these costs would be minimal.
If you have any questions on this post, feel free to reach out. The next post will use Microsoft Azure Storage for those who would rather keep their backups in Azure.