Encrypting SQL Server Backups

If a database contains sensitive information, it is possible to encrypt any Full, Differential and Transaction Log backups that are taken of it.  This functionality is included with SQL Server Standard Edition and is not just an Enterprise feature.

There is no need to enable any additional features in SQL Server. Simply create a master key and a certificate within SQL Server, and then specify that the certificate is used when a database backup is taken.

Create Master Key

USE MASTER
GO
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'StrongPassword!!2345'

Create the Certificate

USE MASTER
GO
CREATE CERTIFICATE Backup_Cert_2020 WITH SUBJECT = 'Certificate for encrypting backups'

Use the Certificate to Backup the Database with Encryption

BACKUP DATABASE ProdDB01 TO DISK = 'C:\Backups\ProdDB01.bak'
WITH ENCRYPTION(ALGORITHM = AES_256, SERVER CERTIFICATE = Backup_Cert_2020)

It is also possible to encrypt backups in the same way when the backup is taken through the SQL Server Management Studio backup GUI.

Backup the Certificate

The main issue with encrypted backups is that, without the certificate that was used to encrypt the backup, it is impossible to restore it. The certificate therefore needs to be backed up and stored safely off the server.  Using a password manager such as KeePass is a good option, as it allows you to add attachments, so the passwords and the key files can be stored in the same place.

USE MASTER 
GO
BACKUP CERTIFICATE Backup_Cert_2020 
TO FILE = 'C:\Temp\Backup_Cert_2020.cer'
WITH PRIVATE KEY (
FILE='C:\Temp\Backup_Cert_2020.ppk',
ENCRYPTION BY PASSWORD ='Another strong password!!-23')

The above command creates two files in the C:\Temp directory on the database server and password-protects the private key.

To restore the certificate on another server, you need both files and the password that was used to encrypt the private key.

Once you have securely saved the certificate files and the password, you can optionally remove the files from the database server.

Strictly speaking, the master key password is not required in normal operations, because the master key is also protected by the service master key, but it is recommended that you securely store the password that you encrypted it with. If you restore the master database on another server, you will need this password to regenerate the master key.
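
If you want a copy of the master key itself as well as the password, it can also be backed up to a file in the same way as the certificate. A minimal sketch; the file path and password here are placeholders:

USE MASTER
GO
BACKUP MASTER KEY TO FILE = 'C:\Temp\Master_Key.key'
ENCRYPTION BY PASSWORD = 'Yet another strong password!!-45'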

Restore the Certificate

To restore the encrypted backup on another server, you first need to restore the certificate that was used to encrypt the backup onto the new server.

As before, the first step in this process is to create a master key on the new server. This does not have to use the same password as the original master key; it just needs to exist.

USE MASTER
GO
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'StrongPassword!6789'

Then create the certificate using the two backup files and the password that was used to protect the private key.

USE MASTER
GO
CREATE CERTIFICATE Backup_Cert_2020 FROM FILE ='C:\Temp\Backup_Cert_2020.cer'
WITH PRIVATE KEY(FILE='C:\Temp\Backup_Cert_2020.ppk',
DECRYPTION BY PASSWORD='Another strong password!!-23')

As before, this information would normally be added into KeePass along with the certificate files.

Restore the Database

To restore the database, there is no need to specify any options relating to the certificate; just use the standard restore command.

RESTORE DATABASE ProdDB01 FROM DISK = 'C:\Restores\ProdDB01.bak'

When the backup is restored, SQL Server reads the header of the backup file, which contains the thumbprint of the certificate that was used to encrypt it.

This thumbprint is then checked against the sys.certificates catalog view to see whether a certificate exists that can decrypt the backup file. If there is no match, the restore cannot be carried out.

Restoring a database without the encryption certificate on the server results in the error "Cannot find server certificate with thumbprint".
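
To check this yourself before attempting a restore, you can read the backup header and compare the thumbprint with the certificates held on the server, for example:

-- The EncryptorThumbprint and EncryptorType columns show which certificate was used
RESTORE HEADERONLY FROM DISK = 'C:\Restores\ProdDB01.bak'

-- A certificate with a matching thumbprint must exist here for the restore to work
SELECT name, thumbprint FROM master.sys.certificates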

Certificate Expiry

When a certificate is created, it has a default lifetime of one year.

It is important to note that once the expiry date has been reached, any backup taken using the certificate will fail. It does not simply fail to encrypt; the database backup itself fails.

However, the expiry date does not apply to restores. Backups can still be restored after the certificate's expiry date has been reached.

To specify an expiry date further than one year away, use the EXPIRY_DATE option with a date in the format shown below.

USE MASTER
GO
CREATE CERTIFICATE Backup_Cert_2020 WITH SUBJECT = 'Certificate for encrypting backups', EXPIRY_DATE = '20240101'

There is also a START_DATE option, which defaults to the current date if it is not specified.
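
To check the start and expiry dates of the certificates that already exist on a server, the sys.certificates catalog view can be queried, for example:

USE MASTER
GO
SELECT name, start_date, expiry_date FROM sys.certificates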


U4BW (Agresso) Compress and Archive Logs and Reports over a certain age

In a recent article we covered how to remove old Server Logging and Report Results files.

https://intersect.co.uk/agresso-business-world/agresso-clean-up-of-blobs/

We also published an article on deleting the files from the U4BW Application Server when they are over a particular number of days old.

https://intersect.co.uk/agresso-business-world/agresso-delete-blobs/

This article also shows how to use PowerShell to clean up files over a certain number of days old, but instead of deleting the files it compresses them into monthly archive files.  Log files created in January 2018 will be compressed into a single archive called Log-2018-1, February 2018 into Log-2018-2, and so on.  Report Results files are handled on the same basis but with a Report prefix instead, for example Report-2018-1.

If you are thinking of archiving all of the files from the U4BW Application Server before you purge them from the database, you may find that some older files have already been lost.  The reason is that the U4BW software is not upgraded in place.  If you upgrade from Milestone 5 to Milestone 7, for example, the Milestone 7 installation is a new install and will only contain the Server Logging and Report Results files created under Milestone 7.  Many organisations have upgraded several times, each time onto a new U4BW Application Server, so the only complete set of files is the one held in the database.

The script uses WinRAR to compress the files.  It is possible to use other compression tools, but you will have to change the executable name and syntax in the two lines that do the compression; a sketch using the built-in Compress-Archive cmdlet is shown after the script.

WinRAR does not have to be installed on the U4BW Application Server; you can simply copy the rar.exe console executable from another installation and put it in a directory on the server.  If you do this, check that the $rar variable in the script points to the correct location.

Once a file has been archived, the -dr switch tells WinRAR to delete the original by moving it to the Recycle Bin.

Compressing files is CPU intensive.  This is not usually a problem, but it is better to carry out this routine outside normal business hours as it could affect performance.

Please note, the following code is not supported and should be verified in a non-production environment before running in production.

# Edit these variables
#===================================================

# Location of the Server Logging and Report Results directories
$ServerLogging = 'C:\Program Files\UNIT4 Business World On! (v7)\Data Files\abwt05\Server Logging'
$ReportResults = 'C:\Program Files\UNIT4 Business World On! (v7)\Data Files\abwt05\Report Results'

# Target folder where the archive files should be created
$targetPath = 'C:\Archive'

# How many days of files do you want to keep
$Daysback = 10

# Location of the rar.exe program
$rar = 'C:\Program Files\WinRAR\rar.exe'

#===================================================
# End of editable variables

$CurrentDate = Get-Date
$DatetoArchive = $CurrentDate.AddDays(-$Daysback)

# Process Server Logging - get the files (not folders) older than the cut-off date
$dir = Get-ChildItem $ServerLogging -File | Where-Object {$_.LastWriteTime -lt $DatetoArchive}

foreach ($file in $dir)
{
	$year = $file.LastWriteTime.Year.ToString()
	$month = $file.LastWriteTime.Month.ToString()
	$file = $file.FullName

	# Create the target directory if it does not already exist
	$Directory = $targetPath
	if (!(Test-Path $Directory))
	{
		New-Item $Directory -ItemType Directory
	}

	# Set the archive name, e.g. C:\Archive\Log-2018-1
	$Archive = $Directory + "\" + "Log-" + $year + "-" + $month

	# Add the file to the archive; -dr deletes the original to the Recycle Bin
	& $rar a -dr $Archive $file
}

# Allow the system to catch up for 1 minute
Start-Sleep 60

# Process Report Results - get the files (not folders) older than the cut-off date
$dir = Get-ChildItem $ReportResults -File | Where-Object {$_.LastWriteTime -lt $DatetoArchive}

foreach ($file in $dir)
{
	$year = $file.LastWriteTime.Year.ToString()
	$month = $file.LastWriteTime.Month.ToString()
	$file = $file.FullName

	# Create the target directory if it does not already exist
	$Directory = $targetPath
	if (!(Test-Path $Directory))
	{
		New-Item $Directory -ItemType Directory
	}

	# Set the archive name, e.g. C:\Archive\Report-2018-1
	$Archive = $Directory + "\" + 'Report-' + $year + "-" + $month

	# Add the file to the archive; -dr deletes the original to the Recycle Bin
	& $rar a -dr $Archive $file
}
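
As an example of swapping the compression tool, the WinRAR line in each loop could be replaced with the built-in Compress-Archive cmdlet (available in Windows PowerShell 5.0 and later). This is only a sketch; note that, unlike the -dr switch, it permanently deletes the original file rather than sending it to the Recycle Bin:

	# Add the file to a ZIP archive instead of calling WinRAR
	if (Test-Path ($Archive + ".zip"))
	{
		Compress-Archive -Path $file -DestinationPath ($Archive + ".zip") -Update
	}
	else
	{
		Compress-Archive -Path $file -DestinationPath ($Archive + ".zip")
	}

	# Remove the original file once it has been added to the archive
	Remove-Item $file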

 


U4BW (Agresso) Delete Logs and Reports over a certain age

In a recent article we covered how to remove old Server Logging and Report Results files.

https://intersect.co.uk/agresso-business-world/agresso-clean-up-of-blobs/

Some of the feedback we receive on this subject is that there is uncertainty about whether all of the files should be deleted from both the database and the U4BW Application Server.

However, having several hundred thousand files in the U4BW Server Logging directory can be an issue when you try to look at any of the files, as File Explorer can take considerable time to show and sort the files.

As an alternative to using the built-in U4BW options, you may prefer to leave the files in the database but remove them from the Application Server.  Rather than a global delete, it may be preferable to remove only the files that are over a certain number of days old, 30 days for example.

Although it is possible to do this manually, it is easier to do it with a PowerShell script, which can then be scheduled to run once a month; an example of scheduling it is shown after the script.

Before running the script below, you will need to edit the three variables at the top to tell it which directories the files are located in and how many days of files you want to keep.

Please note, the following code is not supported and should be verified in a non-production environment before running in production.

# Edit these variables
#===================================================

#Location of the Server Logging Directory
$ServerLogging = "C:\Program Files\UNIT4 Business World On! (v7)\Data Files\abwt05\Server Logging"

#Location of the Report Results Directory
$ReportResults = "C:\Program Files\UNIT4 Business World On! (v7)\Data Files\abwt05\Report Results"

#How many days do you want to keep
$Daysback = "30"

#===================================================
# End of editable variables

$CurrentDate = Get-Date
$DatetoDelete = $CurrentDate.AddDays(-$Daysback)

$ServerLoggingBefore = (get-childitem $ServerLogging).Count
$ReportResultsBefore = (get-childitem $ReportResults).Count

$now = Get-Date

"Starting to delete Server Logging " + $now

Get-ChildItem $ServerLogging | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Remove-Item

"Allowing system to catch up for 1 minute..."
Start-Sleep 60

$now = Get-Date

"Starting to delete Report Results " + $now

Get-ChildItem $ReportResults | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Remove-Item

$ServerLoggingAfter = (get-childitem $ServerLogging).Count
$ReportResultsAfter = (get-childitem $ReportResults).Count

$ServerLoggingDeleted = $ServerLoggingBefore - $ServerLoggingAfter
$ReportResultsDeleted = $ReportResultsBefore - $ReportResultsAfter

$ServerLogging
"Files before delete = " + $ServerLoggingBefore
"Files after delete = " + $ServerLoggingAfter
"Number of files deleted = " + $ServerLoggingDeleted

$ReportResults
"Files before delete = " + $ReportResultsBefore
"Files after delete = " + $ReportResultsAfter
"Number of files deleted = " + $ReportResultsDeleted

 


U4BW (Agresso) Clean up of Server Logging & Report Results Files

Each time a report is run within Agresso it produces a log file and a report results file.  These files are stored on the Agresso Application server in the “Server Logging” folder and “Report Results” folder respectively.  They are also stored in the database in a table called ACRPRINTBLOB.  When a report or log file is viewed from “Maintenance of ordered reports” within the Agresso desktop client, the files are retrieved from the database.

The default setting within Agresso is to keep these files permanently.  There are, however, good routines within Agresso that allow you to clear them down.  Before applying any of the following clean-up routines, note that some organisations do want to keep these files indefinitely, for auditing purposes for example.  Even where auditing has been cited as a reason for keeping reports, it is usually only a very small number of report types that fall into this category.

There are two methods within Agresso for implementing the clean-up routines.  The first is configured within the Agresso Management Console and allows you to specify either that you would like to keep files for a certain number of days or that you would like to keep a particular number of each type of report.  These settings apply to both report result files and server logging files.

[Screenshot: Agresso Management Console blob clean-up settings]

The settings shown above keep all report result and server logging files for 15 days.  They also delete the corresponding files from the "Server Logging" and "Report Results" folders on the Agresso Application server.

In the screenshot, the "Mode" option showing "Number of days" is a drop-down list.  As discussed, the other option available is to keep a "Number of Report Orders".

If you find that you have tens of gigabytes of data in the ACRPRINTBLOB table, it may be worth setting the value to a higher figure than 15 days and gradually reducing it to the value you want.  This avoids a DELETE statement removing a huge amount of data in a single transaction, which could cause the transaction log to grow considerably.
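
To see how much space the ACRPRINTBLOB table is currently using, something like the following can be run against the Agresso database (the database name here is a placeholder):

USE AgressoLive
GO
EXEC sp_spaceused 'ACRPRINTBLOB'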

If you would like to keep one or two specific types of report results files, perhaps because they are a point-in-time snapshot of values, it is possible to remove files at a more granular level.  This cannot be done with the above setup; instead, you need to use the AG57 report, which is run from within the Agresso Desktop Client.

This report accepts parameters for particular report types or for specific reports.  So if you have scheduled a PO01 report to run every 5 minutes and would like to clear down all of the PO01 reports, the AG57 can be set up and scheduled to do this.  Alternatively, you could remove all of the PO reports.

The AG57 allows a much more granular approach to removing unwanted files, but it takes longer to set up than the global approach within the AMC, as you need to schedule one report for each set of report modules that you want to remove.

A common question we are asked is: if the ACRPRINTBLOB table is over 50% of the total database size, will clearing it down make the Agresso client run twice as fast?  Unfortunately, the answer is no.  It will, however, dramatically reduce the size of your database backups, and this is particularly true when you are using SQL Server compressed backups.  The image files are already compressed when they are stored in the database, so backup compression has minimal effect on them.

The other advantage of clearing down unwanted report result and log files is that finding log files in the "Server Logging" folder becomes manageable.  Faster database backup and restore times and smaller backup sizes also bring advantages in terms of both cost and manageability.

If you are experiencing problems with massive numbers of files in the "Server Logging" and "Report Results" directories, but are unable to implement either of the above two procedures, then as a workaround we would recommend scheduling a PowerShell script to archive the files from the directories into a ZIP or RAR file.  We have implemented this before, where once a month the job would run and move files older than a month into a compressed file.  So on the 1st of June, the job would move all of the files with a timestamp of April into a single compressed file, and so on.

For further information on using PowerShell to delete or archive files…

https://intersect.co.uk/agresso-business-world/agresso-delete-blobs/

https://intersect.co.uk/agresso-business-world/agresso-archive-blobs/


Agresso support for SQL Server 2017

UNIT4 Business World 2017 (Milestone 7) Update 02 has been released and provides support for Microsoft SQL Server 2017. Just to confirm for anyone thinking of upgrading to SQL Server 2017, SQL Server 2017 (14.x) supports upgrades from the following versions of SQL Server:

  • SQL Server 2008 SP4 or later
  • SQL Server 2008 R2 SP3 or later
  • SQL Server 2012 SP2 or later
  • SQL Server 2014 or later
  • SQL Server 2016 or later

Along with this, there are numerous other fixes and features in Update 02.  Most notable to us is the fix to the apovitransdetail view.  The view definition was changed in Milestone 7 and had a performance impact for customers with large amounts of purchase order data.  We had a fix for this prior to the release, but it is good to see it included in this update.

For further information

https://abwupdates.agresso.com