After the backup I received the following error,
The file was not copied and the service accounts have full permissions to the share.
Comments
6 comments
-
By the looks of the message, the backup was a success, but the copy to the network share failed. I suspect this happened because the network location was unavailable for some reason at the time of the backup.
It could also be that SQL Server doesn't have enough contiguous memory to use as buffer space for a backup over the network. Since SQL Server provides this memory space to SQL Backup, it's possible that, on a heavily used server that has been up for a long time, there is no longer a contiguous block large enough available. The workaround is to manually specify the buffer size in the backup command. This parameter is called MAXDATABLOCK, and it must be a multiple of 65536.
For instance, you could amend your backup command to read:
EXEC master..sqlbackup N'-SQL "BACKUP DATABASE [Database] TO DISK = ''\\backup-server\d$\FULL_Database.sqb'' WITH NAME = ''Database (Database)'', DESCRIPTION = '' '', INIT, MAILTO = ''pgrayson@tpp-uk.com'', COMPRESSION = 1, MAXDATABLOCK = 65536"'
If this doesn't work, you can also add MAXTRANSFERSIZE, which is similar to MAXDATABLOCK but affects memory usage for both local and network backups. MAXTRANSFERSIZE can be used in conjunction with MAXDATABLOCK for handling low-memory situations; see the example below.
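For instance, a command that sets both parameters might read as follows (this reuses the path, database name and mail address from the example above purely for illustration; substitute your own values):
EXEC master..sqlbackup N'-SQL "BACKUP DATABASE [Database] TO DISK = ''\\backup-server\d$\FULL_Database.sqb'' WITH NAME = ''Database (Database)'', DESCRIPTION = '' '', INIT, MAILTO = ''pgrayson@tpp-uk.com'', COMPRESSION = 1, MAXDATABLOCK = 65536, MAXTRANSFERSIZE = 65536"'
-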
So does SQL Backup actually perform another backup when the COPYTO option is used?
We're already using the MAXTRANSFERSIZE = 65536 option.
The network share is available; it is constantly being used for transaction log backups as part of a separate Log Shipping process.
Maybe in this instance the job should complete with a Warning, or some sort of failure.
Thanks
Phill -
SQL Backup does not create another backup when you use the COPYTO option. It just attempts to copy the file over to the specified folder(s).
In this case, it appears to be failing, raising warning code 141. The message 'The operation completed successfully' is wrong, as SQL Backup appears to have retrieved the wrong Windows error code.
Is the COPYTO option failing every time, or just intermittently? Thanks. -
The copy was failing due to lack of space on the target system.
Maybe a feature for the future is to provide options for SQL Backup to clean up the "COPYTO" directory as well as the local directory.
Thanks for your help. -
There is such an option to delete old backup files in the COPYTO folder(s). Use the FILEOPTIONS parameter with a value of 1, e.g.
EXEC master..sqlbackup '-sql "BACKUP DATABASE ... WITH COPYTO = ..., ERASEFILES = 7, FILEOPTIONS = 1" '
This will erase all backup files for the selected instance and database older than 7 days, in the primary backup folder(s) and in the COPYTO folder(s).
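For example, a full command combining these options might look something like this (the local folder D:\Backups and the database name are only illustrative placeholders, and the network path is borrowed from the earlier example):
EXEC master..sqlbackup '-sql "BACKUP DATABASE [Database] TO DISK = ''D:\Backups\FULL_Database.sqb'' WITH COPYTO = ''\\backup-server\d$'', ERASEFILES = 7, FILEOPTIONS = 1, COMPRESSION = 1"'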
See the topic 'The BACKUP Command' in the help file, under 'The SQL Backup Toolkit\Toolkit Syntax' for more details. -
Great. Then how about a change to the command: allow different file ages between the primary and secondary servers.
E.g. only keep the last day on the primary, but keep the last 5 days on the secondary.