Comments
OKAY.
What is the size of the database, and how is the log file defined?
What release of SQL Server are you running?
If SQL2005 or higher, just right-click the database and under Properties/Files you will see the actual sizes and the autogrow settings if you are letting it expand.
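If you would rather query than click, something along these lines should show the same numbers (a sketch; run it inside the database in question, and note sys.database_files only exists on SQL2005 and later):

-- size and growth are stored in 8 KB pages unless is_percent_growth = 1
SELECT name,
       type_desc,
       size * 8 / 1024 AS size_mb,
       CASE WHEN is_percent_growth = 1
            THEN CAST(growth AS varchar(10)) + ' percent'
            ELSE CAST(growth * 8 / 1024 AS varchar(10)) + ' MB'
       END AS autogrow
FROM sys.database_files;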
How often do you run the full and the log backups?
Then I can help you from there.
Chris
Thanks.
Our normal rule of thumb is to size the log file, at initial creation, at 50% of the data. Based on that you are way off here, as the data is over 100 times bigger than the log.
Also, I would suggest changing the growth increment from a percentage to a fixed size in MB, because 10% of 100 is 10, then it becomes 10% of 110, and so on, so each growth is larger than the last.
Based on your log output I would make the log close to 400 MB and grow it in 50 MB chunks.
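Something like this would make that change (a sketch; MyDB and MyDB_log are placeholders, so check sys.database_files for your log file's actual logical name):

ALTER DATABASE [MyDB]
MODIFY FILE (NAME = MyDB_log, SIZE = 400MB, FILEGROWTH = 50MB);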
If I understand your info correctly, you run the log backup every two hours. If so, why not add an alert to fire the log backup if the log fills to over, say, 65% (the threshold we use)?
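The alert itself is just a SQL Agent performance-condition alert pointed at your log backup job. A sketch, assuming a job called 'LogBackup - MyDB' already exists (on a named instance the counter object is MSSQL$InstanceName:Databases instead):

USE msdb;
EXEC dbo.sp_add_alert
    @name = N'MyDB log over 65 percent',
    @performance_condition = N'SQLServer:Databases|Percent Log Used|MyDB|>|65',
    @job_name = N'LogBackup - MyDB';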
HTH
Chris
If your database log is getting too large then you need to do at least two things.
One is to run your log backup more often, as this is the only thing that will take data out of the log.
Two would be to size your log better. It must be too small to hold the transactions that get applied before you run your log backup. We run regular log backups as well as a percent-full alert that triggers the log backup.
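The log backup itself is nothing special; a sketch with placeholder names (and it only applies in the Full or Bulk-Logged recovery model):

BACKUP LOG [MyDB] TO DISK = N'D:\Backups\MyDB_log.trn';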
Chris
Hi,
I am using 6.3.0.48 on an 800 GB SQL2000 DB and only get small differentials unless I have re-indexed some of the tables. I don't remember any difference between 6.2 and 6.3 or 6.4 that would have fixed your problem.
You didn't mention what level of SQL you are using, and if it's SQL2008, is the database compressed?
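If you want to check for compression, something like this lists any compressed objects (a sketch; sys.partitions only has the data_compression column from SQL2008 on):

-- one row per compressed table or index, run inside the database
SELECT DISTINCT OBJECT_NAME(object_id) AS object_name,
       data_compression_desc
FROM sys.partitions
WHERE data_compression <> 0;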
Chris
No, this DB is in the Simple Recovery Model. It's the Microsoft MOM reporting repository DB.
What build of SQL2005 are you running? It has been updated many times with CUs.
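You can check with:

SELECT SERVERPROPERTY('ProductVersion') AS build,
       SERVERPROPERTY('ProductLevel') AS patch_level,
       SERVERPROPERTY('Edition') AS edition;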
Chris / comments
No this DB is in the Simple Recovery Model. It's Microsoft MOM reporting respository DB.
What build are you running of SQL2005? It has been updated many times with CU's.
Chris
Now that you have a way of recovering data from the log just before the full is run, have you looked to see how big your transaction log files are compared to the data in your databases? Are your logs large compared to the actual amount of data, and do your log backups take a long time?
If you have individual log backup jobs rather than one that backs up all your database transaction logs at the same time, you might want to add percent-log-full alerts to fire off extra log backups. If you have just the one job, then you might want to run it more often than once a day, for recovery benefits as well as database size savings. If you have large logs you could have lots of VLFs (virtual log files), which hinder recovery.
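A quick way to count VLFs, run in the database you want to check (DBCC LOGINFO is undocumented but widely used; it returns one row per VLF):

DBCC LOGINFO;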
Chris
Why do you take so few backups if you want point in time recovery?
You need to create a better backup strategy. If your database is very large then you can take differential backups during the week. If you have very little in the way of updates, then taking more log backups will not hurt, as they would be small and take little time.
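A weekday differential is just the full backup statement with one extra option (a sketch with placeholder names; it needs a full backup to already exist as its base):

BACKUP DATABASE [MyDB] TO DISK = N'D:\Backups\MyDB_diff.bak' WITH DIFFERENTIAL;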
Chris
If your databases and disks are okay you can always take a special log backup, after your normal one and before your full backup. If you have corruption, though, you may not be so lucky. Maybe you should schedule a log backup just before your full backup so that, if it runs, you can use it for point-in-time recovery.
Again I ask: why so few backups if you are expecting to restore to a point in time? Even though you may have many small DBs, you run the risk of losing many hours or days of data if something goes wrong. Are you prepared for that? Also, if you have some large updates, then your logs are probably sized much larger than they need to be, as you are rarely offloading them. If you are on SQL2005 and beyond, only a log backup will offload the committed transactions. Sounds like you should have tuned in to Paul Randal's backup discussion at PASS two days ago.
Our scheme is a full backup every day and hourly log backups, with alerts to fire a log backup if the log gets more than 65% full before the scheduled log backup runs.
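The pre-full log backup I mean is just an ordinary one scheduled right before the full, so the log chain runs right up to it (a sketch; names and paths are placeholders):

BACKUP LOG [MyDB] TO DISK = N'D:\Backups\MyDB_log_prefull.trn';
BACKUP DATABASE [MyDB] TO DISK = N'D:\Backups\MyDB_full.bak';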
Chris
We don't run our jobs that way. We run EXECUTE master..sqlbackup N'-SQL "BACKUP DATABASE ..."' with a whole lot of parameters. I have never used sqbutility, but from your output you are definitely not looking to delete any old backups, either from the disk or from msdb. You might have a very swollen msdb if you aren't getting the job history entries cleaned up too.
Maybe someone from Red Gate will tell you how to code sqbutility to make it delete old files.
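In the meantime, if you are using the -SQL command style like we do, I believe SQL Backup has an ERASEFILES keyword that deletes backups older than a given number of days as part of the backup itself. A sketch from memory (check the SQL Backup docs; names and paths are placeholders):

EXECUTE master..sqlbackup N'-SQL "BACKUP DATABASE [MyDB] TO DISK = ''D:\Backups\<AUTO>.sqb'' WITH ERASEFILES = 7"';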
Chris
As you are running 6.4, you should be able to open the GUI on the server, right-click on the log, and choose Show Log. You should then see the parameters used, followed by the summary of the backup, followed by any attempted deletes.
HTH
Chris