How can we help you today?
joeWI

Activity overview

Latest activity by joeWI

Thanks for the quick replies.

Brian: Per your suggestion, I ran the script inside Data Compare. It runs in a little over 1 minute. (Show off!) We do have a database utility that backs up databases, applies upgrades, etc. When the script was running so long, I used Query Analyzer to check whether our utility or SQL Server itself was the performance bottleneck.

Our workaround is to read the data script line by line. Once we have accumulated 10 lines, we start looking for a logical place to break the script and submit what has been accumulated to SQL Server, feeding it the data script in small pieces. Because a SQL statement can span multiple lines, we could not use a simple line count. (We have data that contains carriage returns. :roll:) So once our threshold is met (i.e. 10 lines), as soon as we hit a line that begins with "INSERT INTO [dbo].[", "DELETE FROM [dbo].[", or "ALTER TABLE [dbo].[", we break and submit everything previously accumulated to SQL Server. (We ignored UPDATE statements, as those are rarer.)

So far, it seems to be working: the script now takes about a minute through our utility as well, and I can't think of a scenario where this rule would fail in a data script.

James: Thanks for the suggestion. I had already tried that, without any significant difference.
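The batching rule described above can be sketched roughly as follows. This is a minimal illustration, not the poster's actual utility code; the function name, the threshold of 10 lines, and the treatment of each line as a string are assumptions.

```python
# Hypothetical sketch of the batching workaround: accumulate script lines,
# and once the threshold is reached, break only at a line that clearly
# starts a new statement, so multi-line statements are never split.

BATCH_THRESHOLD = 10  # assumed from the post: "once we have accumulated 10 lines"
BREAK_PREFIXES = (
    "INSERT INTO [dbo].[",
    "DELETE FROM [dbo].[",
    "ALTER TABLE [dbo].[",
)

def split_into_batches(lines):
    """Yield lists of script lines; each list is one batch to submit to SQL Server."""
    batch = []
    for line in lines:
        # Only break once the threshold is met AND the line begins a new
        # statement; continuation lines (e.g. data containing carriage
        # returns) simply extend the current batch.
        if len(batch) >= BATCH_THRESHOLD and line.startswith(BREAK_PREFIXES):
            yield batch
            batch = []
        batch.append(line)
    if batch:
        yield batch

# Example: 25 one-line INSERT statements split into batches of 10, 10, and 5.
script = [f"INSERT INTO [dbo].[T] VALUES ({i})" for i in range(25)]
sizes = [len(b) for b in split_into_batches(script)]
```

Note the design point from the post: a continuation line after the threshold does not trigger a break, so a batch can grow past 10 lines until the next recognized statement start.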
Large data differences produce a long running script
I have two databases that have approximately 115,000 data differences (adds, updates and deletes) that are spread across 300 tables. If I use SQL Data Compare, the data script contains one SQL batc...
3 followers 5 comments 0 votes