How can we help you today?
rgribble

Activity overview

Latest activity by rgribble

High level data comparison
I am looking for a way to compare 2 copies of a database that are supposed to be "in sync" and get a high level Yes/No status against each table, whether it is identical in each database or not. In...
1 follower 1 comment 0 votes
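One lightweight way to get the kind of per-table Yes/No status asked about above is to aggregate a row checksum over each table in both databases and compare the results. This is only a sketch, assuming both copies are reachable from one connection (same server or a linked server); the database names and the Customers table are placeholders, and BINARY_CHECKSUM can produce false matches on float/NULL-heavy data, so treat "Yes" as a first pass rather than proof:

```sql
-- Hypothetical sketch: compare one table across two copies of a database
-- by aggregating a per-row checksum. DatabaseA/DatabaseB/Customers are
-- placeholder names.
DECLARE @a INT, @b INT;

SELECT @a = CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM DatabaseA.dbo.Customers;
SELECT @b = CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM DatabaseB.dbo.Customers;

SELECT 'Customers' AS TableName,
       CASE WHEN @a = @b THEN 'Yes' ELSE 'No' END AS InSync;
```

A real script would loop over sys.tables (or INFORMATION_SCHEMA.TABLES) and emit one row per table; Red Gate's SQL Data Compare covers the same ground more robustly.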
A pop-up or alert window that steals focus is probably too intrusive, given that there are several valid situations where this will happen, as you point out, Michelle. But perhaps a pop-up bar like the Interactive Help bar, or some other type of non-focus-stealing alert/notification, could appear if any scripts had problems being read by the parser, and the user could click it to review the problems that occurred.

It would also be good to include a severity level, e.g. WARNING if some parts of the file couldn't be parsed but some valid table/procedure creation code was found and loaded, versus ERROR if the entire file was ignored because nothing could be made sense of. If possible, output from the parser could be shown too, indicating the problem... in the OP's example, something like "Invalid SELECT list on line xyz" would immediately pinpoint the problem. A way to view or jump to the offending section in the script file would be helpful too.

That way, after running a compare, the user would see the error bar notification, be able to click it for more details if they want, then decide whether the compare was good or not. In the above case, the ERROR and completely ignored file would be identified and corrected, and the project recompared. In a case where it was some database creation code, the user might decide that doesn't bother them and continue with their work. / comments
0 votes
Jumping in on an old post here, but I thought I would let people know that this is indeed the problem: occasionally you do find that the dt_* objects are not correctly tagged as system objects in a particular database. We had this happen a few times, and although we have never identified what actually causes it, we have the following script which will fix things up so these objects are once again tagged as system:

exec sp_MS_marksystemobject 'dtproperties'
exec sp_MS_marksystemobject 'dt_addtosourcecontrol'
exec sp_MS_marksystemobject 'dt_addtosourcecontrol_u'
exec sp_MS_marksystemobject 'dt_adduserobject'
exec sp_MS_marksystemobject 'dt_adduserobject_vcs'
exec sp_MS_marksystemobject 'dt_checkinobject'
exec sp_MS_marksystemobject 'dt_checkinobject_u'
exec sp_MS_marksystemobject 'dt_checkoutobject'
exec sp_MS_marksystemobject 'dt_checkoutobject_u'
exec sp_MS_marksystemobject 'dt_displayoaerror'
exec sp_MS_marksystemobject 'dt_displayoaerror_u'
exec sp_MS_marksystemobject 'dt_droppropertiesbyid'
exec sp_MS_marksystemobject 'dt_dropuserobjectbyid'
exec sp_MS_marksystemobject 'dt_generateansiname'
exec sp_MS_marksystemobject 'dt_getobjwithprop'
exec sp_MS_marksystemobject 'dt_getobjwithprop_u'
exec sp_MS_marksystemobject 'dt_getpropertiesbyid'
exec sp_MS_marksystemobject 'dt_getpropertiesbyid_u'
exec sp_MS_marksystemobject 'dt_getpropertiesbyid_vcs'
exec sp_MS_marksystemobject 'dt_getpropertiesbyid_vcs_u'
exec sp_MS_marksystemobject 'dt_isundersourcecontrol'
exec sp_MS_marksystemobject 'dt_isundersourcecontrol_u'
exec sp_MS_marksystemobject 'dt_removefromsourcecontrol'
exec sp_MS_marksystemobject 'dt_setpropertybyid'
exec sp_MS_marksystemobject 'dt_setpropertybyid_u'
exec sp_MS_marksystemobject 'dt_validateloginparams'
exec sp_MS_marksystemobject 'dt_validateloginparams_u'
exec sp_MS_marksystemobject 'dt_vcsenabled'
exec sp_MS_marksystemobject 'dt_verstamp006'
exec sp_MS_marksystemobject 'dt_verstamp007'
exec sp_MS_marksystemobject 'dt_whocheckedout'
exec sp_MS_marksystemobject 'dt_whocheckedout_u'

Hope it helps / comments
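For anyone who wants to check whether they are affected before running a fix-up script like the one above, a quick diagnostic is to list the dt_* objects that are not flagged as Microsoft-shipped. This is a sketch using the SQL 2000-era sysobjects catalog that these objects date from:

```sql
-- List dt_* source-control objects that are NOT marked as system objects.
-- Any rows returned are candidates for sp_MS_marksystemobject.
SELECT name
FROM   sysobjects
WHERE  (name LIKE 'dt[_]%' OR name = 'dtproperties')
  AND  OBJECTPROPERTY(id, 'IsMSShipped') = 0;
```

An empty result set means the objects are already tagged correctly and no fix-up is needed.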
0 votes
I too currently have drop scripts in all of our source-controlled SQL scripts, mainly because my homegrown build process runs all of these files against empty staging databases before using RedGate to snapshot the schema and eventually roll the schema changes out to target test/production servers. The reason for the drop scripts is that my simple build process uses a brute-force approach: I run a script, and if it fails I add it to the end of the list and rerun it later... the drops ensure that the scripts don't fail if an item's creation succeeded but an ALTER statement (primary key, permission etc.) was the reason for the initial failure.

I have recently enhanced my build process to use RedGate SQL Compare (driving the toolkit through C#) to load up the script folder on the left side, compare it with the empty staging DB on the right side, and synchronise all changes into the staging DB. In this way I can "build" the database schema and no longer need to worry about what order to do it in.

From doing this, I have noticed that RedGate is actually unfazed by the DROP statements in the scripts; they are still loaded up fine and RedGate understands the schema objects that should be present. So one suggestion to let you do what you want (check up on developers), without having to wait for a possible future SQL Compare that can generate drop statements, is simply to do a RedGate comparison of the script folder from source control (complete with DROP statements) against the development database, which will identify any changes in the DB that the developers have failed to script out and check in. This doesn't help with generating scripts if you do want drops in them, but it does help if you want to check the development database and try to catch uncommitted changes.

Incidentally, you could actually do away with the drop statements in scripts by having developers use SQL Compare to compare the scripts from source control to their development database, and then synchronise changes into the live DB, thus getting "the latest" version of a table (just as they currently do by running the scripts directly) with the benefit of not losing their test data as a DROP/CREATE would, because RedGate will do an ALTER where possible.

However, I still add my support to this request to have RedGate (optionally) add drop statements to scripts, for those that want/need it... it would still be a useful feature, so I hope they do implement it / comments
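For those without the Toolkit, the script-folder-versus-database comparison described above can in principle be driven from the SQL Compare Pro command line. Switch names vary between versions and the paths/server names here are hypothetical, so treat this as an illustrative sketch to check against your version's documentation rather than a verified invocation:

```
rem Sketch: compare the checked-out script folder against the dev database
rem and write a report of the differences (i.e. uncommitted changes).
sqlcompare /Scripts1:"C:\Source\DatabaseScripts" /Server2:DevServer /Database2:DevDB /Report:"Differences.html"
```

In a build script, the tool's exit code can then be used to flag that developers have made changes they haven't scripted out and checked in.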
0 votes
3.5 RC Unhandled Exception
I opened Query Analyzer and connected to a server. I had set this server's "master" database in the "Connection to Ignore" under SQLPrompt Options Once Query Analyzer was up (and still pointing at ...
1 follower 1 comment 0 votes
Script folder only available in Pro edition
Over the last few months while many of us were Beta testing SQL Compare 6 there was never a mention of the fact that in the final version, the script folder source is only available in the Pro edit...
1 follower 1 comment 0 votes
I think there may be some confusion over terms here... I have noticed a lot of people say "check out" when they aren't actually checking out a file to edit it... they really should say "sync" or "get".

The idea of using SQL Compare 6's new script folder abilities to "build" a database is one I am looking forward to as well, like Granted. I have an in-house build routine which essentially takes all of the database scripts and "builds" them into an empty staging database, which can then be snapshotted and/or used as a source to synchronise schema changes to a live DB. Currently my DB build is pretty dumb... it just uses a queued approach and hammers scripts at the staging database; if one fails it gets requeued at the end and tried again later. It works for the most part but is ugly and inefficient, and I didn't want to get into the nitty-gritty of actually interpreting the script files and figuring out some kind of dependency order. I plan to change my build routine to create the empty database, then use SQL Compare 6 to compare the script folder source to the empty database and synchronise all changes across. I can let RedGate worry about the dependencies and the order in which the objects need to be created!

We use the SQL Compare Toolkit edition (C# libraries to use in our .NET apps) to do this kind of thing here, but I believe the "Pro" edition of SQL Compare supports command-line invocation as well, so that may be what you can do, Granted.

I agree with David: in terms of building a database from the scripts, no integration with source control on the part of RedGate itself is strictly necessary... most of us probably have existing build scripts or visual build applications like Visual Build Pro, FinalBuilder etc. anyway, and these can handle syncing to the desired label/revision of the scripts and creating the empty database before invoking RedGate to do the actual "build".

Then presumably there are cleanup steps afterwards that our build script will do too. In my case this is comparing the built database to the snapshot stored in Perforce, and if it is different, checking out the snapshot and generating a new one ready to check back in. These tasks are accomplished by an application I wrote, which uses the SQLToolKit libraries / comments
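The snapshot-versus-built-database cleanup step described above can also be sketched at the command line, for those driving SQL Compare Pro from a build script instead of the Toolkit. The switch names and paths are from memory of that era's tooling and should be verified against your version's documentation:

```
rem Sketch: compare the freshly built staging DB against the schema snapshot
rem held in source control; differences mean a new snapshot should be
rem generated and checked in.
sqlcompare /Server1:BuildServer /Database1:StagingDB /Snapshot2:"C:\perforce\db\Schema.snp"
```

A non-zero exit code conventionally signals that differences were found, which the build script can use to trigger the check-out/regenerate/check-in step.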
0 votes