Comments
Fair point, I suppose my instructions for option 1 should have read:

UNLINK/RE-LINK STATIC DATA
1. Unlink static data.
2. Deploy the new column as NULLABLE all the way up to prod ****AND TO EVERY DEV WORKSTATION****.
3. INSERT the static data manually all the way to prod ****AND TO EVERY DEV WORKSTATION****, or re-link the static data to source control and then deploy all the way to prod ****AND TO EVERY DEV WORKSTATION****.
4. Add the NOT NULL constraint, commit to source control, and deploy all the way to prod ****AND TO EVERY DEV WORKSTATION****.
5. Re-link static data (if not already done).

In retrospect, perhaps the following is a better solution all round:

PRE/POST DEPLOYMENT SCRIPTS (v2)
1. Unlink the static data table.
2. Add a pre-deploy script (sketched after this comment) to:
   a. Check whether the target table is in the "before" state and already holds data.
   b. If so, create a new table called OriginalTableName_Temp.
   c. Copy all data to the new temp table.
   d. Truncate the original table.
3. Add a post-deploy script to:
   a. Check whether OriginalTableName_Temp exists.
   b. If so, copy all data, including the new default data for the new NOT NULL column, back into the original table (by the time this script runs, the new column should exist).
   c. Delete OriginalTableName_Temp.
4. Commit your new pre- and post-deploy scripts, along with your new NOT NULL column, as a single commit.
5. Deploy this change to all environments, including prod and all dev workstations.
6. Re-link static data.

* For now, you probably need to manually patch all the other dev workstations.

Sorry I forgot to include dev workstations in my original answer. Forgive me, I'm only a fallible human. :-)
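For what it's worth, here is a minimal sketch of what those pre- and post-deploy scripts might look like. The table, column, and default value (dbo.MyStaticTable, NewCol, 'default') are hypothetical placeholders, and the "before state" check is simplified to a column-existence test - adapt it to your real schema.

-- Pre-deploy: if the new column doesn't exist yet and the table holds data,
-- park the rows in a temp table and empty the original.
IF COL_LENGTH('dbo.MyStaticTable', 'NewCol') IS NULL
   AND EXISTS (SELECT 1 FROM dbo.MyStaticTable)
BEGIN
    SELECT * INTO dbo.MyStaticTable_Temp FROM dbo.MyStaticTable;
    TRUNCATE TABLE dbo.MyStaticTable;
END;
GO

-- Post-deploy: by now the NOT NULL column exists, so copy the parked rows
-- back, supplying a value for the new column, then clean up.
IF OBJECT_ID('dbo.MyStaticTable_Temp', 'U') IS NOT NULL
BEGIN
    INSERT INTO dbo.MyStaticTable (Id, Name, NewCol)
    SELECT Id, Name, 'default'   -- hypothetical columns and default value
    FROM dbo.MyStaticTable_Temp;

    DROP TABLE dbo.MyStaticTable_Temp;
END;
GO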
Alessandro is spot on. Essentially you need to balance the need to manage all DBs as a single unit (because dependencies) against the need for agility (because if every DB is managed independently you don't need to get everyone to agree to release everything at once). In the centralised source control (TFVC/SVN etc.) world folks tended to have bigger repos, but in the distributed (git) world folks tend toward smaller, isolated, dare I say, "microservices". Basically, if you plan to release updates for different DBs independently, you should probably have a repo per DB... unless that's totally impractical because dependencies. And if that's the case you will probably need to group DBs together, but it's also a good sign that you should try to remove some of those dependencies.
Oof, that sounds fun. Good luck!
You are right, that is nuts.

EITHER: One DLM Dashboard install per team, with the team responsible for maintaining it - but that relies on each individual team having a neat division of responsibilities and not needing to look after more than 50 databases.

OR: Set up an automated process to check for drift before deployment: https://documentation.red-gate.com/sca3/automating-database-changes/automated-deployments/handling-schema-drift

OR: Build your own thing with DDL triggers (see the sketch after this comment): https://www.red-gate.com/simple-talk/sql/database-administration/database-deployment-the-bits-database-version-drift/

OR: Do several of those.

But 13 instances of DLM Dashboard really is nuts if there are people who will need to look at many different instances.
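If you go the trigger route, here's a rough sketch of the idea from that Simple Talk article - a database-level DDL trigger that logs every schema change to an audit table you can check before deploying. Table and trigger names are hypothetical; the article covers a much more complete version:

-- Hypothetical audit table for recording schema changes
CREATE TABLE dbo.SchemaChangeLog (
    LogId      INT IDENTITY(1,1) PRIMARY KEY,
    EventTime  DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME(),
    LoginName  SYSNAME       NOT NULL,
    EventType  NVARCHAR(128) NOT NULL,
    ObjectName NVARCHAR(256) NULL,
    TsqlText   NVARCHAR(MAX) NULL
);
GO

-- DDL trigger: records who changed what, and the exact DDL that ran
CREATE TRIGGER trg_LogSchemaDrift
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    DECLARE @e XML = EVENTDATA();
    INSERT INTO dbo.SchemaChangeLog (LoginName, EventType, ObjectName, TsqlText)
    VALUES (
        ORIGINAL_LOGIN(),
        @e.value('(/EVENT_INSTANCE/EventType)[1]', 'NVARCHAR(128)'),
        @e.value('(/EVENT_INSTANCE/ObjectName)[1]', 'NVARCHAR(256)'),
        @e.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'NVARCHAR(MAX)')
    );
END;
GO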
Hi Annette,

Off the top of my head, try:

$project = "C:\Work\scripts"
$targetDb = New-DatabaseConnection -ServerInstance "test01\sql2014" -Database "Test"
$options = "ignoreadditional"
Sync-DatabaseSchema -Source $project -Target $targetDb -SQLCompareOptions $options

See example 6 here: https://documentation.red-gate.com/sca3/reference/powershell-cmdlets/sync-databaseschema

And further docs here: https://documentation.red-gate.com/sca3/automating-database-changes/automated-deployments/using-sql-compare-options-with-sql-change-automation-powershell-module
Not sure why that's not working - one for support, I guess. In the meantime, reverting to the raw SQL Compare command line is my de facto quick fix where possible (rough example below).
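For reference, a raw SQL Compare command line roughly equivalent to the PowerShell above might look something like this - the paths and server names are the same hypothetical ones as before, and you should check the SQL Compare command line documentation for the exact switches in your version:

sqlcompare /Scripts1:"C:\Work\scripts" /Server2:"test01\sql2014" /Database2:"Test" /Options:IgnoreAdditional /Synchronize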
The way it works behind the scenes is documented here: https://documentation.red-gate.com/soc6/reference-information/how-sql-source-control-works-behind-the-scenes

If this was a fresh link to source control, I would normally expect it to come up as a conflict, since SQL Source Control does not know whether the source control version or the database version is the more up to date.
You've already found the page to download old versions, but you are right - DLM Automation v2 doesn't seem to be there. If you want the latest version you should instead be looking for SQL Change Automation. It was renamed a few months ago. More info here: https://documentation.red-gate.com/display/SCA3/DLM+Automation+licenses+and+SQL+Change+Automation
This is typically a much bigger problem than just buying a tool - it's how you use it.

Yes, in some cases you need to write the scripts to deliberately work around specific scenarios. This could be handled either with pre- and post-deploy scripts or by switching to a migration-based source control solution, such as SQL Change Automation. (I mean by using the VS extension that was formerly called ReadyRoll INSTEAD of SQL Source Control. You will still use SQL Change Automation for deployment in either case. Redgate recently renamed stuff and now it's way harder to explain this.)

Generally, however, downtime is caused by dependencies between different systems that need to be upgraded at the same time, where the system cannot run while some parts have been deployed and others have not. In that case you should look at the 'expand-contract' pattern (sketched after this comment) and consider the wider-reaching implications for the way your teams work together and co-ordinate your deployments.

You may find these resources valuable:
https://martinfowler.com/bliki/ParallelChange.html
https://medium.com/continuousdelivery/expand-contract-pattern-and-continuous-delivery-of-databases-4cfa00c23d2e

Apologies in advance - this isn't easy.
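To give a feel for expand-contract (aka parallel change), here's a minimal sketch for replacing two columns with one, using hypothetical table and column names. The key point is that each step ships as a separate deployment, so old and new application versions can run side by side and nothing needs downtime:

-- Deployment 1 (EXPAND): add the new column alongside the old ones.
-- Old app versions keep working; new versions write to both.
ALTER TABLE dbo.Customer ADD FullName NVARCHAR(200) NULL;
GO

-- Backfill existing rows while both shapes coexist
UPDATE dbo.Customer
SET FullName = FirstName + ' ' + LastName
WHERE FullName IS NULL;
GO

-- Deployment 2 (CONTRACT): once every caller reads and writes FullName only,
-- drop the old columns in a later, separate release.
ALTER TABLE dbo.Customer DROP COLUMN FirstName, LastName;
GO

In practice the backfill would live in deployment 1's post-deploy script, or in a migrations tool the expand and contract steps would simply be separate migration scripts released some time apart.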