Comments
Sync-DlmDatabaseSchema uses SQL Compare under the hood. SQL Compare wraps the entire update into a single transaction. If there are errors at run time the transaction is rolled back. That is why your changes are not being deployed.
This is by design - it is to protect the user from ending up in a position where only half the changes have been deployed and they need to unpick the mess.
In order to solve your problem I propose two solutions. The first is quicker to implement; the second is more effort but safer.
Option 1: Ignore transactions
Of course, you then lose the benefit of transactions, but this should be a quick and easy way to force the behaviour you want.
SQL Compare has various options that you can set, one of which is NoTransactions (or nt). To run Sync-DlmDatabaseSchema with this option you could write a script that looks something like this:
$options = "NoTransactions"
Sync-DlmDatabaseSchema -Source $someScriptsFolder -Target $someDlmDatabaseConnection -SQLCompareOptions $options
Option 2: Filter files
If you want to deploy only specific objects while keeping the safety of transactions, you should use filters to exclude objects either from source control or from your deployment. You can filter objects out of source control using the SQL Source Control GUI. To filter objects out at deployment time you can use -FilterPath to reference a .scpf file. The command is documented here: https://documentation.red-gate.com/display/DLMA2/Sync-DlmDatabaseSchema
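For example, something like this (the .scpf path here is just a placeholder - point it at your real filter file):

```powershell
# Hypothetical filter file path - replace with your own .scpf
$filter = "C:\Filters\DeployOnlyTheseObjects.scpf"
Sync-DlmDatabaseSchema -Source $someScriptsFolder -Target $someDlmDatabaseConnection -FilterPath $filter
```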
Creating a .scpf file is documented here: https://documentation.red-gate.com/display/SC12/Using+filters
Hi Anant,
Did user A and B commit their changes yet or are we talking about uncommitted changes?
The history is read from the source control system - in your case, TFS. If user A and user B have not yet committed their changes, you will just see the latest TFS version versus the latest database version, with the most recent author's username in the commit tab. (This information is retrieved from the default trace.)
If both users had committed to source control I would have expected both changes to show up in the history tab. If this is not happening, you should probably speak to the dev team. (Mike U :-)
Of course, this does mean there is a possibility that one user might overwrite another person's code if the first developer had not yet committed to source control. If this concerns you, take a look at the locking feature: https://documentation.red-gate.com/display/SOC5/Lock+an+object
However, a better fix would be to switch to using the dedicated model if possible.
OK - sounds like the product either isn't configured correctly or isn't working properly. I'll let you figure that out with the folks from Redgate.
However, I would pick up on your point about people changing the same objects at the same time. That's often exactly why people like to use the dedicated model. The dedicated model handles conflicts much more elegantly and avoids the risk that one developer will accidentally overwrite the other developer's code.
Imagine the database was C# source code. The dedicated model is a bit like everyone having their own checkout or local repo, which is the conventional way of working. In contrast, the shared model is like sticking your source code on a file share and letting everyone edit it at the same time.
You can use SQL Compare Pro to compare your database to a set of folders and to update them. You can set scripts folder as either the source or target. https://documentation.red-gate.com/display/SC12/Working+with+scripts+folders
Are you using TFGit or TFVC? If you are using TFGit you can just use SQL Compare to update your local repo based on the DB version and then commit/push. That should sort out your sync issues.
If using TFVC I cannot remember off the top of my head where the Git checkout gets mapped to and whether this will work. David or one of the dev team can probably help.
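To illustrate the TFGit route, the second half of the workflow might look like this sketch (the repo path is hypothetical, and the push assumes the clone already tracks your TFS remote):

```powershell
# Hypothetical path to the local TFGit clone that SQL Compare just updated
Set-Location "C:\src\MyDatabaseRepo"
git add .                                            # stage the script changes SQL Compare made
git commit -m "Update schema scripts from DEV database"
git push                                             # send the commit up to TFS
```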
*Note, for my suggestion with Git it's probably simplest to commit/push with whatever git client you normally use, whether that's Git Bash, SourceTree or something else.
What source control system are you using? Can I assume you were using a git repo on the local machine (and that local repo was lost)?
Is there a remote repo involved somewhere? On a server somewhere or in the cloud perhaps?
Some people do. Works fairly well. But have you looked at SQL Clone?
Lolz - looks like Rob C and I are on the same page. :-)
@JDenham: Assuming you created your package from scripts maintained by SQL Source Control, the package contains the desired state/model, and the deployment, under the hood, will use SQL Compare to get the DB into that desired state. The NuGet packages do not contain your updates/migrations; these are worked out for you by the software.
Hence, as long as the database exists with the correct name, you should not need any old packages.
I like to stick an IF NOT EXISTS CREATE DATABASE script in source control for each of my databases (for each environment). This way I can source control the way I set up stuff like file groups and possibly security at a per database level. This is useful because some of that stuff is not included by default with the Redgate tools or people tend to filter it out. (Who has the same users on DEV and PROD?)
Note: add this create script in a parent or sister directory to your SQL Source Control directory. I try to avoid adding my own scripts to the directory you give to Redgate, because they can confuse the Redgate tools, which are designed to parse all the SQL scripts in that directory.
Then I add a simple step in Octopus Deploy to run my IF NOT EXISTS CREATE DATABASE script before the "Redgate Deploy from Package" step, so you know your DB will always exist. There is an Octopus Deploy step template to run a SQL script here: https://library.octopusdeploy.com/step-templates/73f89638-51d1-4fbb-b68f-b71ba9e86720/actiontemplate-sql-execute-script
This step template asks you to type the script into the Octopus GUI, but it does also accept variable substitution, so you could package up your SQL scripts into a different NuGet package and reference them from that variable.
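The guard script itself can be a minimal sketch along these lines. The database name, filegroup and file paths below are placeholders - tailor them per environment:

```sql
-- Hypothetical names and paths; adjust per environment
IF NOT EXISTS (SELECT 1 FROM sys.databases WHERE name = N'MyAppDb')
BEGIN
    CREATE DATABASE MyAppDb
    ON PRIMARY
        (NAME = N'MyAppDb_data',    FILENAME = N'D:\Data\MyAppDb.mdf'),
    FILEGROUP Archive
        (NAME = N'MyAppDb_archive', FILENAME = N'D:\Data\MyAppDb_archive.ndf')
    LOG ON
        (NAME = N'MyAppDb_log',     FILENAME = N'E:\Logs\MyAppDb.ldf');
END
```

Because the whole thing is guarded by IF NOT EXISTS, it is safe to run on every deployment.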
On a similar note, I'd like an option to only produce a changes.html file if there are changes.
Use case:
My customer has about 20 DBs. All tightly coupled so deployed within the same uber Octopus project. 20 changes.html files is annoying if only 2 or 3 DBs have been updated. I want to be able to only upload the relevant changes.html files as artifacts.
I'm guessing the answer is the same - parse the JSON - but a simpler solution would be appreciated. :-)
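For what it's worth, the JSON-parsing approach could be as small as the sketch below. The report file name and its shape (a top-level Changes array) are assumptions for illustration - check the JSON your deployment step actually emits before relying on them:

```powershell
# Hypothetical report path and shape - verify against your real output
$report = Get-Content "Changes.json" -Raw | ConvertFrom-Json
if ($report.Changes.Count -gt 0) {
    # Only upload changes.html as an Octopus artifact when something changed
    New-OctopusArtifact -Path "changes.html"
}
```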