Comments
Generally you want to install it on a separate machine that can talk to the SQL Server. You don't want to let DLM Automation hog the prod SQL Server's CPU and memory. Also, you can potentially configure just one deployment machine for multiple targets. Might help with all the firewall-foo.
The release isn't just the update.sql file. It's a collection of objects including a copy of the source and target states, a diff report and an XML report of the warnings. These other objects are used for various purposes, such as drift detection and reporting for approval processes. Drop the "\update.sql" from the end and it will probably start working.
Ah yes, that's also a good solution - and aligns better with Octopus paradigms. It just means you have a lot more clicking to do in the Octo config - but your way will get you more of the Octopus goodness, so it's probably the more natural solution. Good luck!
No probs. Good luck. Reach out if you want some help. :-)
How's your PowerShell? The step templates don't allow for it, but the raw PowerShell does. https://documentation.red-gate.com/dlma2/cmdlet-reference/use-dlmdatabaserelease You'll want to take an input variable containing a list of target DB names, and then create a DlmDatabaseConnection object for each target DB. Then you'll use Use-DlmDatabaseRelease to deploy your release to each DlmDatabaseConnection object in the list. Let me know if you have any Qs or want some help.
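Here's a rough sketch of what I mean, assuming the cmdlet names from the linked reference. The variable names ($release, the 'TargetDatabases' Octopus variable, the 'prod-sql01' instance) are just illustrative - double-check the parameters against the cmdlet docs for your DLM Automation version:

```powershell
# Hypothetical sketch: deploy one imported release to multiple target databases.
# Assumes the DLM Automation module is loaded and $release was created earlier
# (e.g. via Import-DlmDatabaseRelease). Names below are placeholders.

# Comma-separated list of target DB names, e.g. "CustomersDB,OrdersDB"
$targetNames = $OctopusParameters['TargetDatabases'] -split ','

# Build a connection object per target database
$connections = foreach ($dbName in $targetNames) {
    New-DlmDatabaseConnection -ServerInstance 'prod-sql01' -Database $dbName.Trim()
}

# Deploy the same release to each target in turn
foreach ($connection in $connections) {
    Use-DlmDatabaseRelease $release -DeployTo $connection
}
```

Because the release object was generated against a specific target state, you'd normally only fan out like this when the targets are known to be in the same state (drift detection will complain otherwise).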
Not sure if it makes you feel more comfortable, but the transaction handling is managed by the open source tSQLt layer, not the Redgate layer. So you can fork it and/or contribute your own patches if you prefer. 😉
Forgive me if I'm missing the point, but doesn't tSQLt handle this for you out of the box? You shouldn't need to handle transactions within the test sproc itself, because tSQLt.Run will roll back the transaction after executing the test anyway. That's why stuff like FakeTable is safe.
Glad you figured it out. That sounds like a good plan!
Yes - that could well be the issue. As a rule of thumb, if a static data table is over 1,000 rows, expect an impact on performance. If the table is an order of magnitude bigger, consider using a different strategy.

However, if this is the issue, there is a trick you can use to get a significant performance boost: on the Setup tab, under Options just for this database, disable checks for changes to static data. The source code will still include the static data, but you have turned off the comparison by default, so that static data will stop slowing down your refresh on the commit/get latest tab.

Crucially, however, it will no longer notify you if the data changes. You will need to head back to the Setup tab and flip it back on if/when you want to commit or pull down data updates. Hence, this fix will boost performance, but it means your team will need to communicate any static data updates with each other and manually push them up/pull them down.

Also, this setting is individual to each dev machine. So if you're using the dedicated model, each developer will individually need to flip the check back on, pull down the data, and flip the check back off again to get their performance back.