Comments
If you need different static data in dev and prod you could try something like this: https://www.red-gate.com/blog/database-development/customer-specific-database-deployments

Another option: use a post-deploy script that calls a sproc. The sproc should be environment aware and do something like:

IF (DEV) {Make static data look like this}
IF (PROD) {Make static data look like that}

This way it's all in source control - but if your static data tables are big, that sproc will be massive.
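A very rough sketch of that kind of environment-aware sproc, assuming a hypothetical dbo.DeploymentEnvironment table that records which environment the database belongs to (the procedure, table and column names here are invented for illustration, not from the blog post):

-- Hypothetical environment-aware static data sproc
CREATE OR ALTER PROCEDURE dbo.LoadStaticData
AS
BEGIN
    -- Hypothetical single-row table naming the current environment
    DECLARE @environment NVARCHAR(10) = (SELECT TOP 1 EnvironmentName FROM dbo.DeploymentEnvironment);

    IF @environment = 'DEV'
    BEGIN
        TRUNCATE TABLE dbo.Colour;
        INSERT INTO dbo.Colour (ColourId, ColourName)
        VALUES (1, 'Red'), (2, 'Green');
    END;

    IF @environment = 'PROD'
    BEGIN
        TRUNCATE TABLE dbo.Colour;
        INSERT INTO dbo.Colour (ColourId, ColourName)
        VALUES (1, 'Red'), (2, 'Green'), (3, 'Blue');
    END;
END;

The post-deploy script then only needs to run EXEC dbo.LoadStaticData; after each deployment.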
Or, if the static data is massive, don't use source control to solve the problem. Do it some other way. Source control doesn't like massive files. (Although, if possible, source control is best.)
Do you mean a table in the SQL Server master database? If so, I don't have a good suggestion for you within Redgate SQL Source Control. You will need to manage this some other way.
Do you mean you want to insert data into a reference table or lookup table in your source controlled database? If so, assuming the table has fewer than about 1,000 rows, use the 'static data' feature: https://documentation.red-gate.com/soc6/common-tasks/link-static-data
If it is a bigger table, use a 'migration script' (there's a rough sketch of what one might contain below): https://documentation.red-gate.com/soc6/common-tasks/working-with-migration-scripts
Or, if this is a regular task, consider using ReadyRoll or some other process instead. SQL Source Control 'migration scripts' work well for very occasional changes on small-ish databases but they don't scale very well to bigger databases and frequent use.
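In case it helps, a migration script for reference data is just a re-runnable T-SQL script. A minimal sketch for a small lookup table (the table and column names are invented for illustration) might use MERGE so it can be run repeatedly:

-- Hypothetical lookup table dbo.OrderStatus; MERGE keeps the script re-runnable
MERGE dbo.OrderStatus AS target
USING (VALUES
    (1, 'Pending'),
    (2, 'Shipped'),
    (3, 'Cancelled')
) AS source (StatusId, StatusName)
ON target.StatusId = source.StatusId
WHEN MATCHED THEN
    UPDATE SET StatusName = source.StatusName
WHEN NOT MATCHED BY TARGET THEN
    INSERT (StatusId, StatusName) VALUES (source.StatusId, source.StatusName)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;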
Given that you are using the shared model, that's probably the best you can do.
However, I would highly recommend that you ditch the shared model and move to the dedicated model. Trying to use a shared development database with a distributed source control system is simply a contradiction. You will spend more time fighting the tools than benefiting from them.
I appreciate that decision might be out of your control, but perhaps sharing this blog post with whoever has influence over the decision might be a good starting point: http://workingwithdevs.com/shared-vs-dedicated/
One thing you could do is select the dedicated option even though you are using shared. For various complicated technical reasons it will probably work fine. In order to explain that properly, it might be worth scheduling a Skype call to talk it through.
However, it's ugly and smelly and you should just move to the dedicated model.
2 options:
OPTION 1:
Write your own PowerShell rather than using the default step template.
For example:
$scriptsFolder = "C:\Work\scripts" #replace with the location of your nuget package, probably using one of the #[Octopus.action[stepname]... system variables
$test = New-DlmDatabaseConnection -ServerInstance "test01\sql2014" -Database "Test" -Username "sa" -Password "P@ssw0rd" # Username and password only necessary if using SQL Auth. ServerInstance and Database should probably be parameterised
$syncResult = Sync-DlmDatabaseSchema -Source $scriptsFolder -Target $test
$syncResult.UpdateSql | Out-File -FilePath "C:\Work\update.sql" # Write the generated update script to a file
(See example 4 in docs for details here: https://documentation.red-gate.com/dlma2/cmdlet-reference/sync-dlmdatabaseschema)
OPTION 2:
Use 2 steps:
1. Redgate - Create Database Release - https://library.octopusdeploy.com/step-templates/c20b70dc-69aa-42a1-85db-6d37341b63e3/actiontemplate-redgate-create-database-release
2. Redgate - Deploy from Database Release - https://library.octopusdeploy.com/step-templates/7d18aeb8-5e69-4c91-aca4-0d71022944e8/actiontemplate-redgate-deploy-from-database-release
The first step will create various resources, including a plain SQL script and a nice HTML diff report, in advance of deployment. Both will be uploaded as artifacts. You could optionally add a "Manual intervention" step in between the create and deploy steps to enable a review. Perhaps just scope that to the production/pre-production environment?
WHICH TO CHOOSE?
Option 1 is quicker/simpler but requires writing your own PS
Option 2 does not require custom scripting and enables review, but the script is created in advance and it's a slightly more complex process. Also, your deployments will take longer.
If you don't include the temporary database switch it uses LocalDB on the agent as the default! ;-P
I guess the question is whether your references will resolve properly from LocalDB to the dependent databases. If you used three-part names (e.g. db.schema.table) then those databases will also need to exist on LocalDB.
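For example, something like this (the database and object names are invented for illustration) will only build and validate against LocalDB if SomeOtherDb also exists there:

-- Three-part name referencing another database; this reference can only
-- resolve on LocalDB if SomeOtherDb has also been created there.
CREATE VIEW dbo.CustomerOrders
AS
SELECT c.CustomerId, c.CustomerName, o.OrderId
FROM dbo.Customer AS c
JOIN SomeOtherDb.dbo.OrderHeader AS o
    ON o.CustomerId = c.CustomerId;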
Point DLMA at a dev or test server that has all the dependent objects. It will build the database on that server, where all the dependencies exist, and it will delete the database when done.
You need to handle that separately.