Kendra_Little
Hello,

Apologies for the delay, just catching up on this thread this morning. I believe we have shifted the discussion of question #4 to this thread now.

Cheers,
Kendra
Hello,

1. SQL Server doesn't provide a simple way to do this type of refactoring. What you are doing is changing the structure of the data stored in the business key column of a table, which is an essential part of the table structure. You will need to remove foreign key relationships and other constraints, remove the primary key, make the column change, and then recreate the PK, indexes, and constraints. There are scripts around the web, on sites like StackOverflow, which may help you loop through tables and save some time. Needs tend to vary depending on the database schema you're changing, so it's likely you will need to customize the scripts to meet your situation. You don't have to worry about the individual scripts being perfect, though -- you just need to get your development database into the state you'd like it to reach in the end. So if you make a mistake, you can correct it. (A rough sketch of this pattern follows at the end of this answer.)

2. This is going to vary based on the size of your tables, the nature and quality of the data you have in the columns, and the downtime you have available for deployment. There's no single answer to this. For very large tables, some might even choose to create a new table, insert the data into it, and do a switcheroo for performance reasons. (Here's an example of a very fancy approach that some use if they can't have downtime; see also the second sketch below.)

3. Yes. When you have a set of changes made in a database that you would like to commit to version control, at that point you can generate migration scripts, import the changes, review them, and commit them. The SQL Compare engine will analyze the changed objects you've selected to import and compare them with the prior schema you captured in version control. It will script out the most efficient way to make the version in source control reach the desired state of your dev database. You can review that code and tweak it as required.

4. Customers typically use some combination of orchestration systems such as Jenkins, Azure DevOps, or Octopus Deploy to automate builds, tests, and deployments of database code for SQL Change Automation. The functionality is also available in PowerShell scripts if you wish to execute the PowerShell directly.

Hope this helps,
Kendra
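To make point 1 concrete, here is a minimal sketch of the drop-and-recreate pattern, assuming a hypothetical parent/child table pair where the business key OrderId is being widened from INT to BIGINT. All table, column, and constraint names here are invented for illustration; yours will differ, and a real script would also handle any other indexes, defaults, or computed columns touching the key.

    BEGIN TRAN;

    -- 1. Drop foreign keys that reference the business key
    ALTER TABLE dbo.OrderLine DROP CONSTRAINT FK_OrderLine_Orders;

    -- 2. Drop the primary key (and any other indexes on the column)
    ALTER TABLE dbo.Orders DROP CONSTRAINT PK_Orders;

    -- 3. Change the column's data type on the parent and the child
    ALTER TABLE dbo.Orders    ALTER COLUMN OrderId BIGINT NOT NULL;
    ALTER TABLE dbo.OrderLine ALTER COLUMN OrderId BIGINT NOT NULL;

    -- 4. Recreate the primary key, indexes, and foreign keys
    ALTER TABLE dbo.Orders
        ADD CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderId);
    ALTER TABLE dbo.OrderLine
        ADD CONSTRAINT FK_OrderLine_Orders
        FOREIGN KEY (OrderId) REFERENCES dbo.Orders (OrderId);

    COMMIT;

And a rough outline of the "new table plus switcheroo" idea from point 2, again with hypothetical names. In real life you would also recreate indexes and constraints, re-grant permissions, and account for writes happening during the copy (often done in batches for huge tables):

    BEGIN TRAN;

    -- Build a new table with the desired structure
    CREATE TABLE dbo.Orders_New (
        OrderId BIGINT NOT NULL
            CONSTRAINT PK_Orders_New PRIMARY KEY CLUSTERED
        -- ... remaining columns ...
    );

    -- Copy the data across
    INSERT INTO dbo.Orders_New (OrderId /* , ... */)
    SELECT OrderId /* , ... */ FROM dbo.Orders;

    -- Swap the tables by renaming
    EXEC sp_rename 'dbo.Orders', 'Orders_Old';
    EXEC sp_rename 'dbo.Orders_New', 'Orders';

    COMMIT;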
Hi Peter,

This is a really great question, and I think we can develop some good content around this concept, and maybe even go a little farther. There are a couple of clarifying questions that might help me get my head around the best way to think through this.

1) How many total environments are in the mix outside of production? In other words, after the integration environment, is there a QA or Staging environment that things get deployed to before production? The reason I ask is that if we don't have another environment in the pipeline before production, then it may become more important to "reset" the integration environment and then only deploy feature 2 to it. (Even if the two features are isolated, the application might have some sort of dependency we wouldn't find without doing this.)

2) My second question is whether we have the option to use something like Clone or a snapshotting tool to reset the environments in this scenario. It sounds like we do in this case, so I will pull that into consideration.

3) Are you working primarily with SCA in Visual Studio or SSMS? If it's SSMS, I'm curious whether you've updated the plugin since Sept 23rd -- there was an update in that release which helps with removing migration scripts in terms of the schema model. (Visual Studio already had the ability to handle deletions of migration scripts.)

Thanks very much for the thoughtful question, and I'm looking forward to working through this scenario more on Monday. Have a great weekend.

Kendra
Hello,

SQL Change Automation uses a local database to validate that all your migration scripts can run successfully against a fresh environment -- this is called the shadow database. What's happening here is that when the commands to create a memory-optimized table run against the shadow, SQL Server raises an error because that shadow database hasn't been configured for memory-optimized tables.

There are two different ways you can handle this:

Approach 1 - Clone as Baseline

If you are also doing a trial of SQL Clone and/or already have Redgate's SQL Clone, you can use the "Clone as Baseline" feature. With this approach, an image of the production database (which you may mask if you're using Data Masker) is used as the "starting point" for the shadow. All of the properties of the production database, like the configuration you have for memory-optimized tables, are already present in that image/clone, so they carry through and this automatically works. (Note: with this approach you only need to create migration scripts for changes you want to make; you don't need to create migration scripts for existing objects at all.)

Approach 2 - Pre-Deployment Scripts

If Clone as Baseline isn't right for you, then you can use a pre-deployment script to configure database settings (like enabling memory-optimized tables) for the verification and build processes, which run against empty databases. With this approach, variables will be very useful. You will likely want to:

- Have "USE $(DatabaseName)" at the top of the pre-deployment script to set the context to the shadow.
- Use the $(IsShadowDeployment) variable (or some other method of your choice) to ensure that the pre-deployment script only runs against the correct environments. Usually folks only want the database configuration scripts to run against verify and build environments.

A sketch of such a script follows below.

Note: if you are working with a Pre-Sales engineer as part of this process, they are very skilled at helping folks with this as well.

Hope this helps,
Kendra
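To illustrate Approach 2, here is a minimal sketch of what such a pre-deployment script might look like. The filegroup name, file name, and file path are hypothetical, and the exact value that $(IsShadowDeployment) takes is an assumption here -- please confirm it against the documentation for your version:

    -- Set the context to whichever database is being deployed
    USE [$(DatabaseName)];
    GO

    -- Only configure shadow/verify/build environments.
    -- Assumption: the variable evaluates to '1' or 'True' for shadow deployments.
    IF '$(IsShadowDeployment)' IN ('1', 'True')
    BEGIN
        -- Add a memory-optimized filegroup if the database doesn't have one yet
        IF NOT EXISTS (SELECT 1 FROM sys.filegroups WHERE type = 'FX')
        BEGIN
            ALTER DATABASE [$(DatabaseName)]
                ADD FILEGROUP MemOptData CONTAINS MEMORY_OPTIMIZED_DATA;

            ALTER DATABASE [$(DatabaseName)]
                ADD FILE (
                    NAME = 'MemOptData_File',
                    FILENAME = 'C:\SQLData\$(DatabaseName)_MemOpt'  -- hypothetical path
                )
                TO FILEGROUP MemOptData;
        END
    END
    GO

Because the configuration is wrapped in the $(IsShadowDeployment) check, the same script can ship with the project without altering real target databases during deployments.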