A co-worker and I are both running SQL Compare 10.7.0.18.
From what I can tell our settings are the same.
We compare the same two databases, say A => B.
Database A has a new BIT field added to an existing table that already contains data.
When I run the comparison, SQL Compare 10.7.0.18 automatically adds a default value of (0) for that new field in the script it applies to database B. That makes sense: the table already has data and we are saying the field can't be null, so without a default the change would fail.
However, we never explicitly set a default value of (0) on that field in database A.
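For illustration, the script my copy generates looks roughly like this (the table and column names here are made up, and the exact constraint name SQL Compare produces will differ):

    -- Hypothetical names; the point is that SQL Compare adds a default
    -- so the NOT NULL column can be added to a table that already has rows
    ALTER TABLE [dbo].[Orders]
    ADD [IsArchived] [bit] NOT NULL
    CONSTRAINT [DF_Orders_IsArchived] DEFAULT ((0))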
When my co-worker runs the same comparison with SQL Compare 10.7.0.18 on their machine, the generated script does not include the default value of (0). Applying that script produces a warning, and of course it fails.
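On their machine the same change comes out as roughly:

    -- Same hypothetical names; no default this time
    ALTER TABLE [dbo].[Orders]
    ADD [IsArchived] [bit] NOT NULL

Run against a table with existing rows, SQL Server rejects that with error 4901 ("ALTER TABLE only allows columns to be added that can contain nulls, or have a DEFAULT definition specified..."), which matches the failure we see.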
Why would the same comparison generate different scripts on two different machines?