I am doing a comparison of tools/methods for moving 10 million rows of data from a table in one Azure SQL Database to a table in another Azure SQL Database.

SQL Data Compare crashes with memory errors after about 1.8 million rows, so I have changed the transaction settings to commit in 500 MB chunks at the serializable isolation level.
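For reference, the chunked-commit idea I'm describing can be sketched in plain Python, with a list standing in for the source rows (the comment about per-batch commits is just illustrating the concept; nothing here talks to a real database):

```python
def chunked(rows, batch_size):
    """Yield successive batches of at most batch_size rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # final partial batch
        yield batch

# Each batch would be copied and committed in its own transaction,
# so a failure only rolls back the current chunk, not all 10M rows.
batches = list(chunked(range(10), batch_size=3))
print([len(b) for b in batches])  # -> [3, 3, 3, 1]
```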

Are there any other settings I should modify to get better results?

Should I change the settings to no transaction or a different isolation level?

I just want to cover all the bases before I rule out SQL Data Compare.
gbargsley

Comments


  • robrich
This sounds really interesting.  What other tools are you trying?  I've used SQL Data Compare for data synchronization in the past, but as you note, it is indeed fragile and time-consuming.
  • gbargsley
I used the SSMS Import/Export Wizard, Azure Data Factory, and the PowerShell module dbatools.
  • robrich
    I now tend to use PySpark in Jupyter notebooks for this as it's really easy to connect to many data sources and pull / push data between them.  The setup for Spark is a bit involved though.
  • Roseanna
    Hi @gbargsley, 

    I've raised this with our support team, who will be in touch shortly :) 

    Roseanna
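The PySpark route robrich mentions above could look roughly like this. This is only a sketch, not a tested implementation: it assumes a `SparkSession` with the SQL Server JDBC driver on its classpath, and the option values shown in the docstring (server, database, credentials) are placeholders.

```python
def copy_table(spark, source_opts, target_opts):
    """Read a table over JDBC and append its rows to another table.

    source_opts / target_opts are plain dicts of Spark JDBC options, e.g.
    {"url": "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>",
     "dbtable": "dbo.MyTable", "user": "...", "password": "..."}.
    """
    df = spark.read.format("jdbc").options(**source_opts).load()
    (df.write
       .format("jdbc")
       .options(**target_opts)
       .mode("append")   # append to, rather than overwrite, the target table
       .save())
```

Spark partitions the read (via options like `partitionColumn`/`numPartitions`) and batches the writes, which is what makes it comfortable at the 10-million-row scale, though as noted the Spark setup itself takes some effort.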
