Hi,
Do you have a timeframe for improved handling of large tables in SQL Change Automation?
Harald

4 comments

  • Kendra_Little
    Hi there,

    What kind of problems are you encountering with large tables? 

    Since we support several ways of working, it would be helpful to know the extension you're using to author the changes (SQL Change Automation for Visual Studio, SQL Change Automation for SSMS, or SQL Source Control), as well as the scenario where you're hitting the problem.

    Kendra
  • haur
    I use SQL Change Automation for Visual Studio, and I get an out-of-memory error when handling automatically generated migration scripts that populate large static tables. I am aware of the workaround of using bulk insert, but this seems to introduce a lot of extra, non-automated work.
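
    For anyone hitting the same limit, the bulk-insert workaround mentioned above might look roughly like the sketch below. The table name and file path are placeholders, not from this thread, and the exact options depend on your data file:

    ```sql
    -- Hypothetical example: dbo.MyStaticTable and the CSV path are assumptions.
    -- Guard clause keeps the migration script re-runnable (idempotent).
    IF NOT EXISTS (SELECT 1 FROM dbo.MyStaticTable)
    BEGIN
        BULK INSERT dbo.MyStaticTable
        FROM 'C:\SeedData\MyStaticTable.csv'
        WITH (
            FIELDTERMINATOR = ',',   -- column delimiter in the CSV
            ROWTERMINATOR   = '\n',  -- row delimiter
            FIRSTROW        = 2,     -- skip the header row
            TABLOCK                  -- allows minimal logging where eligible
        );
    END
    ```

    The data file has to be reachable from the SQL Server instance, which is part of the extra manual work being described.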
  • way0utwest
    What's large? This is intended for lookup-type datasets that tend to be hundreds or thousands of rows. Once you get into gigabytes of data, this is not the type of technology to use.
  • haur
    Your documentation concerning this known issue with populating large static tables defines large tables as 10,000+ rows (https://documentation.red-gate.com/rr1/key-concepts/data-population/static-data).

    Is your recommendation that SQL Change Automation is not the technology to use for these datasets?
