Comments
4 comments
-
Hi there,
What kind of problems are you encountering with large tables?
Since we support several ways of working, it would be helpful to know the extension you're using to author the changes (SQL Change Automation for Visual Studio, SQL Change Automation for SSMS, or SQL Source Control), as well as the scenario where you're hitting the problem.
Kendra -
I use SQL Change Automation for Visual Studio, and I get an out-of-memory error when handling the automatically generated migration scripts that populate large static tables. I am aware of the workaround of using bulk insert, but this seems to introduce a lot of extra, non-automated work.
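For context, the bulk-insert workaround replaces the generated per-row INSERT migration with a script that loads the data from a file kept alongside the project. A minimal sketch is below; the table name, file path, and SQLCMD variable are invented for illustration:

```sql
-- Sketch of the bulk-insert workaround (hypothetical table and paths).
-- The static data lives in a CSV next to the project instead of inside
-- an auto-generated multi-row INSERT migration script.
BULK INSERT dbo.CountryLookup                    -- hypothetical static table
FROM '$(DeployPath)\Data\CountryLookup.csv'      -- hypothetical SQLCMD variable and path
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2,                         -- skip the CSV header row
    TABLOCK                                      -- allows minimal logging where the recovery model permits
);
```

The extra work is maintaining that CSV and the deployment-time file path yourself, which is what the generated migration scripts would otherwise handle.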
-
What's large? This feature is intended for lookup-type datasets that tend to be hundreds or thousands of rows. Once you're into gigabytes of data, this is not the type of technology to use.
-
Your documentation concerning this known issue with populating large static tables defines large tables as 10,000+ rows (https://documentation.red-gate.com/rr1/key-concepts/data-population/static-data).
Is your recommendation that SQL Change Automation is not the technology to use for these datasets?
-
Do you have a timeframe for improved handling of large tables in SQL Change Automation?