Comments
I spent a lot of time with the SQL Data Catalog team, showed the product to my team at the day job, and provided some feedback based on other products I've seen. I enjoyed working with the team and the product, and I'll probably write a new talk for next year about data cataloging.
In the US, we have GDPR-like regulations emerging on a state-by-state basis. California is first, but 22 other states are in motion now too, which means we could end up with 50+ different regulations to contend with (plus whatever the federal government, DC, Puerto Rico, and Guam do). That's a lot to track manually, but a tool like this will help a lot.

One of the big complications is what counts as personally identifiable information. Under current regulations, your name alone is not PII, and your phone number alone is not PII, but together they are PII. That's hard to do with simple table scanning, but I think I'd add a label of "partial PII" and have a PowerShell script (sketched below) which analyzes the results of a scan for these two elements and then recategorizes them as PII.

In my line of work, we also have use-by-source concerns. An oversimplified example: as a mortgage company, we deal with a lot of real estate agents, and we need to make sure we have both the correct licensing information and the correct contact information. We buy the licensing data from one source, and the only permissible use is license verification. We'd also like to market to real estate agents to remind them that we offer great customer service. For that, we need to buy another list, which doesn't guarantee the license data (or even include it at all). I can categorize data in different tables or databases with a use policy in a way which is documented with the other classifications, in one place.
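A minimal sketch of that kind of recategorization pass, assuming the scan results have been exported to a CSV with hypothetical SchemaName, TableName, ColumnName, and Tag columns and a hypothetical "Partial PII" label; this is not SQL Data Catalog's own API, just the rule (name plus phone number in the same table becomes PII) expressed in PowerShell:

```powershell
# Sketch: promote "Partial PII" columns to PII when a table contains both a
# name-like column and a phone-number-like column.
# Assumes scan results exported to a CSV with hypothetical columns:
# SchemaName, TableName, ColumnName, Tag.

$results = Import-Csv -Path '.\scan-results.csv'

# Look at each table's columns together.
$byTable = $results | Group-Object -Property SchemaName, TableName

foreach ($table in $byTable) {
    $partial = @($table.Group | Where-Object { $_.Tag -eq 'Partial PII' })

    # Hypothetical rule: a name plus a phone number in the same table is PII.
    $hasName  = $partial | Where-Object { $_.ColumnName -match 'name' }
    $hasPhone = $partial | Where-Object { $_.ColumnName -match 'phone' }

    if ($hasName -and $hasPhone) {
        foreach ($column in $partial) {
            $column.Tag = 'PII'   # promote the whole group of partial matches
        }
    }
}

# Write the adjusted classifications back out for review or re-import.
$results | Export-Csv -Path '.\scan-results-recategorized.csv' -NoTypeInformation
```

The column-name regexes are deliberately crude; in practice you would drive the name/phone test off more specific labels from the scan itself, but the shape of the pass is the same.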
Don't forget there's a GitHub repo of snippets people have been contributing to: https://github.com/gvohra/sqlpromptsnippets. Maybe Redgate needs to have an "official" repo?
You ask as if it's not still January...
Agree on including tests but excluding the tSQLt framework. I've also worked with DBEs who want to use a separate test instance of their database because they don't want the framework in the regular pipeline, since it won't be in prod.
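One way to keep that separation honest, sketched under the assumption that the SqlServer PowerShell module is available and with placeholder server and database names (not anyone's real pipeline step): a small gate that fails if tSQLt's schema shows up in an environment that is supposed to stay framework-free.

```powershell
# Sketch: fail a pipeline step if the tSQLt framework has leaked into an
# environment where it shouldn't be (e.g. production).
# Assumes the SqlServer module; the instance and database names are placeholders.

Import-Module SqlServer

$target = @{
    ServerInstance = 'prod-sql01'   # placeholder
    Database       = 'AppDb'        # placeholder
}

# tSQLt installs its objects under a schema named 'tSQLt'.
$query = "SELECT COUNT(*) AS tSQLtSchemas FROM sys.schemas WHERE name = 'tSQLt';"

$result = Invoke-Sqlcmd @target -Query $query

if ($result.tSQLtSchemas -gt 0) {
    throw "tSQLt objects found in $($target.Database) on $($target.ServerInstance); failing the step."
}
```

The same check can run against the separate test instance with the condition inverted, to confirm the framework is actually installed where the tests do run.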
I actually participated in a discussion on the Gartner forums about data masking products, was able to explain that RG Data Masker is not like TDE, and pointed out the Gartner study.
Thank you for the stuff!
Hey David! AWS Aurora may become a tool in our toolbox. We've been pushing our DBEs to automate deployments more and more using SQL Change Automation. It pains me to think we'll have to fall back to SQL Change Manually for systems which choose Aurora. So I was wondering if the automated deployment capabilities might be expanded into the MySQL and/or Aurora worlds. Thanks! Rich
We're finally getting a serious effort going. For background, we have around 1,800 team members just in technology, with somewhere around 50 database engineers (DBEs) shared across the development teams. Various development teams over the last couple of years tried their own initiatives, but the DBEs were reluctant to change. The DBEs saw it as their work to do, and there was no technology-wide initiative for CI/CD, so it just never got done.
We now have a technology-wide mandate for CI/CD.
• What are the key drivers for change – frequency of deployment, reduced overheads, data protection and compliance, reduced application downtime, etc.?
• Is the cost of investing in new technology a blocker, or is there an understanding that the eventual financial benefits will require investment in tech?
• How does the database become part of the conversation? Is it part of initial planning, or is it a hurdle along the way?