Jon, thanks for the response and the link. I am disappointed, though: the article you linked has some interesting concepts, but storing scripts that are better generated automatically is a mistake. Here is the technique I'm working on. Start with a schema build script with comments inline:
/**
* <table name="AppSecurity.IdentityUser">
* This is the master list of users that can possibly access any of the applications
* in this database. The UserPrincipalName is owned by the Identity Server and is assumed
* to be globally unique. We cannot parse the UPN, only use it for indexing and foreign
* keys to uniquely identify a specific user. Note that it is possible for all of the
* other fields (except UPN) to be the same for multiple users.
*
* Currently we are not planning on folding any users between different sources, and we
* are counting on the identity providers (AD, Auth0, Google, etc.) to make sure the
* same user isn't represented twice, or if they are they really should be. Note that
* the UserName/SourceIdentityProvider *should* be unique, but we cannot enforce this
* and since we are never providing login services for these users these fields are only
* stored for logging purposes.
* </table>
*/
CREATE TABLE AppSecurity.IdentityUser
(
UserPrincipalName NVARCHAR(255) PRIMARY KEY CLUSTERED --- The UserPrincipalName that will come from the Identity Server to uniquely identify users who can/have logged in.
,UserName NVARCHAR(255) NOT NULL --- The provided username that the user used in the identity provider to login, stored for documentary purposes only. This value is most likely identical to e-mail but this is not guaranteed or enforced.
,SourceIdentityProvider NVARCHAR(200) NOT NULL --- The provided source that vouches for this user. The quality of sources is variable (how trustworthy they are).
,FullName NVARCHAR(255) NOT NULL --- The provided FullName of the user that logged in; we cannot make any assumptions about the format of this column.
,Email NVARCHAR(100) NOT NULL --- The provided e-mail address of the user that logged in. This value is most likely identical to UserName but this is not guaranteed.
);
It's a little difficult to see in the code above, but all of the inline comments use three dashes ("---") instead of the usual two, which is what distinguishes the documentation comments from normal SQL comments.
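To make the distinction easier to see in isolation (a three-dash comment is still a perfectly legal T-SQL comment, so the build script runs unchanged):
-- An ordinary comment: two dashes, ignored by the documentation generator.
--- A documentation comment: three dashes, harvested into extended properties.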
My goal is that the documentation lives inline with the well-formatted table definitions and is easily parseable. It will be checked into source control and managed from there. Update scripts could update the comments for a single column or constraint. I am writing a utility that will go through the schema build script and generate code similar to that found in the article, though I probably won't assume a helper stored procedure exists. That way the schema can be updated without anyone having to edit or read any SQL relating to the extended properties; since you can generate and run an update script dynamically, there's no point in storing it.
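As a rough illustration of the kind of output I have in mind (not the utility's actual output, and assuming the standard MS_Description property name so that SSMS and SQL Doc can both see it), the generated fragment for a single column might look like this, using only the built-in procedures:
IF NOT EXISTS (SELECT 1 FROM sys.fn_listextendedproperty(
                   N'MS_Description', N'SCHEMA', N'AppSecurity',
                   N'TABLE', N'IdentityUser', N'COLUMN', N'UserPrincipalName'))
    EXEC sys.sp_addextendedproperty
         @name = N'MS_Description'
        ,@value = N'The UserPrincipalName that will come from the Identity Server to uniquely identify users who can/have logged in.'
        ,@level0type = N'SCHEMA', @level0name = N'AppSecurity'
        ,@level1type = N'TABLE',  @level1name = N'IdentityUser'
        ,@level2type = N'COLUMN', @level2name = N'UserPrincipalName';
ELSE
    EXEC sys.sp_updateextendedproperty
         @name = N'MS_Description'
        ,@value = N'The UserPrincipalName that will come from the Identity Server to uniquely identify users who can/have logged in.'
        ,@level0type = N'SCHEMA', @level0name = N'AppSecurity'
        ,@level1type = N'TABLE',  @level1name = N'IdentityUser'
        ,@level2type = N'COLUMN', @level2name = N'UserPrincipalName';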
My working plan is that the comments may contain HTML, which will be dutifully copied into the final documentation for the schema. The other option is to store Markdown and convert it on the fly.
It would then be possible to create a schema exporter that reads the extended properties back out and produces a suitably formatted script for automated updates. I don't know yet how SQL Doc will fit into this picture.
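For the exporter side, the round-trip read is just a catalog query (again only a sketch, and again assuming MS_Description):
SELECT  s.name  AS SchemaName
       ,t.name  AS TableName
       ,c.name  AS ColumnName           -- NULL for the table-level description
       ,CAST(ep.value AS NVARCHAR(4000)) AS Description
FROM    sys.extended_properties AS ep
JOIN    sys.tables   AS t ON t.object_id = ep.major_id
JOIN    sys.schemas  AS s ON s.schema_id = t.schema_id
LEFT JOIN sys.columns AS c ON c.object_id = ep.major_id
                          AND c.column_id = ep.minor_id
WHERE   ep.class = 1                    -- object- or column-level properties
  AND   ep.name  = N'MS_Description';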