EdCarden
fionag, I do still have an active support plan for SQL Monitor, so the upgrade to version 3 is not an additional cost in terms of dollars, just in terms of time and effort, which is a resource I am short on currently. That's why I would prefer a solution that sticks with not upgrading, at least for the short term. Then again, a fix for version 2.3 that is lengthy and involved would be equally undesirable, so in the end I may not have a choice but to upgrade.

That said, because you are not able to reproduce the problem in 2.3, that leads me to believe that my problem with SQL Monitor getting/storing incorrect log size values will not go away after an upgrade, making the upgrade effort of no value as far as this issue goes.

I have given you the T-SQL I use to get the log file metrics, so the only question left to answer is which value(s) I am storing, and the answer is: all of them. I store every value returned by the code. The size I quote is from the 'Log Size (MB)' column in the results the DBCC call returns. I capture this info every 15 minutes, which, based on a prior posting by you, is less often than SQL Monitor captures the log metrics, so if anything my own stored results should be less accurate than those of SQL Monitor.

I will wait to hear back from you on what you find once you've completed the rest of the steps you outlined in your last post.

Thanks for your help,
Ed
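Ed's exact collection script isn't shown in the thread, but based on his description (the 'Log Size (MB)' column from DBCC SQLPERF, captured every 15 minutes with all values stored), a minimal sketch of that pattern might look like the following. The table and column names here are hypothetical, not his:

```sql
-- Hypothetical history table; columns mirror the DBCC SQLPERF(LOGSPACE) result set.
CREATE TABLE dbo.LogSpaceHistory (
    CaptureTime     datetime2 NOT NULL DEFAULT SYSDATETIME(),
    DatabaseName    sysname   NOT NULL,
    LogSizeMB       float     NOT NULL,  -- the 'Log Size (MB)' column Ed quotes
    LogSpaceUsedPct float     NOT NULL,
    [Status]        int       NOT NULL
);

-- Run every 15 minutes (e.g. from a SQL Agent job): capture the DBCC output
-- into a temp table, then append it to the history table.
CREATE TABLE #LogSpace (
    DatabaseName    sysname,
    LogSizeMB       float,
    LogSpaceUsedPct float,
    [Status]        int
);

INSERT INTO #LogSpace
EXEC ('DBCC SQLPERF(LOGSPACE)');

INSERT INTO dbo.LogSpaceHistory (DatabaseName, LogSizeMB, LogSpaceUsedPct, [Status])
SELECT DatabaseName, LogSizeMB, LogSpaceUsedPct, [Status]
FROM #LogSpace;

DROP TABLE #LogSpace;
```

Note that DBCC SQLPERF(LOGSPACE) reports log size in MB, while the performance counter SQL Monitor reads reports KB, so comparing the two sources requires dividing the counter's `cntr_value` by 1024.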
fionag wrote: Ed, We get the values from a Windows performance counter. You can see these by running Performance Monitor and selecting the counter "Log File(s) Size (KB)" from the "SQLServer:Databases" object. You can also see this value by running the query:

SELECT * FROM sys.dm_os_performance_counters
WHERE object_name = 'SQLServer:databases'
AND counter_name = 'Log File(s) Size (KB)'

Out of interest, what query are you using to get the data? We collect quite frequently (minutely), but because it's considered a "stable sample" we only store the value if it is different to the previous one. Does this shed any light on the matter? Thanks, Fiona

Since there has been no further follow-up on my replies to the above questions, does that mean that the Redgate answer to this is "upgrade to SQL Monitor 3"? If I do upgrade to SQL Monitor 3, does that mean I will then see more accurate log file metrics, or does it mean that future captures of log file metrics will be accurate but past days (what has already been captured) will remain inaccurate?

I believe the info being captured by SQL Monitor 2.x is wrong because it was not actually getting that data as frequently as it was supposed to. That's because it seems to get the log file size right at certain points in time, but is off, way off in some cases, for the in-between times. Comments?
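Fiona's "stable sample" rule (collect every minute, but only store a reading when it differs from the last stored one) can be sketched as a store-on-change insert. The `dbo.LogSizeSample` table here is purely illustrative; this is not SQL Monitor's actual implementation:

```sql
-- Hypothetical table illustrating the "stable sample" scheme: a row is only
-- written when the counter value differs from the most recently stored value.
INSERT INTO dbo.LogSizeSample (SampleTime, DatabaseName, LogSizeKB)
SELECT SYSDATETIME(), pc.instance_name, pc.cntr_value
FROM sys.dm_os_performance_counters AS pc
WHERE pc.object_name = 'SQLServer:Databases'
  AND pc.counter_name = 'Log File(s) Size (KB)'
  AND pc.cntr_value <> ISNULL(
        (SELECT TOP (1) s.LogSizeKB
         FROM dbo.LogSizeSample AS s
         WHERE s.DatabaseName = pc.instance_name
         ORDER BY s.SampleTime DESC), -1);
```

One consequence of storing only changes is that reconstructing the size at an arbitrary point in time means carrying the last stored sample forward; a gap between stored rows indicates an unchanged value, not missing data.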
Chris Auckland wrote: Hi Ed, We have a new build available, but I don't think it's going to help you. What they've added is a way to limit the number of objects that SQL Prompt caches in order to prevent SSMS crashing if SQL Prompt runs out of memory. I don't think this will help with performance. What would be really useful is if you could send me a blank copy of your database schema and I can try and reproduce the performance issue here. Is there any chance you could send it over? Chris

Apologies for not getting back to you sooner, but I haven't had the time to deal with this again until now. I really wish I could send you the schema, but that's the one thing I can't do because of an NDA (non-disclosure agreement) we have with the vendor whose accounting software uses this database. The DB is not of our own creation but is part of the accounting software package we use. We have the ability to tweak certain things, such as adding custom tables to allow for custom reporting, but sharing the core DB schema is one of those big no-nos. I can tell you how many of each kind of object we have, but I can't provide the actual schema. Sorry.

Does this mean that this problem is going to fall by the wayside and go unaddressed? I'm at the point where I'm having to disable SQL Prompt on a regular basis because of the lockups and slowdowns it causes. It's not just when a table is added to the query, nor when a query is first loaded. It's as if SQL Prompt tries to refresh or re-query the schema with every single keystroke. I'm sure I am exaggerating and that it's not really doing this, but it is most certainly locking up all of SQL Server Management Studio when it does it. For example, I can type in a table join and be done, but still have to wait anywhere from 10 seconds (on the low side) to 30 seconds for the actual text I typed to appear, and it's because of SQL Prompt. How do I know it's SQL Prompt? All the slowdowns and lockups stop when I disable Code Suggestions in SQL Prompt.

While I realize that our DB schema is large, it's not that much larger now than it was two years ago, and yet previous versions of SQL Prompt were nowhere near this slow. I would have the occasional slowdown or lockup, but nothing like what I am seeing now. It's very, very frustrating, especially since SQL Dependency Tracker 2, the other Redgate product I use with SQL Server, is also dog slow. I also have SQL Monitor, and while it's not slow, it has its own drawbacks, like no way to report on the data it collects; very frustrating.