Rob Fitzpatrick
Guest
This will be a bit vague... It is fairly well known that the SQL query engine uses a cost-based optimizer, and the standard first reply to anyone complaining of SQL query performance issues is "Have you updated statistics?", followed by the equally standard responses "How do I do that?" and maybe "After I do it once, when should I redo it?". This is a bit of a murky area.

There is a way documented in the KB to generate a SQL script that updates the statistics (see the first sketch at the end of this post), but it is not exactly an elegant solution and there is no tooling support. At the very least, a "Generate UPDATE STATISTICS script" utility in the data dictionary would be nice.

I don't know of any easy way to determine the state of the SQL statistics. There are no canned DD reports on this. I can query the _Sys*stat tables (see the second sketch at the end) or look at their info in a dbanalys report, but that doesn't tell me how good or bad the data is in terms of optimizing queries.

I've heard various advice: run the update monthly, or run it when x% of your data has changed (how do you measure *that*?), etc. But I have no empirical way to know whether that advice is valid.

So my "idea", such as it is, is that this is an area of the product that could do with some attention. Maybe you could automate updating the statistics, or provide a more streamlined manual process. Maybe you could provide guidance on when to update, or some insight into the quality of the metadata. Please give it some thought; it's a problem that affects many customers and partners that I talk to.
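To make the first point concrete, the manual process looks roughly like this. This is a sketch assuming OpenEdge SQL-92 syntax run from a SQL client such as sqlexp; the table name is hypothetical, and the generator query is the KB-style approach of building the statements from the PUB."_file" metaschema table, so verify it against the actual KB article for your release:

    -- Refresh table, index, and column statistics for a single table
    UPDATE TABLE STATISTICS AND INDEX STATISTICS AND ALL COLUMN STATISTICS
      FOR PUB."customer";

    -- KB-style script generation: emit one such statement per user table,
    -- then spool the output to a file and run that file as a script
    SELECT 'UPDATE TABLE STATISTICS AND INDEX STATISTICS AND ALL COLUMN '
        || 'STATISTICS FOR PUB."' || "_file-name" || '";'
      FROM PUB."_file"
     WHERE "_tbl-type" = 'T';

Note the two-step dance of selecting statement text, capturing it, and re-running it. That is exactly the inelegance I mean, and what a data dictionary utility could hide.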
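As for inspecting the state of things, about the best I know of is poking at the statistics tables directly. A minimal sketch, assuming the _Sys*stat tables are exposed through the PUB schema like the other metaschema tables (an assumption; check the exact names and columns for your release):

    -- Zero rows here means statistics have never been gathered at all
    SELECT COUNT(*) FROM PUB."_systblstat";
    SELECT COUNT(*) FROM PUB."_syscolstat";
    SELECT COUNT(*) FROM PUB."_sysidxstat";

Even when rows exist, this only tells you that statistics were gathered at some point, not whether they still reflect the data, which is the real gap.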
Continue reading...