George Potemkin
Guest
Dmitri, "One is browsing it like a DBA would do" - my experience says that DBAs do not check the logs until the end users tell them that they have problems. And even then DBAs often use a primitive tool like "grep error *.lg". For example, there was a case where a runaway process was writing a few messages per second to a db log, and this lasted a few days. The daily db logs were huge, but the DBA did not notice even when he uploaded a log to FTP for me to analyze a completely different issue.

A typical call from a large customer:

Customer: Houston, we've got a problem. What data do you need for analysis?
Me: I need the db logs and recent promon data. (Empties the glass in one gulp.) I have told you that a thousand times before.
Customer: OK. Will be done in a few minutes.

I don't blame them; they follow the boss's instruction: "in case of problems, first contact TS".

Maybe PSC could come up with something that works for both: a human DBA reading the log for database issues, and a script parsing the log for specific message codes. PSC could revive the dead "key event" project - a daemon that parses the db log for any "unsuppressed" messages and copies them to a separate file or sends them to a specified email list. The "suppressed" messages would be the set of messages we don't want copied, like the connect/disconnect messages. Such a feature would be a good "toy" for DBAs.

But as first-line tech support I need as complete a set of data as possible. Customers call me when an issue is happening "right now", a thousand miles away, at a site I have no direct access to. The less data I get, the more "false trails" I have to check, and those checks waste time exactly when time is critically important, because tens of thousands of end users are unable to do their jobs.

Whether or not PSC provides a tool to analyze the db log is not a problem for me. I have my own tools.

Regards,
George
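[Editor's note] The "key event" daemon George describes could be sketched roughly as below. This is only an illustration, not PSC's implementation: the suppression list here assumes (452)/(453) are the login/logout message numbers, and a real deployment would make that set, the poll interval, and the output destination configurable.

```python
import re
import time

# Hypothetical suppression set: message numbers we do NOT want copied.
# (452)/(453) are assumed here to be the login/logout messages; a real
# site would maintain its own list.
SUPPRESSED = {"452", "453"}

MSGNO = re.compile(r"\((\d+)\)")  # OpenEdge .lg lines end in "(nnn)"

def key_events(lines, suppressed=SUPPRESSED):
    """Return only the log lines whose message number is not suppressed."""
    kept = []
    for line in lines:
        m = MSGNO.search(line)
        if m and m.group(1) in suppressed:
            continue  # connect/disconnect noise: drop it
        kept.append(line)
    return kept

def follow(path, suppressed=SUPPRESSED, poll=1.0):
    """Tail the db log forever and yield unsuppressed messages as they
    arrive; a daemon would append these to a side file or mail them."""
    with open(path) as f:
        f.seek(0, 2)  # start at the current end of the log
        while True:
            line = f.readline()
            if not line:
                time.sleep(poll)
                continue
            for event in key_events([line.rstrip("\n")], suppressed):
                yield event
```

A daemon built this way would loop over `follow("mydb.lg")` and write each yielded line to the "key events" file or an email gateway.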