Question: How much free information is too much?

TheMadDBA

Active Member
TL;DR version: Should we hold back intellectual property to avoid cutting ourselves out of work?

Just looking for some input from the community here... especially from those who provide free tools/guides to the general public (mostly Progress-related) and also do consulting in the same space as those tools.

When deciding to put information out there do you purposely limit what is freely available? I am considering publishing some of my tools/tricks for application performance tuning but I am not sure how much of that I want to disclose.

For example, I have a tool that uses the VSTs for table/index activity, combined with dbanalys output, and guesses which tables/indexes should be in which buffer pool, whether they need to be moved to a new area, which tables are likely to have indexing issues based on the number of reads compared to table size, etc. Of course it isn't perfect, but it usually comes up with pretty good guesses, assuming the sample size is large enough.
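As a rough illustration of the "reads compared to table size" heuristic described above: the actual tool is written in ABL against the VSTs, but the idea can be sketched in Python. The field names and the threshold here are invented for illustration, not taken from advisor.p.

```python
# Hypothetical sketch of the "reads vs. table size" index heuristic.
# 'records' would come from dbanalys output, 'reads' from the _TableStat
# VST sampled over a long database uptime; the threshold is invented.

def flag_index_suspects(tables, read_ratio_threshold=100):
    """Flag tables whose logical reads dwarf their record count.

    A table read far more times than it has records often indicates a
    missing or poorly chosen index forcing repeated large scans.
    Returns (table name, reads-per-record ratio) pairs.
    """
    suspects = []
    for t in tables:
        if t["records"] == 0:
            continue  # empty tables tell us nothing
        ratio = t["reads"] / t["records"]
        if ratio >= read_ratio_threshold:
            suspects.append((t["name"], round(ratio, 1)))
    return suspects

sample = [
    {"name": "customer", "records": 10_000,  "reads": 2_500_000},
    {"name": "order",    "records": 500_000, "reads": 600_000},
]
print(flag_index_suspects(sample))  # → [('customer', 250.0)]
```

The point of the large-sample caveat above is visible here: shortly after startup the VST counters are too small for the ratio to mean anything.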

I also have similar tools for other common application and DB performance issues. I personally use them to quickly identify issues when I inherit a new application, or as a framework to implement application caching and performance logging.

Of course none of these are going to replace a qualified person but at what point do you hold back on publishing tools that guide end users to fix common issues without having to call you in for help?
 

Rob Fitzpatrick

ProgressTalk.com Sponsor
I'm not a consultant, and by the sounds of it my tools are nowhere near as sophisticated as yours, but I'll chip in my 2 cents.

Tools can be great, both to make knowledgeable users more efficient and also as a learning aid for those who lack knowledge. But if you sit a novice in front of promon or ProTop or OEM and say "slow application; fix!", they're still lost. It's one thing to know the value of buffers flushed per second (or any other metric). It's quite another to be able to take in a raft of metrics, sift the signal from the noise, and then apply reason, intuition, and experience to devise a plan to isolate the problem, find the root cause, and apply the appropriate fix.

I learned years ago in operations that there's a downside to automation. Sure, it eliminated drudge work from my staff's lives and it reduced careless errors, but in some cases it meant they went from knowing how to execute processes to knowing how to run a succession of scripts. And if the script broke somehow they were screwed (and I got the 3 AM call).

The point is that most people aren't going to tear apart the tool and try to understand how it works and why it was written the way it was; it's a black box. So they are unlikely, simply by virtue of having the tool, to ascend to the level of the toolmaker. Using ProTop (or your tools) doesn't turn the user into someone who can competently code against the VSTs and meta-schema or understand DB internals.

In short, people with expertise still matter. They won't be obsoleted by their creations.
 

TheMadDBA

Active Member
Thanks for the input guys.

I am mostly concerned about losing business because of tools that give more guidance to the end user, as opposed to just reporting the facts. Not that they would ever tell them enough to fully tune an application, or even fix every application issue, but they are pretty good at finding the low-hanging fruit.

I guess I will pretty them up and put them out there and see what happens :)
 

tamhas

ProgressTalk.com Sponsor
Professional Services organizations typically treat their tools as intellectual property and it is necessary to hire their services to gain access to the tools. I have seen no evidence that this has resulted in any more business and, if it has, I'll bet it had to do with a pretty spectacular set of tools.

For anything less spectacular, my bet is that keeping the tools proprietary will very, very rarely result in someone hiring the services of the toolmaker, whereas some of the time, making the tools freely available will cause someone to hire the toolmaker for enhancement, application, interpretation or whatever. While it is true that people may use the tools and not compensate the toolmaker in any way, those are most often companies that wouldn't hire anyone anyway. And, if they use and benefit from the tools, they may spread the word.
 

Cecil

19+ years progress programming and still learning.
I generally like to share code that will help other developers where the ABL language is lacking in functionality. If I write a cool little function in pure ABL, and that functionality is freely available in other languages like PHP, C, Java, Python, etc., then I think it is okay to share.
 

TheMadDBA

Active Member
So I am getting close to what I would consider an alpha/beta release... anybody out there willing to give it a once over and give me some feedback?
 

TheMadDBA

Active Member
Ok... be gentle but still let me know where you find issues or have questions :)

I still have a few features to incorporate, and I need to install 11.5 for some of the VST changes, but it should work on 10.2B through 11.4. Possibly even earlier versions of 10, but that is still to be tested.

It works best when you have a dbanalys output file and are connected to a database that has been up and running for a while. You can still get some information out of a dev database, or without the dbanalys file, but that does cut down on some of the suggestions.

Input Parameters:
1) Path to a file that contains the output from a fairly recent proutil -C dbanalys
2) Path to the HTML file the code will create
3) Comma-delimited list of tables to scan, to help determine the benefits of moving to another RPB (or from Type 1 to Type 2 areas). "*" for all or "" for none.
4) Only scan tables where the record count is less than or equal to this number.

Note that even if you pass in "*" for the list of tables, it will still only read the records if it has determined it makes sense to move the table (all Type 1 tables, or Type 2 tables with bad RPB settings).

I would run it first like this and then consider turning on the last two parameters depending on how many tables it decides you should move :)

Code:
RUN advisor.p
(INPUT "path to proutil -C dbanalys output",
 INPUT "html output file name",
 INPUT "",
 INPUT 0).
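The scan gate described in the note above can be sketched as follows. This is assumed logic reconstructed from the parameter descriptions, not the actual advisor.p source (which is ABL); the function and parameter names are invented.

```python
# Sketch of the table-scan gate described above, reconstructed from the
# parameter descriptions; this is NOT the actual advisor.p logic.

def should_scan(area_type, rpb_ok, record_count, in_scan_list, max_records):
    """Decide whether to physically scan a table.

    Even with "*" passed for parameter 3, a table is only read when the
    advisor already thinks a move makes sense: any table in a Type 1 area,
    or a Type 2 table with a bad records-per-block (RPB) setting.
    Parameter 4 then caps the scan by record count.
    """
    if not in_scan_list:
        return False
    move_candidate = (area_type == 1) or (area_type == 2 and not rpb_ok)
    return move_candidate and record_count <= max_records

# A Type 2 table with a sane RPB is skipped even when listed:
print(should_scan(2, True, 50_000, True, 1_000_000))    # → False
# A Type 1 table under the record limit is scanned:
print(should_scan(1, True, 50_000, True, 1_000_000))    # → True
```

Note how leaving the record limit at 0 (as in the RUN example above) effectively disables scanning for any non-empty table.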
 

Attachments

  • advisor.p
    107.3 KB · Views: 10

Cringer

ProgressTalk.com Moderator
Staff member
I'm just generating a new dbanalys file and then I'll give it a test (11.2.1, but 11.5 dbanalys output - should be a good test!). Plus I know I have a shed load of structure changes so it will be nice to see what it thinks!
 

Cringer

ProgressTalk.com Moderator
Staff member
Here is my output (saved as .txt so I can upload it here). I'm interested that it's not suggesting any tables for the secondary buffer pool. Also, the output doesn't seem to differ if I set the third parameter to "*".
 

Attachments

  • analyse.txt
    508.1 KB · Views: 16

TheMadDBA

Active Member
Thanks Cringer.

I guess I should add a warning, or maybe change the behavior, but you also have to change the 4th parameter to a non-zero limit; with it left at 0 no tables get scanned. I just didn't want it to start scanning huge tables by default.

Also, it looks like it had an issue parsing your dbanalys file, since none of the record counts are populated. If you could send me the dbanalys file I would appreciate it.

And I suppose I need to refine my B2 logic a little bit. Right now the estimates are fairly conservative (a table only qualifies if roughly 95% of its activity is reads and it accounts for at least 2% of total DB activity). I will say you have some of the most balanced table IO that I have seen in a while. Looks like you have already done quite a bit of tuning :)
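The conservative B2 rule mentioned above (95% reads, 2% of DB activity) could be sketched like this. The real tool computes these figures from the _TableStat VST in ABL; the function and parameter names here are invented for illustration.

```python
# Sketch of the conservative alternate-buffer-pool (B2) rule described
# above: at least 95% of the table's activity must be reads, and the
# table must account for at least 2% of total DB activity. This is an
# illustration, not the actual advisor.p logic.

def b2_candidate(reads, updates, creates, deletes, total_db_ops):
    table_ops = reads + updates + creates + deletes
    if table_ops == 0 or total_db_ops == 0:
        return False  # no activity sampled yet; nothing to conclude
    mostly_read = reads / table_ops >= 0.95
    significant = table_ops / total_db_ops >= 0.02
    return mostly_read and significant

# Hot, read-only lookup table: qualifies.
print(b2_candidate(98_000, 0, 0, 0, 1_000_000))               # → True
# Busy but write-heavy table: does not.
print(b2_candidate(50_000, 40_000, 5_000, 5_000, 1_000_000))  # → False
```

With thresholds like these, a database whose IO is spread evenly across many tables (as in Cringer's output) would produce no B2 suggestions at all, which matches the behavior reported above.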
 

Cringer

ProgressTalk.com Moderator
Staff member
dbanalys is attached. It's from 11.5.
Flattered about the tuning comments. I think it's more a case of misadventure than tuning! I have a list as long as my arm of tuning I want to do!
 

Attachments

  • dbanalys.txt
    805.7 KB · Views: 3

TheMadDBA

Active Member
Thanks for the file. It looks like the table-partitioning changes to the dbanalys format broke my parse logic. I will fix it and upload it again. That should also correct the B2 issues, since record count is part of the calculation. Probably time to add some more safety checks/warnings, since it didn't output the warnings about not having the dbanalys stats.

Since your areas appear to be set up per RPB setting, I don't expect it will find many new suggestions for moving tables around.
 

Cringer

ProgressTalk.com Moderator
Staff member
Areas are set up per RPB but I do know that a lot of them are in the wrong RPB, wasting a lot of space in some cases. :(
 

TheMadDBA

Active Member
New version that should resolve the dbanalys parsing issue.

Also playing around with the style sheet a little. Graphic design just isn't my thing :)
 

Attachments

  • advisor.p
    110.2 KB · Views: 10