Should we write internal procedures just for code readability?

Hi All,

Need your inputs on this:

Should we break our code into internal procedures even when they will never be called more than once?
Is that good practice or not?
For the sake of readability, should we break code into internal procedures, or is it better just to add comments and indentation?
 

TheMadDBA

Active Member
Short answer: Yes.

Longer answer: internal procedures aren't just for reuse within the same program. IMO the clarity is much better than in a top-down program, plus it is much easier to identify the true buffer scope with internal procedures, and harder to accidentally change it (assuming you define local buffers in the procedures, which I recommend in most cases).
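A minimal sketch of that local-buffer idea (the Order table and all names here are hypothetical):

```abl
/* calcTotals.p - sketch only; "Order" and its fields are hypothetical */

RUN calculateOrderTotals.

PROCEDURE calculateOrderTotals:
    /* the local buffer keeps record and buffer scope inside this IP,
       so code elsewhere in the .p cannot accidentally widen it */
    DEFINE BUFFER bOrder FOR Order.

    FOR EACH bOrder NO-LOCK:
        /* ... totals logic ... */
    END.
END PROCEDURE.
```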

Also... make sure to add comments and indent your code even with internal procedures :)
 

GregTomkins

Active Member
One way to look at this is that by dividing code up and using properly named variables and IPs, comments become less necessary. E.g., "RUN calculate_order_totals" vs. /* This is where we calculate order totals. */.

Of course there is no guarantee that the names are accurate, but in my experience most comments are garbage ... but people tend to at least TRY to keep names accurate.

There is a whole book by Bob Martin called 'Clean Code', and this is one of its major themes.
 

TomBascom

Curmudgeon
What version of Progress are you coding with?

Unless it is (very) ancient, obsolete and unsupported you have OO techniques available to you as well. Breaking your code into classes, methods and properties could be an even better approach.
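As a sketch of what that could look like, a routine could become a method on a class with a property tracking state (class, table and member names below are all hypothetical):

```abl
/* OrderWriter.cls - minimal sketch; all names are hypothetical */
CLASS OrderWriter:

    DEFINE PUBLIC PROPERTY OrdersWritten AS INTEGER NO-UNDO
        GET.
        PRIVATE SET.

    METHOD PUBLIC VOID WriteOrder (INPUT piOrderNum AS INTEGER):
        /* buffer local to the method, just like in an internal procedure */
        DEFINE BUFFER bOrder FOR Order.

        FIND bOrder NO-LOCK WHERE bOrder.OrderNum = piOrderNum NO-ERROR.
        IF AVAILABLE bOrder THEN
            OrdersWritten = OrdersWritten + 1.
    END METHOD.

END CLASS.
```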
 

RealHeavyDude

Well-Known Member
Not long ago I had an apprentice who told me that no method (internal function or procedure, for that matter) should contain more than 10 lines of code. The code she produced did not sit well with me. IMHO, how you split up your code is all about readability, maintainability and performance.

For example: due to performance considerations, it does not make sense to split the code inside a FOR EACH or a query into separate functions, internal procedures or methods. You will wind up paying the call overhead on each iteration, which can be costly. But, on the other hand, there may be good reasons to do it anyway under other circumstances.
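To illustrate the trade-off (table and procedure names are hypothetical):

```abl
/* one RUN per record: the call overhead is paid on every iteration */
FOR EACH Order NO-LOCK:
    RUN writeOrderLine (BUFFER Order).
END.

/* the hot loop body kept inline: no per-iteration call cost */
FOR EACH Order NO-LOCK:
    /* ... line-writing logic ... */
END.

PROCEDURE writeOrderLine:
    DEFINE PARAMETER BUFFER bOrder FOR Order.
    /* ... line-writing logic ... */
END PROCEDURE.
```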

You see, there is no golden rule, and even if there were, nothing would stop you from producing bad code. You can produce bad code while following every rule.

But, as TheMadDBA already pointed out, there is one thing with which you can do yourself a big favor: isolate the code that manipulates database tables or temp-tables in internal functions, procedures or methods, and use named buffers. That way it is very easy to keep the transaction scope clean and small, and to ensure the buffer scope does not exceed the transaction scope, which keeps you from leaving share locks behind (and producing deadlocks in the process).
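A minimal sketch of that pattern, assuming a hypothetical Order table:

```abl
PROCEDURE shipOrder:
    DEFINE INPUT PARAMETER piOrderNum AS INTEGER NO-UNDO.

    /* named buffer, local to this internal procedure */
    DEFINE BUFFER bOrder FOR Order.

    /* strong-scope the buffer to the transaction block: the record
       and its lock cannot outlive the transaction or escape the IP */
    DO FOR bOrder TRANSACTION:
        FIND bOrder EXCLUSIVE-LOCK
             WHERE bOrder.OrderNum = piOrderNum NO-ERROR.
        IF AVAILABLE bOrder THEN
            bOrder.Status = "Shipped".
    END.
END PROCEDURE.
```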

Heavy Regards, RealHeavyDude.
 
Thanks everybody for the suggestions. What I understood is:

We can break code into internal procedures just for clean code and readability.
Transactions and buffers will be scoped to the internal procedure if we define the buffers inside it.

In my scenario, we already open all the required buffers at the beginning of the .p and write one XML document based on those buffers. In the middle there is around 50 lines of logic to write one particular tag. Is it good to move that piece into an internal procedure, while the rest of the tags are written in the main .p? It is not at all related to reusability.
If we move that logic into an internal procedure, those 50 lines will access all the main .p variables and buffers from inside the internal procedure.
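One way to keep the scoping clean in that scenario is to pass the already-positioned buffers (and, say, a SAX-WRITER handle) into the internal procedure as parameters, rather than letting the IP reach out to the main block. A hedged sketch with hypothetical names:

```abl
/* in the main .p, where the buffers are already positioned */
RUN writeDetailTag (BUFFER Order, hSAXWriter).

PROCEDURE writeDetailTag:
    /* the buffer parameter binds to the caller's buffer position */
    DEFINE PARAMETER BUFFER bOrder FOR Order.
    DEFINE INPUT PARAMETER phWriter AS HANDLE NO-UNDO.

    phWriter:START-ELEMENT("OrderDetail").
    phWriter:WRITE-CHARACTERS(STRING(bOrder.OrderNum)).
    phWriter:END-ELEMENT("OrderDetail").
END PROCEDURE.
```

This keeps the IP's dependencies explicit in its parameter list instead of hidden in shared state.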
 
What version of Progress are you coding with?

Unless it is (very) ancient, obsolete and unsupported you have OO techniques available to you as well. Breaking your code into classes, methods and properties could be an even better approach.
Thanks, Tom, for the reply.
We are using version 11.3, but not OO concepts, as it is all existing code around 30 years old. Almost all programs (97%) still follow the top-down (procedural) programming approach.
 

TheMadDBA

Active Member
OO is a pretty big step for most Progress/OpenEdge shops. Too many of the developers are barely getting by with the ABL concepts and don't know much if anything about OO. Management doesn't seem to want to spend the time and money to educate them.

Sad, but true.
 

TheMadDBA

Active Member
True, but getting management on board is still a challenge. My current location could benefit from OO in many ways, but probably only 3 or 4 programmers out of the 60 or so actually understand it.

They are reluctant to write any OO production code because they know they will be stuck supporting it forever.
 
OO is a pretty big step for most Progress/OpenEdge shops. Too many of the developers are barely getting by with the ABL concepts and don't know much if anything about OO. Management doesn't seem to want to spend the time and money to educate them.

Sad, but true.
100% agreed.
 

tamhas

ProgressTalk.com Sponsor
They are reluctant to write any OO production code because they know they will be stuck supporting it forever.

The irony, of course, is that the primary motivation behind OO was reduced maintenance costs. Taking bits of your application and encapsulating them in objects would *simplify* downstream maintenance, not complicate it.
 

TheMadDBA

Active Member
Again true, but some shops are still coding in V10/V11 like it was V6 because they have decided (poorly) that it makes life simpler for them. Even to the point where readkey/editing blocks are still a thing.
 

tamhas

ProgressTalk.com Sponsor
Understood. But "simpler" here is not really simpler over the life of the application. It is only simpler in that one merely copies what is there and doesn't have to learn anything. There are many things one can do, simply by shifting one's attitude toward making new code better, that will gradually improve the application and result in greater stability and maintainability going forward.

Case in point: a zillion years ago I created a super procedure to replace the massive blocks of shared variables in include files in my application. By simply having the menu system launch the super as well as populate the shared variables, I could approach any new or old code I was going to work on, rip out the includes, and sprinkle in references to the super as needed. It made the code far cleaner and easier to maintain going forward, since one no longer needed to worry about where all those shared variables were being set and used.

In many cases, the incremental effort to introduce such an improvement is trivial, and may even save work in the initial implementation because of the clean interfaces it introduces, making relationships much clearer.
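A minimal sketch of that super-procedure pattern (all names are hypothetical):

```abl
/* appState.p - holds what used to be shared variables */
DEFINE VARIABLE gcCurrentUser AS CHARACTER NO-UNDO.

PROCEDURE setCurrentUser:
    DEFINE INPUT PARAMETER pcUser AS CHARACTER NO-UNDO.
    gcCurrentUser = pcUser.
END PROCEDURE.

PROCEDURE getCurrentUser:
    DEFINE OUTPUT PARAMETER pcUser AS CHARACTER NO-UNDO.
    pcUser = gcCurrentUser.
END PROCEDURE.

/* launched once, e.g. by the menu system: */
DEFINE VARIABLE hState AS HANDLE NO-UNDO.
RUN appState.p PERSISTENT SET hState.
SESSION:ADD-SUPER-PROCEDURE(hState).

/* any procedure can then call the accessors with no includes
   and no shared-variable declarations:
   RUN getCurrentUser (OUTPUT cUser). */
```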
 

TheMadDBA

Active Member
Yes, I agree that most shops should be moving to more modern techniques and versions. That was never my argument, just pointing out the reality of a lot of Progress shops.

We all know that some companies just don't look at things the right way, hence all the people still on V8 and V9 running on ancient hardware without support.
 

tamhas

ProgressTalk.com Sponsor
There are many common mistakes which seem reasonable at the time to the companies that make them. A classic is to drop off maintenance because one has customized the source too heavily to take the VAR's newer versions, and the VAR doesn't support an old version of their code on a new version of Progress. Then, some years down the line, the hardware is old, tired and expensive to maintain, and it costs them a new license just to do a platform change.

There is also the classic pattern which stems from ABL's low total cost of ownership: instead of making people willing to invest some of the savings in improvements, it seems to make them cheapskates who hire the minimum staff to not quite keep up with the demand for changes, leaving no time for improvement.

What I wish I could get across to these companies is the false economy. Spending something on making the product better will actually make it cheaper to maintain over the long run, and will make it more stable and predictable, lessening the likelihood of expensive outages and errors. There are a lot of people who don't seem to want to hear the message, though.
 

TheMadDBA

Active Member
If you figure out a way to get the message across some day, please let us all know :)

Part of the problem is that the existing solution is often viewed as "good enough" for the company, and being proactive isn't usually taken into account for managers' bonuses, but keeping the budget low is.
 
I think if we are calling internal/external procedures only for code readability, and not taking advantage of the transaction/buffer scoping an internal procedure gives us, then we could use include files instead of procedure calls (to save the call overhead).

Thanks!
Rajat.
 