Help: Search Calling Programs of Include Files

Reynold

Member
Hello All,

Could anyone help me find the file names (in a particular directory) that use a particular include file?
I am using Progress 10.2B in a Unix environment.

E.g. I am in a directory, say /rd/src/.
This directory contains 100 .p files and 10 include files, 110 files in total.
Now I need to find, using Progress 4GL code, how many .p files call one include file, say abc.i, and what the names of those .p files are.

I know I can do it using the Unix grep command, but I want to do it in Progress 4GL code.

How can I achieve this with Progress 4GL coding? Please help.
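A minimal sketch of one way to do this in pure 4GL, assuming the include is referenced by its literal name (e.g. {abc.i}) in the source text: list the directory with INPUT FROM OS-DIR, then scan each .p line by line. Note this will not see includes pulled in indirectly through other include files; for that you need the XREF approach discussed below.

```abl
/* Sketch only: report .p files in one directory whose source
   mentions abc.i. Directory and include name are placeholders. */
DEFINE VARIABLE cDir  AS CHARACTER NO-UNDO INITIAL "/rd/src/".
DEFINE VARIABLE cInc  AS CHARACTER NO-UNDO INITIAL "abc.i".
DEFINE VARIABLE cFile AS CHARACTER NO-UNDO.
DEFINE VARIABLE cPath AS CHARACTER NO-UNDO.
DEFINE VARIABLE cAttr AS CHARACTER NO-UNDO.
DEFINE VARIABLE cLine AS CHARACTER NO-UNDO.
DEFINE VARIABLE iHits AS INTEGER   NO-UNDO.

DEFINE STREAM sDir.
DEFINE STREAM sPgm.

INPUT STREAM sDir FROM OS-DIR(cDir).
FileLoop:
REPEAT:
    IMPORT STREAM sDir cFile cPath cAttr.
    IF INDEX(cAttr, "F") = 0 THEN NEXT FileLoop.  /* plain files only */
    IF NOT cFile MATCHES "*.p" THEN NEXT FileLoop.

    INPUT STREAM sPgm FROM VALUE(cPath).
    ReadLines:
    REPEAT:
        IMPORT STREAM sPgm UNFORMATTED cLine.
        IF INDEX(cLine, cInc) > 0 THEN DO:
            iHits = iHits + 1.
            DISPLAY cFile FORMAT "x(40)".
            LEAVE ReadLines.      /* one hit per program is enough */
        END.
    END.
    INPUT STREAM sPgm CLOSE.
END.
INPUT STREAM sDir CLOSE.

MESSAGE iHits "programs reference" cInc VIEW-AS ALERT-BOX INFORMATION.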

Thanks.
 

mrobles

Member
Hi.

I made a little documentation system, and the way I did it is:
1.- Read the directory(s) where the programs reside.
2.- Compile the .w and .p programs with XREF: COMPILE VALUE(prog1) SAVE XREF VALUE(txt_file).
3.- Analyze each line of txt_file with Progress: if the line contains 'include', then the next word is the include file.
(My directory names do not contain the string 'include', so that test is safe for me.)
In the same way I analyze txt_file looking for 'RUN' or 'PROCEDURE'.
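A hedged sketch of steps 2 and 3 above. One detail that helps with disk space: COMPILE ... XREF accepts an APPEND option, so every program can write into a single XREF file rather than one file per program. The field layout assumed here (procedure, source file, line number, reference type, object) and the INCLUDES reference type follow the OpenEdge XREF documentation; verify against your 10.2B output. The paths shown are placeholders, and the compile of one program stands in for the OS-DIR loop of step 1.

```abl
/* Sketch: append XREF output for each compile into one file,
   then list the programs that pull in abc.i. */
DEFINE VARIABLE cXref AS CHARACTER NO-UNDO INITIAL "/tmp/all.xrf".
DEFINE VARIABLE cProc AS CHARACTER NO-UNDO.
DEFINE VARIABLE cSrc  AS CHARACTER NO-UNDO.
DEFINE VARIABLE iLine AS INTEGER   NO-UNDO.
DEFINE VARIABLE cRef  AS CHARACTER NO-UNDO.
DEFINE VARIABLE cObj  AS CHARACTER NO-UNDO.

/* in practice, run this inside the OS-DIR loop over all .p's */
COMPILE VALUE("/rd/src/prog1.p") XREF VALUE(cXref) APPEND.

INPUT FROM VALUE(cXref).
REPEAT:
    IMPORT cProc cSrc iLine cRef cObj.
    /* the object field may carry include arguments after the name */
    IF cRef = "INCLUDES" AND cObj BEGINS "abc.i" THEN
        DISPLAY cProc FORMAT "x(40)" cObj FORMAT "x(30)".
END.
INPUT CLOSE.
```

Because the compile resolves the whole include tree, this also catches abc.i when it is referenced from inside another include file, which a plain text scan would miss.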

MRobles
 

Reynold

Member
I don't think compiling each file and then searching each file's XREF output will be a good solution. I actually have 1,545 files in the directory, and saving an XREF file for every one of them would take a lot of space.

Could anyone suggest something other than that, with a code snippet? I will be really thankful to you.

I thought that, with so many senior Progress masters here, I would get a solution to this simple question.

Help please!
 

rzr

Member
Save your grep output to a file, load the file names into a temp-table (load only unique file names), and then search the temp-table for your file.
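A sketch of that approach, driving grep from the 4GL session with OS-COMMAND. The paths and file names are placeholders; grep -l prints only the names of matching files, one per line, so the load into the temp-table is simple, and the unique index keeps duplicates out.

```abl
/* Sketch: run grep from the session, load the matching program
   names into a temp-table, then report them. */
DEFINE TEMP-TABLE ttHit NO-UNDO
    FIELD cProg AS CHARACTER
    INDEX ixProg IS PRIMARY UNIQUE cProg.

DEFINE VARIABLE cName AS CHARACTER NO-UNDO.

/* -l lists only the names of files containing a match */
OS-COMMAND SILENT VALUE("grep -l 'abc.i' /rd/src/*.p > /tmp/hits.txt").

INPUT FROM "/tmp/hits.txt".
REPEAT:
    IMPORT UNFORMATTED cName.
    IF cName > "" AND NOT CAN-FIND(ttHit WHERE ttHit.cProg = cName) THEN DO:
        CREATE ttHit.
        ttHit.cProg = cName.
    END.
END.
INPUT CLOSE.

FOR EACH ttHit:
    DISPLAY ttHit.cProg FORMAT "x(60)".
END.
```

Like any text scan, this only finds literal occurrences of abc.i and will miss includes referenced through nested include files.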
 

Stefan

Well-Known Member
I don't think compiling each file and then searching each file's XREF output will be a good solution. I actually have 1,545 files in the directory, and saving an XREF file for every one of them would take a lot of space.

We compile over 6000 .p's every night with the XREF option and then read the results into a database, which is accessible via a simple WebSpeed interface. Yes, the database (covering five or so versions) is now about 750 MB, and the generated XREF file is over 1 GB. Is that really a lot of space? How much does an hour of searching for (and not finding) this kind of information cost?
 