Progress 9.1D legacy server - no developer licence - need to dump data to .csv

Tony Woodhouse

New Member
Dear All,

I'm sorry to ask what I know is a ridiculous question, but I'm hoping someone can help. I've inherited a very old legacy Progress 9.1D server running a database and various client applications. It has only a run-time licence installed (i.e. no developer licence), and the original supplier of the server, database and applications went out of business many years ago. The application hasn't been used for a long time, but there is an occasional need to query legacy data from the server.

I want to decommission this server for obvious reasons, but the business still needs access to some of the data. I've identified a few tables that contain all the necessary data and just want to dump them out to .csv or some similar format. I've spent a few very frustrating days with ChatGPT trying to learn how to use this server, but keep hitting problems I cannot overcome. There is an ODBC driver installed on the server, but it crashes on certain records where string values exceed the field size declared in the database schema. I've spent a day with ChatGPT trying various clever SQL commands to skip or truncate the offending records and fields, but nothing works: the ODBC driver always fetches the data and errors before any SQL that attempts to truncate or skip the broken records gets a chance to run.

I've then spent another frustrating day with ChatGPT learning how to write .p scripts that would extract the data from tables and dump it to .csv, but I cannot run any of these scripts, as my run-time only licence doesn't allow compiling .p scripts into .r executables. I can connect to the database with tools like the Data Dictionary, but cannot find any way to extract data from a table, or to compile and run a script.

Clearly one solution would be to buy a developer licence (although I'm not sure I'd know how to install it on my server, and I've no idea if you can even licence 9.1D any more). I really don't want to consider trying to upgrade to OpenEdge 11 etc. as that sounds like way too much hassle and expense; I just want to dump my data out of a few tables and get rid of this pesky server once and for all.

Does anyone have any suggestions? Are there third-party tools which can extract data from the ABL/4GL broker or which can compile a .p script for me? Do any of the standard Progress tools allow me to extract data without compiling scripts?

Best wishes,
Tony
 
A static dump can be knocked up in seconds if you have a dev license:

Code:
output to 'customers.d'.

for each customer no-lock:
   export customer.
end.

output close.

To be able to execute this without a dev license, you need someone with a dev license to compile it for you. But to compile it, that someone needs the database structure, which you can provide by dumping the data definitions (.df) with Data Administration.
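
Roughly, the dev-licensed side of that exchange might look like this - a sketch only, where "scratch", "customer.df" and "dump_customers.p" are placeholder names and prodict/load_df.p is the usual definition-load helper (check your own install):
Code:
proenv> prodb scratch empty
proenv> prowin32 -db scratch -1

/* then, from the Procedure Editor of that session: */
RUN prodict/load_df.p ("customer.df").  /* load the .df dumped at the run-time site */
COMPILE dump_customers.p SAVE.          /* writes dump_customers.r to send back     */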

A dump can also be achieved using dynamic code which does not need the database structure, but there is no dynamic export method, so it all gets a bit more tricky - see Progress Customer Community
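
For anyone curious, a minimal sketch of that dynamic approach - untested here, "customer" is just an example table, and it makes no attempt to escape delimiters or handle array fields:
Code:
DEFINE VARIABLE hBuffer AS HANDLE  NO-UNDO.
DEFINE VARIABLE hQuery  AS HANDLE  NO-UNDO.
DEFINE VARIABLE hField  AS HANDLE  NO-UNDO.
DEFINE VARIABLE i       AS INTEGER NO-UNDO.

CREATE BUFFER hBuffer FOR TABLE "customer".
CREATE QUERY  hQuery.
hQuery:SET-BUFFERS(hBuffer).
hQuery:QUERY-PREPARE("FOR EACH customer NO-LOCK").
hQuery:QUERY-OPEN().

OUTPUT TO "customer.csv".
REPEAT:
    hQuery:GET-NEXT().
    IF hQuery:QUERY-OFF-END THEN LEAVE.
    /* no dynamic EXPORT, so build each line field by field */
    DO i = 1 TO hBuffer:NUM-FIELDS:
        hField = hBuffer:BUFFER-FIELD(i).
        IF i > 1 THEN PUT UNFORMATTED ",".
        PUT UNFORMATTED STRING(hField:BUFFER-VALUE).
    END.
    PUT UNFORMATTED SKIP.
END.
OUTPUT CLOSE.

hQuery:QUERY-CLOSE().
DELETE OBJECT hQuery.
DELETE OBJECT hBuffer.
Only the table name is hard-coded here; in practice you would parameterise it and loop over whichever tables you need.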

Progress should be happy to sell you a 9.1D dev license.
 
Rather than wasting your time with ChatGPT, you might consider hiring a consultant who actually knows how to do all this stuff.
 
Is that an offer? ;-) What are your rates? I was kinda hoping someone with a 9.1D developer licence might kindly offer to compile my script for me lol?
 
What version of the operating system is it running on?
Can you install a recent evaluation version of OpenEdge? Then you can convert your database and dump the data using the new version.
Do you have a run-time license installed? Then you can dump and upload the .df files, and somebody can compile the dump programs for you for V9.
 
George, thanks for this.
It's running on Server 2012 R2.
I've requested an evaluation version of OpenEdge, I'll wait to see where that takes me.
I'm unsure how to generate the .df files which would be required for remote compilation of a .p script. I've got access to the Data Dictionary and Data Administration tools and can connect to my database. They are both locked down though due to only having a runtime rather than developer licence, so I can't do things like Admin > Dump data. I can run reports such as a "detailed table" report for individual tables I'm interested in, which lists the fields in the table etc. Is that a .df file, or do I need to be doing something different?
 
Start a session with the -rx option and Data Administration will allow you to dump data definition files.

A run-time license allows you to compile xcoded sources. "Encrypted Compiler Mode" = "the -rx client startup parameter". Do you have xcode.exe in the DLC directory?

Update: I could be wrong, but xcode seems to be freely licensed now, and the utility is consistent between versions. I guess it's legal to share the xcode executable.
 
George,

Thanks again, your response is extremely helpful :-)

I don't have xcode.exe in my DLC folder. However, starting Data Administration with the -rx command line switch does enable the production of a .df file for the tables I'm interested in. As you suggest, it does change the behaviour of the compiler from "I won't compile anything" to "I can only compile encrypted programs". Presumably xcode.exe, if I had it, would encrypt my .p file into a .x file that I could compile with the Application Compiler tool running in -rx mode?

Many thanks for your support,
Tony
 
Presumably xcode.exe, if I had it, would encrypt my .p file into a .x file
Exactly.

xcode is a part of any development version of OpenEdge. I guess the evaluation versions include it as well.
The size of xcode.exe in a few versions:
Code:
10.2B  47,456
11.7   54,472
12.2   54,984
12.8   57,368
It's not exactly the same but I still believe the xcode.exe works for all versions.
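
For reference, the invocation is along these lines (from memory, so double-check the switches against the documentation that ships with whichever version provides your xcode.exe; the default encryption key is used unless you pass -k):
Code:
proenv> xcode -d encrypted export-csv.p
That should drop an encrypted copy of export-csv.p into an "encrypted" directory, which a -rx session can then feed to the Application Compiler.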
 
I dusted off an old version of DataDigger and got it running in v9. Don't ask why I keep a copy of v9 :)
Check the repo at GitHub - patrickTingen/DataDiggerV9: DataDigger for Progress v9 or simply download it here
It contains both source files and .r files for v9.1D

Extract the zip to c:\temp\dd9 and create a shortcut to your prowin32.exe.
Provide the database name in the shortcut with -db etc. and add "-pf datadigger.pf" (check that file for additional settings).
With a little bit of luck an old version of DD will start, looking like this:

[screenshot: the DataDigger v9 main window]

Select a table and choose the "Dump" button below. The csv option is not yet in this version, but you can export to Excel.
Be prepared to encounter some errors here and there; if you need any more help, just let me know.
 
I dusted off an old version of DataDigger and got it running in v9. Don't ask why I keep a copy of v9 :) ....
Patrick, Thank you very much for this!

I was able to launch DataDigger on my environment with the following command in a Proenv window:
Code:
prowin32 -db [database name] -H localhost -S [port] -rx -pf c:\temp\dd9\datadigger.pf
The tool is extremely helpful, and although I had some issues (I didn't have Excel installed for the Excel dump), I was able to investigate my data and dump it to .d files. I'm sure if I had persevered and got Excel installed, I could have done that too.
 
I've been able to successfully export all the data I needed from my legacy Progress 9.1D database, which has a run-time only licence (i.e. no developer licence installed). Many thanks to George Potemkin, Patrick Tingen and Darker at the Progress Tools website for their support. I've written this reply to set out all the steps required for a complete novice (like me) to follow suit if they have the same problem. This approach is based on the standard Progress tools, and was worked out before Patrick posted links to his 9.1D-compatible DataDigger tool.

With the help of ChatGPT I was able to craft a .p script which will dump a table to .csv (attached as "export-csv(not encoded).p"). The difficulty was compiling this without a developer licence.

A few options here are to buy a developer licence, or to find a friendly Progress dev (whom you trust) who is able to use the Progress xcode utility to encrypt the script. My encrypted script (export-csv.p) is attached too. It does seem that the xcode utility from later versions of OpenEdge generates encrypted scripts which are still compatible with v9.1D. It should be noted that xcode is part of the OpenEdge development kit and is not freely distributable software. If you do find it installed on your server, however, you can use it on the original source code, without having to trust that my encrypted version is what it claims to be lol. An alternative, for the more cautious, would be to compile my encrypted script and then use the Progress Decompiler service on the Progress Tools website to check that it really is what it claims to be. :)

Once you have an encrypted version of the script, it can be compiled without a developer licence. First launch prowin32, connected to your database, with the -rx flag to enable features such as compiling encrypted scripts and dumping table definition (.df) files. Running the following from a Proenv window works for me:
Code:
prowin32 -db [databasename] -H localhost -S [port] -rx -p _edit.p
From the Procedure Editor that this launches, there are various tools which are super useful for investigating your data:
  • Tools > Data Dictionary - allows you to view tables and fields.
    • Database > Reports > Detailed Table - dumps a text description of a table
    • Database > Reports > Table relations > Selected table - allows you to see how tables are joined
  • Tools > Data Administration - allows you to dump .df table definition files
    • Admin > Dump Data and Definitions - creates a .df file for a specific table, or for all tables. The .df lists information about the table, including a plain-English name and description of the table's contents (assuming the original DBA labelled them all nicely), plus all the fields in the table with plain-English names and descriptions too.
  • Tools > Application Compiler - allows compilation of encrypted .p scripts into executable .r run-time files.
I used the Application Compiler to compile my encrypted script and save the .r file to disk. The compiled script can be invoked with:
Code:
prowin32 -db [database name] -H localhost -S [port] -p c:\temp\export-csv.r -param "[tablename],c:\temp\[output-filename].csv"
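
For anyone rolling their own script rather than using my attachment: the string passed with -param is available inside the procedure as SESSION:PARAMETER, so the table name and output path can be peeled off with ENTRY(). A tiny sketch (the variable names are mine, not from the attached script):
Code:
DEFINE VARIABLE cTable AS CHARACTER NO-UNDO.
DEFINE VARIABLE cFile  AS CHARACTER NO-UNDO.

/* -param "customer,c:\temp\customer.csv" arrives as a single comma-separated string */
ASSIGN
    cTable = ENTRY(1, SESSION:PARAMETER)
    cFile  = ENTRY(2, SESSION:PARAMETER).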

One other tip: the table definition (.df) files that can be created in the Data Administration tool (when running with the -rx switch) can themselves be converted to .csv files. I found that uploading the file to either ChatGPT or Copilot with a prompt like "Hi there, I've uploaded a Progress 9.1D .df table definition file, can you convert it into a .csv file please with one row for each field in the table and columns for Field Name, Label, Description, Help and Format please?" yielded positive results. The output was extremely helpful for understanding the data I had exported and for labelling all the columns in the Excel files I was creating from the .csv files.
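
As an aside, the same field documentation can be pulled straight out of the database's own metaschema (_File / _Field) instead of the .df, if you can get a script like the sketch below compiled - it has the same licensing caveat as everything else in this thread, and "customer" is just an example table name:
Code:
OUTPUT TO "c:\temp\fields.csv".
PUT UNFORMATTED "Field Name,Label,Description,Help,Format" SKIP.
FOR EACH _File NO-LOCK WHERE _File._File-Name = "customer",
    EACH _Field NO-LOCK WHERE _Field._File-recid = RECID(_File):
    EXPORT DELIMITER ","
        _Field._Field-Name
        _Field._Label
        _Field._Desc
        _Field._Help
        _Field._Format.
END.
OUTPUT CLOSE.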

One final point: my script is not perfect. I found a few corrupt lines of data in the .csv files, which I suspect are caused by records with quote marks or other special characters in text fields. For me it wasn't a huge problem - just a few records out of the millions I exported.
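
If those few records matter to you, the usual fix is to wrap any troublesome field in quotes and double up embedded quote marks before writing it out (RFC 4180 style). Something along these lines - a sketch, not lifted from the attached script:
Code:
FUNCTION csvQuote RETURNS CHARACTER (INPUT cValue AS CHARACTER):
    /* quote the value if it contains a comma, a quote mark or a line break,
       doubling any embedded quote marks */
    IF INDEX(cValue, ",") > 0 OR INDEX(cValue, '"') > 0
       OR INDEX(cValue, CHR(10)) > 0 OR INDEX(cValue, CHR(13)) > 0 THEN
        RETURN '"' + REPLACE(cValue, '"', '""') + '"'.
    RETURN cValue.
END FUNCTION.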

Apologies for the length of this message; I hope one day it helps someone else with a legacy 9.1D database they need to extract data from. If you're prepared to trust my attached encrypted script, you should have everything you need to dump tables to .csv files.

Best wishes,
Tony
 
