Dumping data into CSV format (urgent) ...

vinsym

New Member
Hi guys,

I'm quite new to Progress. I've been assigned the task of extracting data from an application that runs on Progress 7.3B (I presume). The Progress database licenses that I see on the server hosting this application are as follows:

- Network DB Server (7.3B)
- Client Networking (7.3E) &
- Query/RESULTS (7.3E)


Is there some way I could dump the data into CSV files?

Thanks in advance
Vinod
 

sekarv018

New Member
Hi,

You can use the EXPORT statement to dump the records in CSV format:

def stream s1.
output stream s1 to "pathname/output.tpe".

/* "tablename" is a placeholder -- substitute the table you want to export */
for each tablename no-lock:
  export stream s1 delimiter "," tablename.
end.
output stream s1 close.

Here pathname is the directory you want the output written to.

Please note I have used output.tpe as the output file name. You can change it depending on the number of records you want to export; if there are fewer than 60k you can very well use a .csv extension itself.
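
For what it's worth, EXPORT writes one line per record, separating the fields with the delimiter and quoting character fields, so a row comes out looking roughly like this (the values here are made up):

Code:
1,"Lift Tours","276 North Drive",12000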

Regards
Sekar
 

vinsym

New Member
Sekar,

That doesn't seem to work. If I copy/paste the code snippet you sent into the procedure editor and try to execute it, I get the following message:

"This version of PROGRESS compiles only encrypted programs. (1086)"

Not sure how this should be handled ...

By the way, this is the command I used to invoke the editor ...

_prowin.exe -rx -p _edit.pc


Thanks
Vinod
 

TomBascom

Curmudgeon
The -rx parameter is causing the message about encrypted programs.

To start Progress you should probably just use _prowin with no arguments.

You have a QUERY license so you should be able to compile and run data extract programs. You should also be able to use the data dictionary to dump data.

I don't know why 60k would matter -- the program that sekarv018 posted will output any size file. The only limitation is any OS limit (like 2GB).
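
If just running _prowin doesn't get you to an editor, something like the original command with -rx removed should also do; and if the database is served from another machine you will need the usual client connection parameters as well (-db, -H and -S -- the names below are placeholders):

Code:
_prowin.exe -p _edit.pc -db mydb -H dbhost -S mydbsvc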
 

vinsym

New Member
Wow, that worked :) Thanks guys. Adding on to what Sekar says: the 60k figure makes sense when we have more than 64k rows extracted to CSV and need MS Excel to open the file - that wouldn't work. Alright, I can live with that - I'll be using Notepad to look at the data extract.

Further, how do we write a generic script to extract data from all tables present in a database? I did happen to look at a couple of examples, but they don't seem to work straight away.

Thanks in advance
V
 

vinsym

New Member
Alrighty, I got one bit of code working and now I can extract data to text files.

Going a step further, is it possible to create a data dump file using the query tool so that I could create another database in the latest version of OpenEdge and then port it directly to SQL Server?

One more thing: if I go into the Data Administration tool, the option to dump the definitions is enabled, but the option to dump the data is disabled.

Any leads?

Thanks in advance for all the help @progresstalk - you guys are wonderful.

V
 

TomBascom

Curmudgeon
If you're going to load the .df file into OpenEdge, why would you load the data into SQL Server?

Progress and OpenEdge expect ASCII dump files to be space delimited. But other than that it would work fine.

Dumping data should work with the Query license. Maybe it's a problem with 7.3 -- I don't know; it's been a while and I don't have it handy anymore. More likely you just started with -rx again.

But... if you're going to dump from Progress 7.3 and load into OpenEdge just do a binary dump. That bypasses such silliness. Or use the utilities provided with OpenEdge to convert the database. (Look in $DLC/bin/83dbutils and $DLC/bin/91dbutils for the *convXY utilities.) (Practice on a copy of the db...)
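
Roughly, a binary dump and load looks like this (the database names, table name and directory are placeholders, and the exact proutil options should be double-checked against the versions you actually have):

Code:
# dump the customer table from the old database into /tmp/dumps
proutil olddb -C dump customer /tmp/dumps

# load it into the new database and rebuild the indexes
proutil newdb -C load /tmp/dumps/customer.bd
proutil newdb -C idxbuild all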
 

vinsym

New Member
Thanks for the reply, Tom. OK - back to the beginning - the database in question is about 10 years old and long forgotten, but now the requirement is to do some BI analysis on this 10-year-old data. I myself am an Oracle pro, and this task came my way because it became the top priority and had to be done immediately. The company that sold this application does not exist anymore, which made things worse.

The target was to dump the complete thing into CSV, but after dumping I see some tables are around 600 MB, which makes them impossible to open in normal tools. So the next option was to port them to an evaluation version of OpenEdge and then port them to SQL Server, for which I have ample licenses. Once ported to SQL Server, the database has to be shipped to another geographical area - so that is the target. Bit tied up over here - all leads are welcome.

Thanks
V
 

TomBascom

Curmudgeon
I think I'd just load the CSVs into SQL Server.

Of course maybe I'm being optimistic thinking that SQL Server can read CSV files :rolleyes: After all, I'm a Progress guy ;)
 

vinsym

New Member
Yes Tom, that would work :) but I might have to start by creating around 200 dummy tables in SQL Server before I could load them :) that's a bit of work :) And the dumps that I've extracted so far don't have the column headers in place - that's another thing that has to be sorted out :) Phew - that's a lot of work :)

Thanks Tom

V
 

TomBascom

Curmudgeon
Column headers:

Code:
/* write one .hdr file per visible table, containing a comma separated list of its field names */
for each _file no-lock where not _hidden:
  output to value( _dump-name + ".hdr" ).
  for each _field no-lock of _file break by _order:
    if not last( _order ) then
      put unformatted _field-name + ", ".
    else
      put unformatted _field-name skip.
  end.
  output close.
end.
 

vinsym

New Member
Haven't been able to create a dump file with the field labels or names and the data in it. Phew :( Any clues?

Thanks V
 

TomBascom

Curmudgeon
Code:
find _file no-lock where _file-name = "Customer":
output to value( _dump-name + ".hdr" ).
for each _field no-lock of _file break by _order:
  if not last( _order ) then
    put unformatted _field-name + ", ".
  else
    put unformatted _field-name skip.
end.
for each customer no-lock:
  export customer delimiter ",".
end.
output close.

Tested in Firefox :rolleyes:
 

vinsym

New Member
I have a feeling that I am bothering you a lot, Tom :)

** Line 1. Colon followed by white space terminates a statement. (199)

What does that mean? I'm getting this error when I execute the code in the procedure editor!

Thanks
V
 

vinsym

New Member
Thanks for the reply, Ram. That works.

Just one final thing - how do I use this code to dump all tables in a database - I mean, how do I generalize this code?

Thanks in Advance
V
 

TomBascom

Curmudgeon
You can write a program to write the programs or you can use "compile on the fly" and pass the table name to a generic program.

Those are your only options with such an ancient version of Progress.
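
In a release this old the two approaches boil down to the same trick: walk the _file metaschema, generate a small one-table export program, and RUN it (which compiles it on the fly). A rough sketch -- the file names and output location are illustrative only, and it assumes the session is allowed to compile .p files:

Code:
define variable progName as character no-undo.

for each _file no-lock where not _file._hidden:
  progName = _file._dump-name + "_exp.p".

  /* generate a one-table export program... */
  output to value( progName ).
  put unformatted
    "output to value( " quoter( _file._dump-name + ".csv" ) " )." skip
    "for each " _file._file-name " no-lock:" skip
    "  export delimiter " quoter( "," ) " " _file._file-name "." skip
    "end." skip
    "output close." skip.
  output close.

  /* ...then run it, which compiles it on the fly */
  run value( progName ).
end.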
 

TomBascom

Curmudgeon
vinsym said:
I have a feeling that I am bothering you a lot, Tom :)

No, if you bothered me a lot I wouldn't be responding.

vinsym said:
** Line 1. Colon followed by white space terminates a statement. (199)

What does that mean? I'm getting this error when I execute the code in the procedure editor!

Thanks
V

Oops :blush: It means that Firefox isn't a very good Progress 4GL testing tool.

I see the issue has already been resolved though. So I'll not bother to say that you just need to change the ":" to a ".". :awink:
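
(For anyone reading along, the corrected first line is simply:)

Code:
find _file no-lock where _file-name = "Customer".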
 

Aigy

New Member
Hello,
I need to do the same task: export all tables to CSV (separate files with the table name as the file name), with the column names as a header.
I managed to do it with one table (using the code posted earlier), but I'm not able to generalize the code.

The problem is that I am not able to pass the table name as a string value to the "export stream s1 delimiter "," customer" statement.

Help would be very much appreciated! Thanks in advance.

PS: I am using OpenEdge 10.1A.
 