Trying to dump database to csv file

zbryanz

New Member
Hi folks,
I am Bryan. I am a systems administrator with little
database knowledge beyond MS Access, so please forgive
my ignorance.

My customer is switching from Intergy to another practice
management software. I need to dump her data to csv file.
There are a few gigs of data. I have been reading your other threads
and have tried a few of the items on the list. I went to data administration and connected to the database. The admin
menu at the top is greyed out. I tried going to another tool
and it said something to the effect: due to my database version,
it will be read only. I found the database and the bin directory. I have
successfully proshut the database. I just need to dump it, nothing fancy.

I would like it in a readable format since she will be without a replacement database for a couple months.

Oh yeah, if someone could also share with me the procedure to restart
the database without rebooting, that would be wonderful.

I do not know the logical name of the database. Not even sure that is important, or how to find it.

Progress version 9.1D

Ok, I started the data admin tool with the -rx option. I am still at a loss on how to get a csv file from all the data.
I need table info and links maintained if possible.

Thanks for your help,
Bryan
 

joey.jeremiah

ProgressTalk Moderator
Staff member
hope this helps. i don't have 9.1d installed on this machine, so it's not tested.


after unzipping the files

first you'll need to compile. it's just one file, dump.p. the file is xcoded so
you'll be able to compile even if you don't have a dev license.

you'll find the unencrypted source in the src directory, in case you're
wondering. i promise it won't format your hard drive :)


to compile, run proenv from your progress menu, cd to the unzipped files dir
and type the following

bpro -p compile.p -rx

to run the proc and spew all your data, type

bpro -p dump.p -db c:\myddb

and just watch the csv files pile up
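for the curious, the core of dump.p is presumably something like this. a hypothetical sketch, not the attached file's actual source (hQry, hBuf, hFld and the loop structure are all made up):

```
/* sketch only: loop over the schema and export each table to <table>.csv */
DEFINE VARIABLE hQry AS HANDLE  NO-UNDO.
DEFINE VARIABLE hBuf AS HANDLE  NO-UNDO.
DEFINE VARIABLE hFld AS HANDLE  NO-UNDO.
DEFINE VARIABLE i    AS INTEGER NO-UNDO.
DEFINE VARIABLE c    AS INTEGER NO-UNDO.

FOR EACH _File NO-LOCK WHERE NOT _File._Hidden:
    OUTPUT TO VALUE(_File._File-Name + ".csv").

    /* header row: the field names */
    c = 0.
    FOR EACH _Field OF _File NO-LOCK BY _Field._Order:
        c = c + 1.
        PUT UNFORMATTED (IF c > 1 THEN "," ELSE "") QUOTER(_Field._Field-Name).
    END.
    PUT UNFORMATTED SKIP.

    /* data rows, via a dynamic query on the table */
    CREATE BUFFER hBuf FOR TABLE _File._File-Name.
    CREATE QUERY hQry.
    hQry:SET-BUFFERS(hBuf).
    hQry:QUERY-PREPARE("FOR EACH " + _File._File-Name + " NO-LOCK").
    hQry:QUERY-OPEN().
    DO WHILE hQry:GET-NEXT():
        DO i = 1 TO hBuf:NUM-FIELDS:
            hFld = hBuf:BUFFER-FIELD(i).
            PUT UNFORMATTED (IF i > 1 THEN "," ELSE "") QUOTER(STRING(hFld:BUFFER-VALUE)).
        END.
        PUT UNFORMATTED SKIP.
    END.
    hQry:QUERY-CLOSE().
    DELETE OBJECT hQry.
    DELETE OBJECT hBuf.
    OUTPUT CLOSE.
END.
```

QUOTER() wraps each value in quotes and doubles any embedded quote characters, which keeps the csv parseable.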
 

Attachments

  • dump.zip
    4.7 KB · Views: 112

zbryanz

New Member
zip file corrupted

Joey,

I appreciate your help. The zip file is corrupted. I tried pkzipfix, but
was unable to repair it. Can you resubmit the file?

Thanks,
Bryan
 

zbryanz

New Member
Where does this put the csv files?

Joey,

In the above procedure, I do not see an option
to direct the output to a specific location.

I need to dump to a different drive due to space limitations.

Am I missing something?
 

joey.jeremiah

ProgressTalk Moderator
Staff member
zbryanz said:
Joey,

In the above procedure, I do not see an option
to direct the output to a specific location.

I need to dump to a different drive due to space limitations.

Am I missing something?

my mistake, i may have oversimplified. when i get home i'll make some
changes.

for now, after you've compiled (did it work?), cd to the desired drive and dir
and specify the full path to the dump.p proc


enter the command prompt with proenv (it basically sets some environment
vars like dlc etc.), change to the target drive and directory, and type

z:\dump> bpro -p c:\progress\wrk\dump.p -db x:\mydb


i'd try it first on sports2000 or some other small database. it's hard to debug
if every try takes too long.

btw what os are you working on?
 

zbryanz

New Member
error

Joey,

I tried to run it against a demo database and it returned this error:
You are not allowed to connect to a database in $DLC. (1379)

Never mind.. I looked up the error. I had to copy the database out of the
$DLC directory..

I cannot try their running database until this evening because
they are using it.

This solution worked great for me Joey. I appreciate your help..
It even dumped the field names at the top of each file! You are awesome!


Thanks,
Bryan
 

joey.jeremiah

ProgressTalk Moderator
Staff member
zbryanz said:
Joey,
I tried to run it against a demo database and it returned this error:
You are not allowed to connect to a database in $DLC. (1379)
yep. you'll need to make a copy to some other directory. try this,
copy %dlc%\sports2000*.* c:\progress\wrk

so i understand the proc compiled under v9, right?


zbryanz said:
I cannot try their running database until this evening because
they are using it.
the proc can run while the database is online, after all it's just a query.
but there will be some performance impact


use mbpro instead of bpro with the dump proc. bpro equals batch client
and mbpro equals multiuser batch client

actually the only difference is one small parameter: with or without -1,
single-user mode

maybe also add a little -Bp 50 to the command line. it basically
prevents the proc from monopolizing shared memory


something like this -
z:\dump> mbpro -p c:\progress\wrk\dump.p -db x:\mydb -Bp 50

i presume (hope) you're running it on the server, the same machine
the database is on (for performance reasons)

but if you must run it from a remote client you'll also need the -H -S
network connection parameters your clients use


zbryanz said:
If I understand you correctly, it dumps to whatever directory
I run the dump.p from?

Win2k server
Thanks,
Bryan
yes. it dumps into the directory you're currently in at the command prompt.
try it with sports2000.
 

joey.jeremiah

ProgressTalk Moderator
Staff member
zbryanz said:
This solution worked great for me Joey. I appreciate your help..
It even dumped the field names at the top of each file! You are awesome!


Thanks,
Bryan

:blush:


if you ever need to do something bigger, think kindly of me

i'm cheap and quick (that's what my girlfriend says)
 

zbryanz

New Member
Payment

Joey,

I will most likely need your services.
I am going to need to import the data back into
another EHR program. Maybe a free one.

Did the process you wrote also dump hidden tables?

I would be glad to toss you $100 for the time you've already spent.
I am just starting out in business and do not have much to offer
yet, but I'm sure when it comes time to reload, I will need you more
extensively. I guess I'm trying to say; don't be insulted at the $100 offer, your help was infinately more valuable to me than that. That is just what I can afford right now.

Send me some info and I'll cut you a check.

Thanks,
Bryan
 

joey.jeremiah

ProgressTalk Moderator
Staff member
if we'll meet up, buy me a beer and we'll call it even :)


i've made a few more changes, had another look at the proc today and
fixed a glitch in the array export routine

i've used the -T param to specify the target drive and dir (note: the
params are case-sensitive)

and -param to specify the file query where clause, the default is now
"where not _hidden or _file-name = '_user'"

for example -
bpro -p dump.p -db x:\mydb -T z:\dump -param "where _file-name = 'item'"


here's a short recap

to compile -
bpro -p compile.p -rx

dump while database is offline
bpro -p dump.p -db x:\mydb -T z:\dump

dump while database is online
mbpro -p dump.p -db x:\mydb -T z:\dump -Bp 50


xml, pdf, office or open office docs with office xml schemas or odf etc.
are also doable.

it's been fun, hope to talk to you again
 

Attachments

  • dump_final.zip
    5.2 KB · Views: 94

zbryanz

New Member
Data tainted?

Joey,

When you say there was a glitch, does that mean my previous
dump was not good? Please elaborate.

What does the change you made do?
"where not _hidden or _file-name = '_user'"

Does that dump hidden also?

What does the _user mean?
The "where not _hidden" part concerns me. Does that mean no
hidden tables are dumped?

Thanks,
Bryan
 

joey.jeremiah

ProgressTalk Moderator
Staff member
zbryanz said:
Joey,

When you say there was a glitch, does that mean my previous
dump was not good? Please elaborate.

the code that dumps field values (just three lines) dumped arrays without
comma separators

something like this -
1 2 3 instead of 1,2,3


you can run the dump while the database is online and users are working.

there will be a reasonable performance impact. it won't bring the database to a
stop, especially if you're dumping to a separate drive

how long did the dump take? 15, 30 minutes, maybe an hour?

of course there could be a big difference in performance with databases that
haven't been tuned in a while

zbryanz said:
What does the change you made do?
"where not _hidden or _file-name = '_user'"



the "where clause" on the db tables query (called files in the schema)
lets you specify a filter, say if you only wanted to dump a specific table

for example -
"where _file-name = '<put table name here>' "

zbryanz said:
Does that dump hidden also?

What does the _user mean?
The "where not _hidden" part concerns me. Does that mean no
hidden tables are dumped?

Thanks,
Bryan

the database system tables (like schema, status info etc.) are hidden (the _hidden
field in _file) and the _user system table holds user account info

if you do have hidden tables you might want to try this instead, just to be sure
"where _file-num > 0 and _file-num < 32767 or _file-name = '_user' "

which means only your application tables, not the system tables, plus the user accounts
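the -param filter presumably just gets spliced into a dynamic query over _File. a hypothetical sketch of the plumbing (hQry, hBuf and the SESSION:PARAMETER handling are assumptions, not the attached proc's actual code):

```
/* sketch: apply the -param where clause to the schema query */
DEFINE VARIABLE cWhere AS CHARACTER NO-UNDO.
DEFINE VARIABLE hQry   AS HANDLE    NO-UNDO.
DEFINE VARIABLE hBuf   AS HANDLE    NO-UNDO.

cWhere = SESSION:PARAMETER.   /* whatever was passed with -param */
IF cWhere = "" OR cWhere = ? THEN
    cWhere = "where not _hidden or _file-name = '_user'".

CREATE BUFFER hBuf FOR TABLE "_File".
CREATE QUERY hQry.
hQry:SET-BUFFERS(hBuf).
hQry:QUERY-PREPARE("for each _File no-lock " + cWhere).
hQry:QUERY-OPEN().
/* ...then dump each table hBuf iterates over... */
```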
 

zbryanz

New Member
Joey,

Thanks, I will try that tonight.

I was looking through the csv files and came across this...

"FileArchiveSID","ExtentCount","RawData","TimeStamp","LastUser","CreateStamp","CreateUser"
1,1,
Incompatible datatypes found during runtime conversion. (5729)
** Unable to evaluate expression for PUT statement. (564)
4,1,
Incompatible datatypes found during runtime conversion. (5729)
** Unable to evaluate expression for PUT statement. (564)
6,1,
Incompatible datatypes found during runtime conversion. (5729)
** Unable to evaluate expression for PUT statement. (564)
8,1,
Incompatible datatypes found during runtime conversion. (5729)
** Unable to evaluate expression for PUT statement. (564)
10,1,

What does this mean?

Thanks,
Bryan
 

joey.jeremiah

ProgressTalk Moderator
Staff member
zbryanz said:
Joey,

Thanks, I will try that tonight.

I was looking through the csv files and came across this...

"FileArchiveSID","ExtentCount","RawData","TimeStamp","LastUser","CreateStamp","CreateUser"
1,1,
Incompatible datatypes found during runtime conversion. (5729)
** Unable to evaluate expression for PUT statement. (564)
4,1,
Incompatible datatypes found during runtime conversion. (5729)
** Unable to evaluate expression for PUT statement. (564)
6,1,
Incompatible datatypes found during runtime conversion. (5729)
** Unable to evaluate expression for PUT statement. (564)
8,1,
Incompatible datatypes found during runtime conversion. (5729)
** Unable to evaluate expression for PUT statement. (564)
10,1,

What does this mean?

Thanks,
Bryan


ouch.


raw data is just unconverted data (bytes), for example, it can be used to
store binary data.

in this case, it looks like it's used to store a record.

the proc now converts raw and other funny data types to strings.
so binary data is converted to base64.
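the conversion is presumably along these lines. note that BASE64-ENCODE only exists in later OpenEdge releases (10.0+), so on 9.1d the proc would have to roll its own encoder; hFld here is a stand-in for the RAW buffer-field handle:

```
/* sketch: turn a RAW field value into csv-safe base64 text */
DEFINE VARIABLE rData AS RAW       NO-UNDO.
DEFINE VARIABLE mData AS MEMPTR    NO-UNDO.
DEFINE VARIABLE cText AS CHARACTER NO-UNDO.

rData = hFld:BUFFER-VALUE.            /* the raw bytes */
SET-SIZE(mData) = LENGTH(rData).
PUT-BYTES(mData, 1) = rData.
cText = BASE64-ENCODE(mData).         /* printable, comma-safe */
SET-SIZE(mData) = 0.                  /* free the memptr */
```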


btw don't just throw away your database. for one, you'll still need it for the
schema.

you've also got to have backups (i'm crazy about backups). you can just
shut down the database and make one or two dvd copies of everything.
 

Attachments

  • dump.zip
    5.8 KB · Views: 71

yzg

New Member
I tried the dump as explained here and it worked fine when I did it on the database server.
Now, however, I need to do this from a machine on the network that is not the database server. I tried adding the -H and -S parameters and it didn't work.

Here is what I tried:
mbpro medman -p dump.p -Bp 50 -H 192.168.20.2 -S Intergy-DB -param "where _file-name = 'Practice'"

Here are the results:
_file-num must be a quoted constant or an unabbreviated, unambiguous buffer/field reference for buffers known to query . (7328)
QUERY-OPEN for query requires a previous QUERY-PREPARE. (7312)
Cannot run GET methods on query until it is opened. (7313)


Can someone give me some pointers?
 

bisontech

New Member
Re: Trying to dump database to xml?

I am looking to export data from Intergy. Do they have an export to xml function?
 

cj_brandt

Active Member
Does your site use the Practice Analytics software? That process dumps most of the data from Intergy and puts it into csv files. That would make your task a bit easier.
 
Here is code to export any table to xml:
Code:
GetTabledata.p

define input parameter TableName as character.
define input parameter filtertext as character.
define input parameter sortby as character.

define output parameter TABLE-HANDLE TableData.

define variable i as integer.
define variable j as integer.
define variable fieldcount as integer.
DEFINE variable hQuery           AS HANDLE NO-UNDO.
DEFINE variable hBuffer          AS HANDLE NO-UNDO.
DEFINE VARIABLE hBufferField     AS HANDLE NO-UNDO.
DEFINE variable hQueryData       AS HANDLE NO-UNDO.
DEFINE variable hBufferData      AS HANDLE NO-UNDO.
DEFINE VARIABLE hBufferFieldData AS HANDLE NO-UNDO.

define variable TableNameFull as character.
TableNameFull = "system." + TableName.

define variable sqlquery as character.
sqlquery = "FOR EACH " + TableNameFull + " NO-LOCK".
filtertext = trim(filtertext).
if filtertext <> "" then
  sqlquery = sqlquery + " where " + filtertext.
if trim(sortby) <> "" then
  sqlquery = sqlquery + " by " + TableNameFull + "." + sortby.

find first system._File where system._File._File-Name = TableName NO-LOCK NO-ERROR.
if not available system._File then RETURN.

create temp-table TableData.

fieldcount = 0.
/* mirror the source table's fields, including array extents */
FOR EACH system._Field OF system._File NO-LOCK BY system._Field._Order:
  TableData:ADD-NEW-FIELD (system._Field._Field-name, system._Field._Data-Type, system._Field._Extent).
  fieldcount = fieldcount + 1.
END.
TableData:TEMP-TABLE-PREPARE ("TableData").

/* the temp-table's default buffer is all we need for output */
hBuffer = TableData:DEFAULT-BUFFER-HANDLE.

CREATE QUERY hQueryData.
CREATE BUFFER hBufferData FOR TABLE TableNameFull.
hQueryData:SET-BUFFERS(hBufferData).
hQueryData:QUERY-PREPARE(sqlquery).
hQueryData:QUERY-OPEN.
hQueryData:GET-FIRST ().
REPEAT:
  IF hQueryData:QUERY-OFF-END THEN LEAVE.

  hBuffer:BUFFER-CREATE().

  /* copy field values by name, not position, so field order cannot
     silently mismatch; copy array fields element by element */
  DO j = 1 TO fieldcount:
    hBufferField = hBuffer:BUFFER-FIELD(j).
    hBufferFieldData = hBufferData:BUFFER-FIELD(hBufferField:NAME).
    IF hBufferField:EXTENT > 0 THEN
      DO i = 1 TO hBufferField:EXTENT:
        hBufferField:BUFFER-VALUE(i) = hBufferFieldData:BUFFER-VALUE(i).
      END.
    ELSE
      hBufferField:BUFFER-VALUE = hBufferFieldData:BUFFER-VALUE.
  END.
  hQueryData:GET-NEXT ().
END.
hQueryData:QUERY-CLOSE().
DELETE OBJECT hQueryData.
DELETE OBJECT hBufferData.

XmlWriteTable.p

define input parameter filename as character.
define input parameter TABLE-HANDLE hTableData.

DEFINE VARIABLE cTargetType     AS CHARACTER NO-UNDO.
DEFINE VARIABLE cFile           AS CHARACTER NO-UNDO.
DEFINE VARIABLE lFormatted      AS LOGICAL   NO-UNDO.
DEFINE VARIABLE cEncoding       AS CHARACTER NO-UNDO.
DEFINE VARIABLE cSchemaLocation AS CHARACTER NO-UNDO.
DEFINE VARIABLE lWriteSchema    AS LOGICAL   NO-UNDO.
DEFINE VARIABLE lMinSchema      AS LOGICAL   NO-UNDO.
DEFINE VARIABLE retOK           AS LOGICAL   NO-UNDO.

ASSIGN
  cTargetType     = "file"
  cFile           = filename
  lFormatted      = true
  cEncoding       = ?
  cSchemaLocation = ?
  lWriteSchema    = true
  lMinSchema      = false.
retOK = hTableData:WRITE-XML ( cTargetType, cFile, lFormatted, cEncoding,  cSchemaLocation, lWriteSchema, lMinSchema).

return STRING(retOK).

Export table to xml:
define variable hTableData as handle.
run GetTableData.p ("accounts","","", output TABLE-HANDLE hTableData).
run XmlWriteTable.p ("accounts.xml", TABLE-HANDLE hTableData).

The result is an xml file in the same format as the soap body packet the WSA (Web Services Adapter) sends to a client, so it is compatible with web services.
 