
Trying to dump database to csv file

Discussion in 'Database Admin' started by zbryanz, Jun 3, 2006.

  1. zbryanz

    zbryanz New Member

    Hi folks,
    I am Bryan. I am a systems administrator with little database knowledge beyond MS Access, so please forgive my ignorance.

    My customer is switching from Intergy to another practice management system. I need to dump her data to csv files. There are a few gigs of data. I have been reading your other threads and have tried a few of the items on the list. I went to Data Administration and connected to the database. The Admin menu at the top is greyed out. I tried going to another tool and it said something to the effect that, due to my database version, it will be read-only. I found the database and the bin directory. I have successfully proshut the database. I just need to dump it, nothing fancy.

    I would like it in a readable format since she will be without a replacement database for a couple of months.

    Oh yeah, if someone could also share with me the procedure to restart the database without rebooting, that would be wonderful.

    I do not know the logical name of the database. I'm not even sure that is important, or how to find it.

    Progress version 9.1D

    OK, I started the Data Admin tool with the -rx option. I am still at a loss on how to get a csv file from all the data. I need table info and links maintained if possible.

    Thanks for your help,
    Bryan
  2. joey.jeremiah

    joey.jeremiah ProgressTalk Moderator Staff Member

    Hope this helps. I don't have 9.1D installed on this machine, so it's not tested.


    After unzipping the files:

    First you'll need to compile. It's just one file, dump.p. The file is xcoded, so you'll be able to compile even if you don't have a dev license.

    You'll find the unencrypted source in the src directory, in case you're wondering. I promise it won't format your hard drive :)


    To compile, run proenv from your Progress menu, cd to the unzipped files dir and type the following:

    bpro -p compile.p -rx

    To run the proc and spew all your data, type:

    bpro -p dump.p -db c:\mydb

    and just watch the csv files pile up.
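
    By the way, the proc boils down to roughly this (a simplified sketch of the idea, not the attached file line for line): walk the _file schema table and write one csv per table, field names on the first line.

    Code (progress):
    /* dump.p - simplified sketch. dumps each non-hidden table to
       <table>.csv in the current directory, field names first */
    define variable hQuery  as handle  no-undo.
    define variable hBuffer as handle  no-undo.
    define variable hField  as handle  no-undo.
    define variable i       as integer no-undo.

    for each _file no-lock where not _file._hidden:
        output to value(_file._file-name + ".csv").
        create buffer hBuffer for table _file._file-name.
        create query hQuery.
        hQuery:set-buffers(hBuffer).
        hQuery:query-prepare("for each " + _file._file-name + " no-lock").
        hQuery:query-open().

        /* header row with the field names */
        do i = 1 to hBuffer:num-fields:
            if i > 1 then put unformatted ",".
            put unformatted quoter(hBuffer:buffer-field(i):name).
        end.
        put skip.

        /* one csv row per record */
        hQuery:get-next().
        do while not hQuery:query-off-end:
            do i = 1 to hBuffer:num-fields:
                hField = hBuffer:buffer-field(i).
                if i > 1 then put unformatted ",".
                /* note: extent (array) fields need per-extent handling */
                put unformatted quoter(string(hField:buffer-value)).
            end.
            put skip.
            hQuery:get-next().
        end.

        hQuery:query-close().
        delete object hQuery.
        delete object hBuffer.
        output close.
    end.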

    Attached Files:

  3. zbryanz

    zbryanz New Member

    zip file corrupted

    Joey,

    I appreciate your help. The zip file is corrupted. I tried pkzipfix, but was unable to repair it. Can you resubmit the file?

    Thanks,
    Bryan
  4. zbryanz

    zbryanz New Member

    Where does this put the csv files?

    Joey,

    In the above procedure, I do not see an option
    to direct the output to a specific location.

    I need to dump to a different drive due to space limitations.

    Am I missing something?
  5. joey.jeremiah

    joey.jeremiah ProgressTalk Moderator Staff Member

    My mistake, I may have oversimplified. When I get home I'll make some changes.

    For now, after you've compiled (did it work?), cd to the desired drive and dir and specify the full path to the dump.p proc.


    Enter the command prompt with proenv (it basically sets some environment vars like DLC etc.), change dir to the target drive and dir, and type:

    z:\dump> bpro -p c:\progress\wrk\dump.p -db x:\mydb


    I'd try it first on sports2000 or some other small database. It's hard to debug if every try takes too long.

    BTW, what OS are you working on?
  6. zbryanz

    zbryanz New Member

    error

    Joey,

    I tried to run it against a demo database and it returned this error:
    You are not allowed to connect to a database in $DLC. (1379)

    Never mind, I looked up the error. I had to copy the database out of the $DLC directory.

    I cannot try their running database until this evening because they are using it.

    This solution worked great for me, Joey. I appreciate your help. It even dumped the field names at the top of each file! You are awesome!


    Thanks,
    Bryan
  7. joey.jeremiah

    joey.jeremiah ProgressTalk Moderator Staff Member

  8. joey.jeremiah

    joey.jeremiah ProgressTalk Moderator Staff Member

    Yep, you'll need to make a copy to some other directory. Try this:
    copy %DLC%\sports2000*.* c:\progress\wrk

    So I understand the proc compiled under v9, right?


    The proc can run while the database is online; after all, it's just a query. But there will be some performance impact.


    Use mbpro instead of bpro with the dump proc. bpro is the batch client and mbpro is the multi-user batch client.

    Actually, the only difference is one small parameter: with or without -1, single-user mode.

    Maybe also add a little -Bp 50 to the command line. It basically prevents the proc from monopolizing the shared buffer pool.


    Something like this:
    z:\dump> mbpro -p c:\progress\wrk\dump.p -db x:\mydb -Bp 50

    I presume (hope) you're running it on the server, the same machine the database is on, for performance reasons.

    But if you must run it from a remote client, you'll also need the -H and -S network connection parameters your clients use.
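
    For example, from a remote client it would look something like this (hypothetical host and service name, use whatever your clients connect with):
    z:\dump> mbpro -p c:\progress\wrk\dump.p -db mydb -H dbserver -S mydb-service -Bp 50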


    Yes, it dumps into the current directory you're in at the command prompt. Try it with sports2000.
  9. joey.jeremiah

    joey.jeremiah ProgressTalk Moderator Staff Member

    :blush:


    If you ever need to do something bigger, think kindly of me.

    I'm cheap and quick (that's what my girlfriend says).
  10. zbryanz

    zbryanz New Member

    Payment

    Joey,

    I will most likely need your services. I am going to need to import the data back into another EHR program. Maybe a free one.

    Did the process you wrote also dump hidden tables?

    I would be glad to toss you $100 for the time you've already spent. I am just starting out in business and do not have much to offer yet, but I'm sure when it comes time to reload, I will need you more extensively. I guess I'm trying to say: don't be insulted by the $100 offer; your help was infinitely more valuable to me than that. That is just what I can afford right now.

    Send me some info and I'll cut you a check.

    Thanks,
    Bryan
  11. joey.jeremiah

    joey.jeremiah ProgressTalk Moderator Staff Member

    If we ever meet up, buy me a beer and we'll call it even :)


    I've made a few more changes. I had another look at the proc today and fixed a glitch in the array export routine.

    I've used the -T param to specify the target drive and dir (note: the params are case-sensitive),

    and -param to specify the file query where clause. The default is now
    "where not _hidden or _file-name = '_user'"

    For example:
    bpro -p dump.p -db x:\mydb -T z:\dump -param "where _file-name = 'item'"


    Here's a short recap.

    To compile:
    bpro -p compile.p -rx

    Dump while the database is offline:
    bpro -p dump.p -db x:\mydb -T z:\dump

    Dump while the database is online:
    mbpro -p dump.p -db x:\mydb -T z:\dump -Bp 50
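
    In case you're wondering how the proc picks those up: the -param string lands in SESSION:PARAMETER, and -T sets the client's temp directory. Roughly like this (a sketch; reading -T back via SESSION:TEMP-DIRECTORY is an assumption here, if your release doesn't have it the dir has to be passed some other way):

    Code (progress):
    define variable cWhere     as character no-undo.
    define variable cTargetDir as character no-undo.

    /* -param value, i.e. the _file where clause */
    cWhere = session:parameter.
    if cWhere = "" or cWhere = ? then
        cWhere = "where not _hidden or _file-name = '_user'".

    /* -T value, reused as the csv target dir (assumed attribute) */
    cTargetDir = session:temp-directory.

    output to value(cTargetDir + "customer.csv").  /* one file per table */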


    XML, PDF, Office or OpenOffice docs with Office XML schemas or ODF etc. are also doable.

    It's been fun, hope to talk to you again.

    Attached Files:

  12. zbryanz

    zbryanz New Member

    Data tainted?

    Joey,

    When you say there was a glitch, does that mean my previous
    dump was not good? Please elaborate.

    What does the change you made do?
    "where not _hidden or _file-name = '_user'"

    Does that dump hidden also?

    What does the _user mean?
    The "where not_hidden" part concerns me. Does that mean no
    hidden tables are dumped?

    Thanks,
    Bryan
  13. joey.jeremiah

    joey.jeremiah ProgressTalk Moderator Staff Member

    The code that dumps field values (just three lines) dumped arrays without comma separators.

    Something like this:
    1 2 3 instead of 1,2,3
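
    The fix is about what you'd expect: loop the extents and put a comma between them. Roughly (a sketch, not the literal patch):

    Code (progress):
    /* hField is a buffer-field handle; write array extents
       comma-separated instead of space-separated */
    define variable ix as integer no-undo.

    if hField:extent = 0 then
        put unformatted quoter(string(hField:buffer-value)).
    else
    do ix = 1 to hField:extent:
        if ix > 1 then put unformatted ",".
        put unformatted quoter(string(hField:buffer-value(ix))).
    end.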


    You can run the dump while the database is online and users are working.

    There will be a reasonable performance impact. It won't bring the database to a stop, especially if you're dumping to a separate drive.

    How long did the dump take? 15, 30 minutes, maybe an hour?

    Of course, there could be a big difference in performance with databases that haven't been tuned for a while.



    the "where clause" on the db tables query ( called files in the schema )
    lets you specify a filter, say, you only wanted to dump a specific table

    for example -
    "where _file-name = '<put table name here>' "

    the database system tables ( like schema, status info etc. ) are hidden ( _hidden table field )
    and the _user system table holds users account info

    if you do have hidden tables you might want try this instead, just to be sure
    "where _file-num > 0 and _file-num < 32767 or _file-name = '_user' "

    which means only your application not system tables and the user accounts
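
    Behind the scenes the -param string just gets spliced onto the schema query, something like this (a sketch of the idea):

    Code (progress):
    /* open the table list with the user-supplied filter */
    define variable hFileQry as handle no-undo.
    define variable hFile    as handle no-undo.

    create buffer hFile for table "_file".
    create query hFileQry.
    hFileQry:set-buffers(hFile).
    hFileQry:query-prepare("for each _file no-lock " + session:parameter).
    hFileQry:query-open().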
  14. zbryanz

    zbryanz New Member

    Joey,

    Thanks, I will try that tonight.

    I was looking through the csv files and came across this...

    "FileArchiveSID","ExtentCount","RawData","TimeStamp","LastUser","CreateStamp","CreateUser"
    1,1,
    Incompatible datatypes found during runtime conversion. (5729)
    ** Unable to evaluate expression for PUT statement. (564)
    4,1,
    Incompatible datatypes found during runtime conversion. (5729)
    ** Unable to evaluate expression for PUT statement. (564)
    6,1,
    Incompatible datatypes found during runtime conversion. (5729)
    ** Unable to evaluate expression for PUT statement. (564)
    8,1,
    Incompatible datatypes found during runtime conversion. (5729)
    ** Unable to evaluate expression for PUT statement. (564)
    10,1,

    What does this mean?

    Thanks,
    Bryan
  15. joey.jeremiah

    joey.jeremiah ProgressTalk Moderator Staff Member


    Ouch.


    Raw data is just unconverted data (bytes); for example, it can be used to store binary data.

    In this case, it looks like it's used to store a record.

    The proc now converts raw and other funny data types to strings, so binary data is converted to base64.
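
    The conversion itself is simple enough. Something like this (a sketch; BASE64-ENCODE and LONGCHAR are built-ins in later releases and an assumption here, on 9.1D you'd have to hand-roll the encoder):

    Code (progress):
    /* turn a RAW field value into a base64 string */
    define variable rData as raw      no-undo.
    define variable mData as memptr   no-undo.
    define variable cB64  as longchar no-undo.

    rData = hField:buffer-value.
    set-size(mData) = length(rData).   /* allocate the memptr   */
    put-bytes(mData, 1) = rData.       /* copy the raw bytes in */
    cB64 = base64-encode(mData).       /* encode to base64      */
    set-size(mData) = 0.               /* free the memptr       */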


    BTW, don't just throw away your database. For one, you'll still need it for the schema.

    You've also got to have backups (I'm crazy about backups). You can just shut down the database and make one or two DVD copies of everything.

    Attached Files:

  16. yzg

    yzg New Member

    I tried the dump as explained here and it worked fine when I did it on the database server.
    Now, however, I need to do this from a machine on the network that is not the database server. I tried adding the -H and -S parameters and it didn't work.

    Here is what I tried:
    mbpro medman -p dump.p -Bp 50 -H 192.168.20.2 -S Intergy-DB -param "where _file-name = 'Practice'"

    Here are the results:
    _file-num must be a quoted constant or an unabbreviated, unambiguous buffer/field reference for buffers known to query . (7328)
    QUERY-OPEN for query requires a previous QUERY-PREPARE. (7312)
    Cannot run GET methods on query until it is opened. (7313)


    Can someone give me some pointers?
  17. bisontech

    bisontech New Member

    Re: Trying to dump database to xml?

    I am looking to export data from Intergy. Do they have an export-to-XML function?
  18. cj_brandt

    cj_brandt Member

    Does your site use the Practice Analytics software? That process dumps most of the data from Intergy and puts it into csv files. It would make your task a bit easier.
  19. bisontech

    bisontech New Member

    Thanks, I am looking into that. Is it part of Intergy or a separate program?


    Leonard
  20. MaximMonin

    MaximMonin New Member

    Here is code to export any table to xml:
    Code (progress):
    /* GetTabledata.p
       builds a dynamic temp-table mirroring the given database table,
       fills it with a dynamic query and hands the table back by handle.
       note: the logical database name is hardcoded as "system". */

    define input parameter TableName  as character.
    define input parameter filtertext as character.
    define input parameter sortby     as character.

    define output parameter TABLE-HANDLE TableData.

    define variable j          as integer no-undo.
    define variable fieldcount as integer no-undo.
    define variable hBuffer          as handle no-undo.
    define variable hBufferField     as handle no-undo.
    define variable hQueryData       as handle no-undo.
    define variable hBufferData      as handle no-undo.
    define variable hBufferFieldData as handle no-undo.

    define variable TableNameFull as character no-undo.
    TableNameFull = "system." + TableName.

    /* build the query string from the filter and sort arguments */
    define variable qryString as character no-undo.
    qryString = "FOR EACH " + TableNameFull + " NO-LOCK".
    filtertext = trim(filtertext).
    if filtertext <> "" then
        qryString = qryString + " WHERE " + filtertext.
    if trim(sortby) <> "" then
        qryString = qryString + " BY " + TableNameFull + "." + sortby.

    find first system._File where system._File._File-Name = TableName no-lock no-error.
    if not available system._File then return.

    /* create a temp-table with the same fields as the source table */
    create temp-table TableData.
    fieldcount = 0.
    for each system._Field of system._File no-lock by system._Field._Order:
        TableData:ADD-NEW-FIELD(system._Field._Field-name, system._Field._Data-Type).
        fieldcount = fieldcount + 1.
    end.
    TableData:TEMP-TABLE-PREPARE("TableData").

    create buffer hBuffer for table TableData:DEFAULT-BUFFER-HANDLE.

    create query hQueryData.
    create buffer hBufferData for table TableNameFull.
    hQueryData:SET-BUFFERS(hBufferData).
    hQueryData:QUERY-PREPARE(qryString).
    hQueryData:QUERY-OPEN().
    hQueryData:GET-FIRST().
    repeat:
        if hQueryData:QUERY-OFF-END then leave.

        hBuffer:BUFFER-CREATE().

        /* copy field by field, matching source fields by name so the
           field order of the two buffers does not have to agree */
        do j = 1 to fieldcount:
            hBufferField     = hBuffer:BUFFER-FIELD(j).
            hBufferFieldData = hBufferData:BUFFER-FIELD(hBufferField:NAME).
            hBufferField:BUFFER-VALUE = hBufferFieldData:BUFFER-VALUE.
        end.
        hQueryData:GET-NEXT().
    end.
    hQueryData:QUERY-CLOSE().
    delete object hQueryData.
    delete object hBufferData.
    delete object hBuffer.


    /* XmlWriteTable.p
       writes the temp-table (passed by handle) to an xml file
       using the WRITE-XML method */

    define input parameter filename as character.
    define input parameter TABLE-HANDLE hTableData.

    define variable cTargetType     as character no-undo.
    define variable cFile           as character no-undo.
    define variable lFormatted      as logical   no-undo.
    define variable cEncoding       as character no-undo.
    define variable cSchemaLocation as character no-undo.
    define variable lWriteSchema    as logical   no-undo.
    define variable lMinSchema      as logical   no-undo.
    define variable retOK           as logical   no-undo.

    assign
        cTargetType     = "file"
        cFile           = filename
        lFormatted      = true
        cEncoding       = ?
        cSchemaLocation = ?
        lWriteSchema    = true
        lMinSchema      = false.
    retOK = hTableData:WRITE-XML(cTargetType, cFile, lFormatted, cEncoding,
                                 cSchemaLocation, lWriteSchema, lMinSchema).

    return string(retOK).


    /* export a table to xml: */
    define variable hTableData as handle no-undo.
    run GetTabledata.p ("accounts", "", "", output TABLE-HANDLE hTableData).
    run XmlWriteTable.p ("accounts.xml", TABLE-HANDLE hTableData).

    The resulting XML file is the same SOAP body packet the WSA sends to a client, so it is compatible with web services.