The company I work for has been using a certain ERP package, backed by a Progress DB, for over 10 years. Due to the age of the database and our limited knowledge of Progress, our system has become quite sluggish even after moving to newer and faster servers.
After doing a bit of research, it looks like a fair attempt at speeding up our system is to dump and reload our larger data tables, most of which have a scatter factor of 6 or more and are between 0.5 and 1 GB.
I've found the command below for dumping tables:
Code:
proutil db-name -C dumpspecified
[owner-name.]table-name.field-name operator field-value
directory
and the following command for loading:
Code:
proutil db-name -C load filename
My question is: how could I create a script file, run from a console, that automatically dumps and then reloads the 20 or so tables?
My Progress experience doesn't extend much past installing the DB engine and restoring back-ups, so I'm sure you can understand why I'd prefer to script the 40 or so commands in one file instead of executing them manually.
Thanks, guys!
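To make the question concrete, here's a sketch of the kind of wrapper I have in mind. It only prints the proutil commands it would run (drop the echo to actually execute them), and the database path, dump directory, and table names are placeholders. I've used `proutil -C dump` (the binary whole-table dump) rather than `dumpspecified`, since I want entire tables, and I gather a binary load leaves indexes inactive so an idxbuild is needed afterwards -- corrections welcome:

```shell
#!/bin/sh
# Sketch only: prints the proutil commands for a dump-and-reload cycle.
# DB path, dump directory, and table names are placeholders -- replace
# them with real values, and drop the echo to actually run the commands.
DB=/path/to/db-name
DUMPDIR=/path/to/dumpdir
TABLES="customer order order-line"   # the ~20 tables to rebuild

# Binary dump: writes one .bd file per table into $DUMPDIR.
for T in $TABLES; do
    echo proutil "$DB" -C dump "$T" "$DUMPDIR"
done

# Binary load of each table's .bd file back into the database.
for T in $TABLES; do
    echo proutil "$DB" -C load "$DUMPDIR/$T.bd"
done

# A binary load leaves the indexes inactive, so rebuild them afterwards.
echo proutil "$DB" -C idxbuild all
```

Swapping in the real table list and removing the echoes would give one file to run from the console instead of 40-odd commands typed by hand.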