We are moving from 32-bit OpenEdge 10.1A running on 32-bit HP-UX 11.11,
to 32-bit OpenEdge 10.2B running on 32-bit Red Hat Enterprise Linux 6.0.
We are currently running our database spread across several disks using MirrorDisk/UX, and will be moving to a RAID 5 configuration in our new environment.
My questions are regarding the DUMP/LOAD:
Current Status:
Application: Character-based forest products order entry and sales application written by PSI (Progressive Solutions), all in Progress 4GL.
We have bought the source code and have done extensive customization.
Database:
1 database, with everything in the schema area.
Size: 40 GB
Tables: 668
Empty Tables: 208 (these tables are for parts of the purchased software that we do not use).
Tables with over 10,000 records: 113, occupying 39.5 GB
Tables with fewer than 10,000 records: 555, occupying 500 MB
We need help choosing the best method for the dump and load:
1. Should we be concerned about the unused tables? If keeping them will not affect performance, we would like to do so in case we decide to use those areas of the application in the future.
2. What is the best strategy for the dump and load, assuming we can have the system unavailable for, say, 48 and maybe 72 hours?
2.1 Custom Dump/Load (see the 4GL sketch after this list):
Phase 1: identify the historical records in the large tables (i.e. information that cannot change, for example closed orders and their related shipments and invoices), custom dump them up to a given unique key (say, up to an order number), and load the historical data into the new DB.
Phase 2: while the system is unavailable, custom dump the small tables and the non-historical data from the large tables (say, from that order number onward) and custom load them into the new DB.
2.2 Binary Dump/Load (see the proutil sketch after this list)
2.3 Character Dump/Load
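
For option 2.1, here is a minimal 4GL sketch of the dump and load by key range, assuming a hypothetical table "order" with a unique "order-num" field and a made-up cutoff value; the real table, field, and file names would come from the PSI schema:

/* Dump side (run against the old DB): export historical rows up to the cutoff. */
DEFINE VARIABLE iCutoff AS INTEGER NO-UNDO INITIAL 500000. /* hypothetical cutoff order number */

OUTPUT TO VALUE("order_hist.d").
FOR EACH order NO-LOCK WHERE order.order-num <= iCutoff:
    EXPORT order.
END.
OUTPUT CLOSE.

/* Load side (run against the new DB): standard REPEAT/CREATE/IMPORT loop;
   the ENDKEY condition at end of file undoes the last empty CREATE. */
INPUT FROM VALUE("order_hist.d").
REPEAT:
    CREATE order.
    IMPORT order.
END.
INPUT CLOSE.

Phase 2 would be the same pattern with the WHERE clause reversed (order.order-num > iCutoff), plus straight EXPORT/IMPORT of the small tables.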
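
For option 2.2, the binary route is driven by proutil rather than 4GL. A rough command sketch, assuming a database named "psidb" and the same "order" table (both names are placeholders); after a plain binary load the indexes still need to be rebuilt:

# On the old DB: binary dump one table into a dump directory
proutil psidb -C dump order /dumpdir

# On the new DB: binary load the resulting .bd file
proutil psidb -C load /dumpdir/order.bd

# Then rebuild the indexes
proutil psidb -C idxbuild all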
TIA