I understand the concept--I've even done dumps and loads. But I've always used scripts provided by someone else. Looking at the script I used last, it's pretty straightforward--a "dumpdb" function that takes the table name and the dump folder, runs _proutil db -C dump, and checks for any errors. The bulk of the script is just calling the dumpdb function on every table in the database. The load script is essentially the same thing.
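For reference, the pattern looks roughly like this. This is a minimal sketch of what I described, not the actual Infor script--the database name, dump directory, and table list are placeholders, and it defaults to a dry run that only records the commands it would execute:

```shell
#!/bin/sh
# Sketch of the dump script pattern: a dumpdb function wrapping
# "_proutil db -C dump table dir", called once per table.
# DB, DUMPDIR, and the table list below are placeholder values.
DB=${DB:-mydb}
DUMPDIR=${DUMPDIR:-./dumps}
DRY_RUN=${DRY_RUN:-1}   # set DRY_RUN=0 to run _proutil against a real database

dumpdb() {
    table=$1
    if [ "$DRY_RUN" = "1" ]; then
        # record the command that would be executed instead of running it
        echo "_proutil $DB -C dump $table $DUMPDIR" >> dump_plan.txt
        return 0
    fi
    _proutil "$DB" -C dump "$table" "$DUMPDIR" || {
        echo "dump of $table failed" >&2
        return 1
    }
}

: > dump_plan.txt
# the bulk of the script: one dumpdb call per (hardcoded) table
for t in customer order order_line; do
    dumpdb "$t" || exit 1
done
```

The load script follows the same shape with `-C load` instead of `-C dump`. The hardcoded `for t in ...` list is exactly the part I'm asking about below.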
It's this list of tables that concerns me. It might be accurate for version x, but not for version x+1. Or we may have a modification that added a table. I'm curious how the pros handle this. I'm also curious about system tables--our last D&L required me to run a user sync script because the D&L scripts didn't dump out the _users table. Infor has a script that rebuilds it (makes sense, since their D&L scripts didn't include it). Any reason not to include this? Are there other system tables that should be included?