Resolved CRC Issue after Dump and Load

bob

New Member
Hi,
- OpenEdge 10.1B03
- Windows Server 2003

I've come across an issue with CRCs when doing a Dump and Load which I'm hoping somebody might be able to help me with.
I'm trying to Dump data from an OpenEdge 10.1B03 Enterprise database and Load it into an OpenEdge 10.1B03 Workgroup database, both running on Windows Server 2003. The idea is to create a local copy of the client database to use for some testing.

The Enterprise database has large-file support enabled. It is on a client site where we only run r-code.
We create the compiled code against Workgroup edition databases in our own dev environment.

I've created a new database locally under the Workgroup edition, using a .df exported from the client site.
The r-code we currently have was compiled against an old copy of the database running on the Workgroup edition.
The schemas for the databases are the same, although our old copy of the database was originally using a different code-page. All the databases use the same blocksize (4096).

I've successfully loaded the .d files, which I dumped from the Enterprise db with the Data Administration tool, into the new Workgroup db. I only dumped the application tables from the Enterprise database; I didn't export any of the underscore (meta-schema) tables.

Now when I try to run any of the existing .r code against the new Workgroup database I get an error saying the CRC for the r-code and the database don't match (error 1896).

I'm wondering:

  • How does Progress work out the CRC for a database? I thought that as long as the schemas were the same I shouldn't run into any problems, but I'm guessing that's not the case...
  • Is the codepage an issue when calculating CRCs?
  • Is the blocksize an issue?
  • Is it possible to move data between Enterprise and Workgroup editions in this way, or is my approach wrong?

Thanks in advance for your help.
 

Rob Fitzpatrick

ProgressTalk.com Sponsor
I think this KB article will help you:
What dictionary changes for recompiling? (time stamp and CRC)

I'm trying to Dump data from an OpenEdge 10.1B03 Enterprise database and Load it into an OpenEdge 10.1B03 Workgroup database...
I don't know why you're messing around with the Workgroup database. Is it possible to move data from Enterprise to Workgroup? Yes, but it's fraught with potential problems, and from experience I recommend you treat it as a non-option; you'll just regret it.

We create the compiled code against Workgroup edition databases in our own dev environment.
So I assume you have source access.

I've created a new database locally under the Workgroup edition, using a .df exported from the client site.
The r-code we currently have was compiled against an old copy of the database running on the Workgroup edition.
For the DB where you are trying unsuccessfully to run your object code, the DB application schema comes from a client site and the r-code was compiled against a different, and old, dev DB? Why use old r-code that's not working? You have the source, you have the client's schema, you have a dev DB created from that schema, and you have a compiler. I would create a new empty compile database with the client's schema, and keep its schema in lock-step with theirs. Then compile your source against that compile DB and run that r-code in dev. What will that cost you, maybe a couple of hours?
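The compile-database step above can be scripted. A rough sketch, with hypothetical names throughout: prodb copies one of the empty databases shipped in $DLC, and prodict/load_df.p is the dictionary's batch .df loader (the exact client executable and empty-database name vary by platform and version, so treat these details as placeholders):

```shell
# Hypothetical names throughout; adjust for your environment.
# Create a 4 KB-block-size database from the shipped empty copy.
prodb compiledb "$DLC/empty4"

# Load the client's schema in batch mode via the dictionary's .df
# loader (client executable differs by platform, e.g. _progres/mpro).
_progres compiledb -b -p prodict/load_df.p -param client-site.df
```

After that, compile your source connected to compiledb and the resulting r-code will carry the client schema's CRCs.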

I'm wondering:


  • How does Progress work out the CRC for a database? I thought that as long as the schemas were the same I shouldn't run into any problems, but I'm guessing that's not the case...
  • Is the codepage an issue when calculating CRCs?
That is addressed in the linked KB article.
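In broad strokes: each table carries a CRC computed from its schema definition (field names, data types, order, and so on), not from its data. r-code records the CRC of every table it was compiled against, and the client compares those against the connected database at run time; error 1896 means one pair didn't match. The exact inputs Progress feeds into its CRC are internal, but the general idea can be illustrated with an ordinary POSIX cksum over a .df-style definition (a conceptual sketch only, not the actual algorithm):

```shell
# Conceptual sketch only: Progress's real CRC inputs are internal.
# Two .df-style table definitions differing only in one field name.
schema_a='ADD TABLE "customer"
ADD FIELD "custname" OF "customer" AS character'
schema_b='ADD TABLE "customer"
ADD FIELD "fullname" OF "customer" AS character'

# cksum prints "CRC byte-count". The same definition always yields the
# same CRC; any definition change (here, a renamed field) yields a new one.
printf '%s' "$schema_a" | cksum
printf '%s' "$schema_b" | cksum
```

So two databases can hold the "same" tables and still disagree on CRCs if the definitions differ in any detail the CRC covers.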

  • Is the blocksize an issue?
No.

  • Is it possible to move data between Enterprise and Workgroup editions in this way, or is my approach wrong?
Yes, but again, I wouldn't touch that approach with a ten-foot pole.

One other thing to consider: one of your databases was freshly created in 10.1B03, but you didn't mention the origin of the other. Was it created in 10.1B, or upgraded? Is it possible that the meta-schemas are not the same? Have you run proutil updatevst and proutil updateschema?
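For reference, those two maintenance commands are run through proutil's -C option against an offline database. A minimal sketch, assuming a hypothetical database named devdb and the OpenEdge bin directory on the PATH:

```shell
# Hypothetical database name; substitute your own. The database must be
# offline (no broker running) for these to execute.
DB=devdb

# Refresh the virtual system table (VST) definitions for this release.
proutil "$DB" -C updatevst

# Bring an upgraded database's meta-schema up to the current release.
proutil "$DB" -C updateschema
```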
 

bob

New Member

Hi Rob,

Thanks for getting back to me and for the link.


So I assume you have source access.

I do have access to the source. Unfortunately we only have a Workgroup license locally for development work at the moment.

For the DB where you are trying unsuccessfully to run your object code, the DB application schema comes from a client site and the r-code was compiled against a different, and old, dev DB?

That's right. The schema from the client site and the dev DBs (old and new) are all the same.


Why use old r-code that's not working? You have the source, you have the client's schema, you have a dev DB created from that schema, and you have a compiler. I would create a new empty compile database with the client's schema, and keep its schema in lock-step with theirs. Then compile your source against that compile DB and run that r-code in dev. What will that cost you, maybe a couple of hours?
My concern was that if I compile new programs against my local dev database and ship them to the client, then they'll get CRC errors on their side. I might give your empty compile database approach a shot instead.


One other thing to consider: one of your databases was freshly created in 10.1B03, but you didn't mention the origin of the other. Was it created in 10.1B, or upgraded? Is it possible that the meta-schemas are not the same? Have you run proutil updatevst and proutil updateschema?

I'm not 100% certain of the history of the client's Enterprise database, but I'm pretty sure it was upgraded; it might even have started life as a V9 database. I don't think proutil updatevst and proutil updateschema were ever run against the client database.
 

bob

New Member
Hi Rob,

Thanks for getting back to me.

The client databases started out life as V9 databases and were upgraded over time; I don't think proutil updatevst or proutil updateschema have ever been run against them. I've played around with this a little more and I think you're right: there is a difference in the meta-schema somewhere. I was able to work around it in the end by using the old development database instead: I deleted all the application records out of it and then loaded the new .d files from the client site using the Bulk Loader.
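For anyone repeating this workaround, the Bulk Loader side looks roughly like the following. Database and file names are placeholders, the .fd description file is generated from the Data Administration tool beforehand, and note that bulkload leaves the loaded tables' indexes deactivated, so an index rebuild is needed afterwards:

```shell
# Hypothetical names; the .d dump files and the customer.fd description
# file (created via Data Administration) are assumed to be in the
# current directory. The database must be offline.
DB=devdb

# Load the data described by customer.fd from the matching .d file.
proutil "$DB" -C bulkload customer.fd

# bulkload deactivates indexes on the loaded tables; rebuild them.
proutil "$DB" -C idxbuild all
```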

Best Regards,
- Bob
 