OK, here is a situation that I encounter frequently and have never been able to get a straight answer on.
Background:
A client application is writing records to table x. At the same time, a batch process, which cycles every 5 minutes, is reading records from table x using the Progress "FOR EACH" statement. It uses table x to update other database tables; when those updates are complete, the batch process flags the records in table x as processed.
When the first "FOR EACH" finishes, a second "FOR EACH" executes to delete any records that have been flagged as processed.
My question is this: when is the scope, i.e. the set of records to be read by the "FOR EACH" statement in the batch process, determined? If at the beginning of the batch run table x contains 1000 records, and 100 more are added while those records are being processed, should the "FOR EACH" statement process all 1100 records? Only the original 1000? I've seen a few things happen with this:
1) All 1100 are processed.
2) The original 1000 are processed, leaving the 100 new records for the next run.
3) The original 1000 and some of the 100 are processed, leaving the rest of the new records for the next run.
There doesn't seem to be any consistency to this. Outside of an obvious solution like a record counter, is there any other way to control the scoping?
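(By a record counter I mean something like the following sketch: snapshot how many unprocessed records exist when the run starts, and stop after that many, so rows added mid-run wait for the next cycle. Note this only caps the count; it doesn't guarantee which records get skipped if new rows happen to sort into the middle of the index. Names are placeholders again:)

DEFINE VARIABLE iLimit AS INTEGER NO-UNDO.
DEFINE VARIABLE iDone  AS INTEGER NO-UNDO.

/* Count the backlog as of the start of the run. */
FOR EACH x NO-LOCK WHERE NOT x.processed:
    iLimit = iLimit + 1.
END.

/* Process at most that many records this cycle. */
FOR EACH x EXCLUSIVE-LOCK WHERE NOT x.processed:
    IF iDone >= iLimit THEN LEAVE.
    /* ... update the other database tables from x ... */
    ASSIGN x.processed = TRUE
           iDone       = iDone + 1.
END.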
Sorry if this got a little long,
-Ed