Question: Import data volume increased to 300,000, causing DB updates to almost halt after updating 38,000 records - Progress 11.7 Windows client - Linux DB server

TomBascom

Curmudgeon
ProGetStack - the lines that it says are from a procedure inside the program aren't actually there in that procedure in the source program (.p)

When the process "hangs" you should do a "proGetStack" and obtain a 4gl stack trace for that process. A stack trace from a crashed session is less helpful. That just shows why your session crashed.

Code:
proenv> progetstack 1234

"1234" is the process id.

Do this several times to verify that it is, in fact, "hung" rather than executing in a loop or something like that.

A protrace.1234 file will be created somewhere - probably in the working directory of the process. That file will contain a 4GL stack trace. The top of the stack trace will identify the line of code that you are "hung" on. To your point above, this is a DEBUG-LIST line number - not a plain-text source line number. To find it you will need to compile the relevant code with the DEBUG-LIST option. Like so:

Code:
COMPILE "programname.p" DEBUG-LIST "programname.dbg"

This should then point you at the code that is "hung". Knowing that would go an awful long ways towards identifying the actual source of the problem.
 

Rob Fitzpatrick

ProgressTalk.com Sponsor
It would be interesting to compare how this process runs in your present database, versus an otherwise-identical database that is not TDE-enabled. OE clearly does something different with the client DBI file handling in the two cases. I am wondering if that affects your code.
 

Shail

New Member
The .srt file is large (287 MB) - probably because it's filling the temp-tables with the data from the input file and building the temp-table indexes. I don't suspect it's sorting of database tables for want of indexes, or of the right indexes.
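For what it's worth, the .srt file is the client's sort workspace (BY / BREAK BY and similar sorting), while temp-table data and index blocks that overflow memory spill to the client's DBI file instead. A minimal sketch of where temp-table index-build cost comes from - the field names here are guesses based on the code later in the thread:

Code:
DEFINE TEMP-TABLE ttCustomer NO-UNDO
    FIELD CustomerId AS INTEGER
    FIELD found      AS LOGICAL
    /* every index declared here is maintained while 300,000+ rows
       are loaded; rows that no longer fit in roughly -Bt * -tmpbsize
       of memory spill to the client's DBI file on disk */
    INDEX idxCustomerId IS PRIMARY UNIQUE CustomerId
    INDEX idxFound found.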
 

Shail

New Member
Have obtained the following now:
a) the Debug-List file and the protrace.pid# file from the stall point
b) debug.log from the Log-Manager
c) 4GLtrace.log (using -clientlog, -logentrytypes and -logginglevel) - a 7 GB file, but I will get it open!
Time to investigate ..
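As a side note, the same 4GL tracing can also be switched on from inside the session via the LOG-MANAGER handle - a minimal sketch, with the file name and logging level as placeholder examples:

Code:
/* roughly equivalent to starting the client with:
   -clientlog 4gltrace.log -logentrytypes 4GLTrace -logginglevel 3 */
LOG-MANAGER:LOGFILE-NAME    = "4gltrace.log".
LOG-MANAGER:LOG-ENTRY-TYPES = "4GLTrace".
LOG-MANAGER:LOGGING-LEVEL   = 3.

/* ... run the import here ... */

LOG-MANAGER:CLOSE-LOG().

At level 3, 4GLTrace records every RUN and internal-procedure call, which is one way to see which sub-procedure the time goes into - but, as the 7 GB file shows, it gets big fast.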
 

Shail

New Member
Got to the line of code that was slowing it down - commented it out, though I do need it - but without that line, the program goes on to import 320,000 lines of data!
It appears there might be a build-up of buffer size that MS Windows does not let the Progress program get away with, and it slows it down.
I also noticed some elastic behaviour: having imported 320,000 lines with the 'problem line of code' commented out, I was able to import a cut-down version of 100,000 lines of the same data file even without commenting out the 'problem line' - but when I tried the entire data file of 320,000 lines again, it slowed down at ~50,000 lines.
Any thoughts?
 

Shail

New Member
When that happens, the slowdown is very abrupt - it imports the first 50,000 lines at a good speed, and then the speed falls off exponentially and instantaneously.
 

Shail

New Member
Thanks, I do appreciate all the help - the problem line is a procedure call that updates the email address in the update and create blocks, as below. It is a call to a custom-library procedure that has always worked to update/create email addresses in the database, until now... when the volume has gone up... Houston, we have a problem!
Commenting out this procedure call brings back the good speed. This procedure in turn calls other custom-library procedures, etc., that I have yet to get to the bottom of - it's a bit of a maze, although it has always worked and has been in use for a long time. Also, I have had to comment out the procedure call in both the update and create blocks to get the import to complete the entire volume of 320,000 at 'good' speed - you can only expect so much from client/server for that volume - I have obviously come across a warren... yet more to investigate!
Code:
/* update-block */
for each ttcustomer where ttcustomer.found:
----
UpdateBLOCK:
DO TRANSACTION ON ERROR UNDO UpdateBLOCK, RETRY UpdateBLOCK:
if retry then do:
---
end.
hCustomer:FIND-FIRST (SUBSTITUTE ('WHERE Customer.CustomerId = &1', ttCustomer.CustomerId ), EXCLUSIVE-LOCK) NO-ERROR.
---
FOR LAST bEmailRec NO-LOCK
                   WHERE bEmailRec.CustomerId = hCustomer:BUFFER-FIELD ('CustomerId'):BUFFER-VALUE:
                    iRecPtr = bEmailRec.RecPtr.
END.
---
EMAIL-BLOCK:
FOR EACH ttEmailRec
               WHERE ttEmailRec.CustomerId = ttCustomer.CustomerId
                 AND NOT ttEmailRec.ErrorFlag
                     BREAK BY ttEmailRec.CustomerId:  /* why break-by ? - probably for the LAST( ), but won't hurt, only a small table for 1 customer*/
------
RUN Create-CustomerEmail
                        (INPUT-OUTPUT iRecPtr,
                         INPUT-OUTPUT bCustomer.MainEmailRecPtr,
                         INPUT  bCustomer.CustomerId,
                         INPUT  ttEmailRec.EmailAddress,
                         INPUT  ttEmailRec.Description,
                         INPUT  {fn getuserId},
                         INPUT  bImportHeader.DataSourceId,
                         INPUT  LAST (ttEmailRec.CustomerId),
                         BUFFER bEmailRec,
                         OUTPUT po-lSuccess,
                         OUTPUT po-cError).

IF NOT po-lSuccess THEN
                           UNDO UpdateBLOCK, RETRY UpdateBLOCK.
----
END.
-----
END.

/* create-block */
for each ttcustomer where not ttcustomer.found:
----
NEWBLOCK:
DO TRANSACTION ON ERROR UNDO NEWBLOCK, RETRY NEWBLOCK:
if retry then do:
-- undo, leave ...
end.
---
FOR EACH ttEmailRec
               WHERE ttEmailRec.CustomerId = ttCustomer.CustomerId
                 AND NOT ttEmailRec.ErrorFlag
                     BREAK BY ttEmailRec.CustomerId:  /* same break-by again */
----
----
/* Create batch when first-of etc */
----
EMAIL-BLOCK:
RUN Create-CustomerEmail
                        (INPUT-OUTPUT iRecPtr,
                         INPUT-OUTPUT bCustomer.MainEmailRecPtr,
                         INPUT  bCustomer.CustomerId,
                         INPUT  ttEmailRec.EmailAddress,
                         INPUT  ttEmailRec.Description,
                         INPUT  {fn getuserId},
                         INPUT  bImportHeader.DataSourceId,
                         INPUT  LAST (ttEmailRec.CustomerId),
                         BUFFER bEmailRec,
                         OUTPUT po-lSuccess,
                         OUTPUT po-cError).
IF NOT po-lSuccess THEN
                           UNDO NEWBLOCK, RETRY NEWBLOCK.
----
END.
----
end.
etc.
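One detail worth a look in the sample above: FOR LAST builds (and may sort) a result set before handing back the last record, whereas FIND LAST can walk backwards down an index. A hedged sketch, assuming EmailRec has an index leading on CustomerId - the buffer and field names are taken from the post above:

Code:
/* instead of:
   FOR LAST bEmailRec NO-LOCK
       WHERE bEmailRec.CustomerId = hCustomer:BUFFER-FIELD ('CustomerId'):BUFFER-VALUE:
       iRecPtr = bEmailRec.RecPtr.
   END.
*/
FIND LAST bEmailRec NO-LOCK
    WHERE bEmailRec.CustomerId = ttCustomer.CustomerId NO-ERROR.
IF AVAILABLE bEmailRec THEN
    iRecPtr = bEmailRec.RecPtr.

If the real bottleneck is inside Create-CustomerEmail this won't fix it, but a per-row FOR LAST over a growing email table is a classic way for cost to climb with volume.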
 

TomBascom

Curmudgeon
Your code sample might be readable, even with the horrid UPPERCASE KEYWORDS and vile Hungarian notation, if you enclosed it in CODE tags. But, as is, I can’t be bothered.
 

Shail

New Member
This code doesn't have any issues - the problem lies within the sub-procedure, of which only the call is present.
Even so, thanks for your comments.
 
Hi,
I notice several potential performance issues in code like "for each ttcustomer where not ttcustomer.found".
Even if there is an index on the field found, the query will not use it => whole-index scan.
You should use "for each ttcustomer where ttcustomer.found = false".
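To make the bracketing difference concrete - a minimal sketch; whether an index is really used still depends on the temp-table's actual index definitions, and compiling with the XREF option will show any query that resolves to WHOLE-INDEX:

Code:
/* NOT ttcustomer.found cannot be bracketed against an index:
   every record is read and the predicate applied afterwards */
for each ttcustomer where not ttcustomer.found:
end.

/* an equality match on the same field can bracket an index on "found" */
for each ttcustomer where ttcustomer.found = false:
end.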

You could also have a performance issue on reads of your temp-table, especially if there is not enough room for all the temp-table records in memory (-Bt, -tmpbsize): you will then do disk reads on your temp-table's file.


Patrice
 

tamhas

ProgressTalk.com Sponsor
Try putting the code in code tags and adding a comment at the place of the problem line.
 

Stefan

Well-Known Member
Code:
/* why break-by ? - probably for the LAST( ), but won't hurt, only a small table for 1 customer*/
Let's just assume that this is not entirely true. :D
 

Rob Fitzpatrick

ProgressTalk.com Sponsor
Looking at data from the temp-table VSTs after an upload run could be interesting. There is sample code for doing this in the free ProTop download.
 