Trying to resolve Input Data Too Large error

I'm trying to troubleshoot WebSpeed code that loads an item catalog from a .csv file that has been placed on the server. If the .csv file is larger than about 850 KB / 5,000 records, the program stalls and I see the following message in the server log:
P-064472 T-1751873344 1 WS -- (Procedure: 'ad/ad_mnt_itemupload.html' Line:2418) SYSTEM ERROR: Input data too large, try to increase -s or/and -stsh. (63)

The offending code is pasted below; can anyone give any advice on:
1. How to determine what values to use for -s (currently set to 32768) and for -stsh?
2. Where to make those changes?

Code:
        input from value(kFileToUpload).

        /* get rid of first line, column titles */
        import unformatted junk.

        repeat on error undo, leave:
            create tt_pt_mstr.
            import delimiter "," tt_pt_mstr.
        end.

        input close.
Any help would be greatly appreciated!

Thanks,

-Dan
 

Rob Fitzpatrick

ProgressTalk.com Sponsor
I can't help you with WebSpeed code. But if my stack size were 32 MB (enormous) and still too small, I'd be going back to the drawing board on the design.
 

TomBascom

Curmudgeon
I am suspicious that an input line is simply too large.

There is a maximum record size that IMPORT can deal with. I seem to recall that it is 3,000 characters per field. I would log the line number as your code runs and then check the input file at the point where it fails.
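A minimal sketch of that logging idea, reusing kFileToUpload, junk, and tt_pt_mstr from the code above; the iLine counter and the "UPLOAD" log subsystem name are just placeholders, and it assumes agent logging is enabled:

Code:
        /* iLine is a new diagnostic counter; everything else is assumed
           to be defined as in the original code */
        define variable iLine as integer no-undo initial 1.

        input from value(kFileToUpload).

        /* skip the header line */
        import unformatted junk.

        repeat on error undo, leave:
            iLine = iLine + 1.
            /* goes to the agent log when logging is enabled */
            log-manager:write-message("Importing csv line " + string(iLine), "UPLOAD").
            create tt_pt_mstr.
            import delimiter "," tt_pt_mstr.
        end.

        input close.

The last line number written before error 63 appears tells you which csv line to inspect.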
 

Cringer

ProgressTalk.com Moderator
Staff member
You can try using LOBs if your Progress version is new enough (it should be!), although that approach can have memory issues, so be careful.

Define a LONGCHAR variable and COPY-LOB your .csv into it, instead of reading it line by line. Then you can iterate over the data using the line-end character (which varies by OS) to get each line, and then split each line on the comma delimiter.

I've found in the past that this approach has quite a few advantages over IMPORT.
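A minimal sketch of that approach, assuming the kFileToUpload variable and tt_pt_mstr temp-table from the original post, Unix line endings, and made-up field names (part_no, part_desc):

Code:
        define variable lcData as longchar  no-undo.
        define variable cLine  as character no-undo.
        define variable iLine  as integer   no-undo.
        define variable iLines as integer   no-undo.

        /* pull the whole csv into memory in one go */
        copy-lob from file kFileToUpload to lcData.

        /* chr(10) assumes Unix line endings */
        iLines = num-entries(lcData, chr(10)).

        /* start at 2 to skip the header line */
        do iLine = 2 to iLines:
            cLine = entry(iLine, lcData, chr(10)).
            /* strip a trailing carriage return in case the file came from Windows */
            cLine = right-trim(cLine, chr(13)).
            if cLine = "" then next.

            create tt_pt_mstr.
            assign
                tt_pt_mstr.part_no   = entry(1, cLine)
                tt_pt_mstr.part_desc = entry(2, cLine).
        end.

One caveat: a plain ENTRY() split does not cope with quoted values that contain embedded commas, which IMPORT DELIMITER does handle, so check what the catalog data actually looks like first.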
 