Error 303 occurred when running a treatment

Sacher

Hello,

I'm not sure this is the right forum, but I hope somebody can help me.

I am running a Progress procedure. It reads an input file and uses its data to write another file. During the treatment, the following error occurred: "UNIX maximum file size exceeded. <program>. (303)".

But my input file is only 122 MB, and the procedure stops writing when the output file reaches 76 MB. I don't understand why this error occurred or what I can do to fix the treatment so that the whole output file gets written.

Thank you for your help.

Marilyn
 
Please provide us with the version of Progress and the flavour of Unix you're using. Also, a snippet of your code might help (use the Code tags).
 
OK, sorry, I forgot.
I am using Progress version 9.1B, and my OS is Sun Solaris 5.10.

My treatment reads an input file and uses its data to write another file. The input file is 122 MB. During the process, error 303 occurred ("** UNIX maximum file size exceeded. <program>. (303)") and the writing of the output file stopped.
The output file is only 77 MB.

Marilyn
 
9.1B is, of course, ancient, obsolete and unsupported. You should upgrade.

You might also want to check the "ulimit -a" command and see if you have a file size limit being imposed on you.

What is a "treatment"?
 
Yes, I know, but I didn't choose the Progress version.
I ran ulimit -a and the result is:
-t: cpu time (seconds) unlimited
-f: file size (blocks) unlimited
-d: data seg size (kbytes) unlimited
-s: stack size (kbytes) 8192
-c: core file size (blocks) unlimited
-n: file descriptors 256
-v: virtual memory size (kb) unlimited

By treatment, I mean a .p program.

Marilyn
 
Two wild guesses:

1) Perhaps the file whose size limit is being exceeded is not the output file; it's some other file that you don't directly know about, such as a temporary sort file or even a database file (you didn't say whether the treatment does any writing other than to the output file). You could use 'find' to hunt for files larger than 1 GB, say, and check whether anything recent shows up (a sketch follows at the end of this post).

2) Perhaps the file was created when the OS file size limit was larger than it is today. However, 77M seems like a crazily small maximum file size, even if we rewind to 1975, when v9.1b was still a hot seller and you couldn't fit an entire corporate database on an SD card.

FYI, many systems have size limits at 2 GB (due to file pointer sizes being too small), including a lot of JavaLand... I've lost track of how Progress is affected by this, though; did they fix this limit a while back? I think so.
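
For the hunt in guess 1), something along these lines might do (the search path is just an example; Solaris find counts -size in 512-byte blocks, so +2097152 is roughly 1 GB):

Code:
# hypothetical working directory; point it wherever the treatment runs
# -size +2097152 = over ~1 GB in 512-byte blocks; -mtime -1 = modified in the last day
find /path/to/workdir -type f -size +2097152 -mtime -1 -exec ls -l {} \; 2>/dev/null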
 
I think it most likely that the error is not directly related to your output file size but rather to a temp file such as the SRT file or the DBI file (these files are written to the directory specified by -T). To *see* those files you need to enable -t (lowercase "t").

9.1B has a 2 GB limit on such files. More modern releases permit "large files".
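
For instance, a minimal sketch of starting a session so those files stay visible (the database name, procedure name, and temp directory here are hypothetical; _progres is the usual Unix client executable):

Code:
# -T points the temp files (srt*, DBI*, lbi*) at a directory you can watch;
# -t keeps them visible on disk instead of being unlinked as soon as they're opened
_progres mydb -p treatment.p -T /work/tmp -t

Then, from another terminal, an "ls -l /work/tmp" while the treatment runs will show which temp file is creeping toward the 2 GB limit.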
 