Error in Progress while loading a file. Please help to resolve.

jk.karthick

New Member
While loading a file in QAD, an error is generated in the log file "TMP504pt.log". It shows the error "SYSTEM ERROR: Attempt to define too many indexes. (40)"

Can anyone suggest a solution for this?

From my analysis: the error is thrown when uploading a file that has more than 80,000 records, of which more than 4,000 records need to be modified.

Could anyone please suggest a good solution for this?

Please, it's urgent.
 

joey.jeremiah

ProgressTalk Moderator
Staff member
Which version? Prior to 8.2A? The OS and any other useful info would also be helpful.

From what I gather, the problem is related to the database blocksize and may require changing the database structure.

I'd also bring it up at peg.com, or at least move the question to the DBA list (comments?).
 
joey.jeremiah said:
From what I gather, the problem is related to the database blocksize and may require changing the database structure.

Or it may be a disk space error, or Temp-Tables not being handled properly, or a couple of other things. There are several suggestions in the KB.
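
To make the temp-table point concrete: temp-tables are undoable by default, so a large load can write a lot of undo notes into the DBI temp file. Below is a minimal sketch of declaring a staging table NO-UNDO and clearing it between batches; the table and field names are made up for illustration, not taken from QAD.

```
/* A NO-UNDO temp-table generates no undo notes, which keeps the DBI
   temp file from ballooning during a big load. "tt-upload" and its
   fields are hypothetical names. */
DEFINE TEMP-TABLE tt-upload NO-UNDO
    FIELD part-num AS CHARACTER
    FIELD qty      AS DECIMAL
    INDEX idx-part part-num.

/* ... fill and process tt-upload one batch at a time ... */

/* Clear it between batches so it never holds all 80,000 rows at once.
   (On Progress 9 and later you could use EMPTY TEMP-TABLE instead.) */
FOR EACH tt-upload:
    DELETE tt-upload.
END.
```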

joey.jeremiah said:
or at least move the question to the DBA list (comments?).

Or QAD. But definitely not Comments.
 
jk.karthick said:
From my analysis: the error is thrown when uploading a file that has more than 80,000 records, of which more than 4,000 records need to be modified.
I don't know anything about QAD or how it works, but if it is building up large UNDO structures (e.g. temp-tables), that will fill up disk space. To rule that out, check how much space is allocated to the user, and see whether your temp files (e.g. the DBI file) are growing more than normal.
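
To make that concrete, here is a minimal sketch of grouping a large load into small transactions so the undo chain (and with it the DBI file) stays small. "upload.d" and "customer" are placeholder names, not the actual QAD load:

```
/* Commit every 100 records instead of holding one huge transaction. */
DEFINE VARIABLE i AS INTEGER NO-UNDO.

INPUT FROM VALUE("upload.d").

REPEAT TRANSACTION:
    DO i = 1 TO 100:
        CREATE customer.
        IMPORT customer.  /* end-of-file raises ENDKEY, which leaves the REPEAT */
    END.
END.

INPUT CLOSE.
```

Note that with the default ENDKEY handling the last, partial group of records is undone at end-of-file, so a real loader would handle that tail explicitly; the point here is only that each transaction, and hence the undo it generates, stays small.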

And, like Joey suggested, try the PEG, as there are a lot of very knowledgeable people there.
 

GauriShankar

New Member
Hi,

You must have encountered the same error I did; my case was the same as yours: loading a data file. It is caused by junk data in the data file you are trying to load. My suggestion is to look through the data file for junk data, especially records where the file supplies a much longer value than an indexed field in the record allows. Hopefully this should solve your problem. Do reply once it is solved this way, or if the problem persists.
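
If it helps, here is a minimal sketch of pre-scanning the file for over-long values before loading it. The file name, the delimiter, the field position, and the 30-character limit are all assumptions; check them against the real indexed fields in your schema.

```
/* Log every line whose (assumed) indexed field is longer than the
   schema allows, so the junk records can be fixed before the load. */
DEFINE VARIABLE cLine  AS CHARACTER NO-UNDO.
DEFINE VARIABLE cField AS CHARACTER NO-UNDO.
DEFINE VARIABLE iLine  AS INTEGER   NO-UNDO.

INPUT FROM VALUE("upload.d").
OUTPUT TO VALUE("badlines.log").

REPEAT:
    IMPORT UNFORMATTED cLine.
    iLine = iLine + 1.

    /* assume the indexed field is the first space-delimited token */
    cField = ENTRY(1, cLine, " ").

    IF LENGTH(cField) > 30 THEN
        PUT UNFORMATTED "Line " iLine ": indexed value too long ("
            LENGTH(cField) " chars): " cField SKIP.
END.

OUTPUT CLOSE.
INPUT CLOSE.
```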


Bye...

Gauri
 