Forum Post: Re: Performance Degradation After Dump&load

  • Thread starter: George Potemkin
Status
Not open for further replies.

George Potemkin (Guest)
> RECORD-LENGTH of template record is 236

Are you sure? Dbanalys says that the min record size is 210 bytes. And it was a database after load, hence schema versioning does not apply here. The size reported by dbanalys is 2 bytes higher than the value returned by the RECORD-LENGTH() function. That means your table has at least one record with a RECORD-LENGTH of 208 bytes.

> there is one field which takes about 1/4 of record size

I expected a bigger contribution. How large was the record or field? Max record size is 1667 bytes. I'd check the records of 1 KB or larger.

> Why it's important? And how did you know that?

There are two types of tables ;-) : the ones where (mean - min) is close to (max - mean), and the ones where log(mean/min) is close to log(max/mean). Your table seems to belong to the second type. The difference is the distribution of record sizes.

> That means, that during normal production when a newly created record is allocated it takes at least 236 bytes, even if it's not filled yet, right? If the average record size is around 400, that means it will afterwards grow by around 170 bytes. That means the create limit should be at least 170, am I right?

Most likely you're right. But I would also check _TableStat, namely the ratio of _Table-create to _Table-update. If it's 1:1 with high precision, then you're 100% right. Or check the slots used by the fragmented records in your production database (reported by AreaDefrag). I guess you will see slots 15, 16 or so, and no slots with low numbers.
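The two-types heuristic above can be sketched in a few lines. This is only an illustration, not anything dbanalys itself does: it compares how symmetric the mean is on an additive scale (mean - min vs. max - mean) against how symmetric it is on a logarithmic scale, and picks whichever fit is closer. The sizes fed in below are the figures mentioned in this thread (min 210 and max 1667 from dbanalys, mean ~400 from the poster).

```python
import math

def distribution_type(min_size, mean_size, max_size):
    """Classify a record-size distribution per the heuristic above:
    'additive' if the mean sits near the arithmetic midpoint of min..max,
    'multiplicative' if it sits near the geometric (log) midpoint."""
    # How far the mean is from the arithmetic midpoint, normalized to the range.
    additive_score = abs((mean_size - min_size) - (max_size - mean_size)) / (max_size - min_size)
    # Same idea on a log scale, normalized to the log-range.
    multiplicative_score = (
        abs(math.log(mean_size / min_size) - math.log(max_size / mean_size))
        / math.log(max_size / min_size)
    )
    return "additive" if additive_score < multiplicative_score else "multiplicative"

# Figures from the thread: the log-scale fit wins, i.e. the "second type".
print(distribution_type(210, 400, 1667))  # -> multiplicative
```

With min 210, mean 400, max 1667 the mean is far below the arithmetic midpoint but close to the geometric one, which is why the table lands in the second type.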

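The create-limit arithmetic from the last exchange can be checked directly. Both numbers come from the thread (236 is the template RECORD-LENGTH the poster reported, 400 is the approximate average record size); the sketch just makes the subtraction explicit.

```python
# Rough create-limit check using the figures discussed in the thread.
template_record = 236   # RECORD-LENGTH of the template record (poster's value)
mean_record = 400       # approximate average record size (poster's value)

expected_growth = mean_record - template_record
print(expected_growth)  # -> 164, so a create limit of ~170 leaves a little headroom
```

This only holds if records are created small and updated once, which is exactly why the reply suggests confirming a ~1:1 _Table-create to _Table-update ratio in _TableStat first.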