Search results

  1. Extents

    I will try and let you know. Thanks a million, George Govotsis
  2. Extents

    Dear taqvia, Yesterday we created a new DB and started loading the .d files. This particular area has a couple of tables that have already been loaded since yesterday. The last table that remains to be loaded is the massive one. While we were loading the last table in this...
  3. Extents

    Dear All, After we finished a long and painful dump of a 95GB DB, we created a new DB with multiple areas as well as multiple extents. In one specific area we allocated 17 extents (16 fixed extents of size 2096960, plus 1). The reason for that was that we had a massive table belonging to this...
  4. Re: UNIX Maximum file size exceeded

    # oslevel
    5.3.0.0
    -bash-3.00$ ulimit -a
    core file size (blocks, -c) 1048575
    data seg size (kbytes, -d) 131072
    file size (blocks, -f) unlimited
    max memory size (kbytes, -m) 32768
    open files (-n) unlimited
    pipe size...
  5. Re: UNIX Maximum file size exceeded

    Dear Tom, The DB is our back-office system and contains multiple companies. I am using a script that scans the DB and extracts the data linked to the company I have specified in the script. In total there are 900 tables; a couple of them will be more than 2GB. If I use the dictionary I will...
  6. Re: UNIX Maximum file size exceeded

    Hi all, I am trying to dump .d files from an 85GB DB. After hours of work there is a huge table that exceeds the 2GB limit, and as a result I am getting "UNIX maximum file size exceeded. (303)". How can I bypass this file size limit? Thanks, George Govotsis
  7. Hi everybody

    Just to say hello to everybody: I just joined your community. Thanks, George Govotsis
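Several of the results above run into the per-process file-size limit when dumping a large table into a single .d file. A minimal shell sketch of the usual first check, assuming a POSIX shell; the units and whether the limit can be raised depend on the OS and on the hard limit set by the administrator:

```shell
# Show the current per-process file-size limit
# ("unlimited", or a count of 512-byte blocks on most systems).
ulimit -f

# Try to raise the soft limit for this shell and its children;
# the dump would then have to be started from this same shell.
ulimit -f unlimited
ulimit -f
```

If the hard limit cannot be raised, the usual workaround discussed in threads like these is to split the dump of the oversized table into several smaller .d files.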