Opened 15 years ago

Closed 15 years ago

#3874 closed bug (fixed)

Zipping many files bails with 'out of memory'

Reported by: zooey
Owned by: zooey
Priority: normal
Milestone: R1
Component: Applications
Version: R1/pre-alpha1
Keywords:
Cc:
Blocked By:
Blocking:
Platform: All

Description

If you run zip on a large set of files (in my case: ~40000 mails), the process size grows constantly (but moderately) and finally bails with an 'out of memory' error.

I did not follow the process precisely, but I doubt that the zip process exceeded 100 MB in size (on my 2 GB machine running Haiku natively). That should not be a reason for running out of memory, should it?

Anyway, since I can reproduce it here, I will look into it later today.

Change History (5)

comment:1 by mmadia, 15 years ago

I've seen this too, both with Mozilla 1.8's CVS code and when zipping Haiku's source directory. Both directories had been identified, e.g. by right-clicking the top-level directory and choosing "Identify".

comment:2 by bonefish, 15 years ago

This might have to do with how we reserve memory for areas. If the allocation that failed was a realloc() of a large chunk of memory, the heap area would have to be resized, which involves reserving memory for the new range. If most of the available memory is bound in block caches at that point, there might not be enough memory left to reserve. Our current, somewhat crude strategy is to trigger the low resource manager -- which tries to free caches and the like -- and wait for at most one second. If there's still not enough memory available after that, the operation simply fails. I.e. if the low resource manager is a bit slow, an area creation/resize can fail although it wouldn't really have to.
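
For illustration, a minimal sketch of that strategy (all names here are hypothetical stand-ins, not Haiku's actual kernel API; the real code lives in the VM and is not shown):

{{{
/* Sketch of the crude reservation strategy described above.
 * All names are hypothetical stand-ins. */
#include <stdbool.h>
#include <stddef.h>

#define B_OK        0
#define B_NO_MEMORY (-1)

/* Assumed helpers, standing in for the real VM/low-resource code. */
extern bool enough_memory_reservable(size_t bytes);
extern void low_resource_manager_trigger(void);
extern void wait_for_free_memory(long long timeoutUsecs);

static int
reserve_memory(size_t bytes)
{
	if (enough_memory_reservable(bytes))
		return B_OK;

	/* Ask the low resource manager to free block caches and the
	   like, then wait at most one second. */
	low_resource_manager_trigger();
	wait_for_free_memory(1000000LL);

	if (enough_memory_reservable(bytes))
		return B_OK;

	/* If the low resource manager was too slow, the area
	   creation/resize fails although it wouldn't really have to. */
	return B_NO_MEMORY;
}
}}}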

Anyway, the amounts of memory involved sound a bit off for that kind of problem. Normally one has one block cache block per inode. I'm not sure how BFS works with respect to attributes that don't fit the small data region of the file; I believe they get inodes of their own. But still, with 2 GB of memory one would need to use about 50 blocks per file to exhaust the memory, which doesn't sound particularly realistic.
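
The back-of-the-envelope arithmetic behind that estimate, assuming 1 KiB block cache blocks (the block size is my assumption, not stated above):

{{{
/* Rough check of the "about 50 blocks per file" figure.
 * The 1 KiB block size is an assumption. */
#include <stdio.h>

int main(void)
{
	const double memory    = 2.0 * 1024 * 1024 * 1024; /* 2 GB */
	const double fileCount = 40000;                    /* ~40000 mails */
	const double blockSize = 1024;                     /* assumed: 1 KiB */

	/* prints roughly 52.4 */
	printf("blocks per file to exhaust memory: %.1f\n",
		memory / (fileCount * blockSize));
	return 0;
}
}}}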

comment:3 by mmadia, 15 years ago

As a note, tar cjf <args> is able to compress both the Mozilla 1.8 CVS branch and haiku-hrev30629's SVN tree.

comment:4 by zooey, 15 years ago

Status: new → assigned

comment:5 by zooey, 15 years ago

Resolution: fixed
Status: assigned → closed

Should be fixed by hrev30653 - at least I can now zip up all my mails.

mmadia: please reopen if you still experience problems.
