Opened 10 months ago

Last modified 9 months ago

#18763 new bug

zip error: Out of memory (was adding files to zip file)

Reported by: un_spacyar
Owned by: nielx
Priority: normal
Milestone: Unscheduled
Component: Applications/Zip-O-Matic
Version: R1/beta4
Keywords:
Cc:
Blocked By:
Blocking:
Platform: All

Description

Hi. When creating a zip file from my /home directory, the zip command fails with the following error:

zip error: Out of memory (was adding files to zip file)

The command that I am trying to run is:

/boot> zip -ry home.zip home/

(This is the same command used by the Zip-O-Matic add-on. Using the Zip-O-Matic add-on also fails, but instead of showing the error message it just gets stuck, with no processor or hard disk activity.)

I have a /home folder of about 4 GB, with around 38,000 files.

This is surely a regression, because I was previously able to compress the home folder without issue. Unfortunately, I deleted my previous states, so I cannot identify the last good state, but my previous backup was created on November 24, 2023.

So the regression happened between November 24, 2023 and hrev57506 from January 9, 2024.

Haiku version: nightly hrev57506 (32-bit)
Zip package version: 3.0-4

Attachments (3)

ZIP_issue_try-1.png (181.2 KB ) - added by un_spacyar 9 months ago.
zip execution (first try)
ZIP_issue_try-2.png (153.8 KB ) - added by un_spacyar 9 months ago.
zip execution (second try)
ZIP_issue_try-3_ommit-fonts-folder.png (98.2 KB ) - added by un_spacyar 9 months ago.
zip execution (third time - excluding the home/downloads/fonts folder)


Change History (11)

comment:1 by waddlesplash, 10 months ago

It's possible that your home directory just grew in size, and that it was previously very close to the limit and has now crossed it.

comment:2 by un_spacyar, 10 months ago

Hello. Normally my /home directory is around 4 GB, and this is the first time I have faced this issue in several years. Also, zip fails at around 1.7 GB (it fails consistently: if I retry, it always fails at the same size). Previously I was able to create ZIP files of around 4 GB without issue.

comment:3 by un_spacyar, 10 months ago

Hi. I monitored the memory usage of the zip process (using Process Controller): the process grows to approximately 120 MB and then fails. It never manages to exhaust the memory available in the system.

comment:4 by madmax, 10 months ago (in reply to comment:2)

I've just zipped a ~9 GB directory with no problem in a VM I had with hrev57424. It also worked after updating to hrev57521. I also tried something smaller but with 70K+ files.

Replying to un_spacyar:

it fails consistently: if I retry, it always fails at the same size

Do you have enough free space for the new backup?

Is it also always the same file? Anything strange with it?

Can you extract your last backup and try to compress that to see if it is due to something new in your files?

Can you update to current? There were a few filesystem changes just before hrev57506 and a few more afterwards; maybe that is an unlucky version.

comment:5 by un_spacyar, 10 months ago

Hi madmax, thanks for the information. Here are my answers:

  • I have enough free space for the backup.
  • I tried extracting the last backup and compressing it again: I was able to compress it successfully (the folder had 4.3 GB and around 38,000 files). So, as you said before, the issue is probably related to some new files. I compared both /home/ folders:

-- The one from the successful backup: 4.3 GB, around 38,000 files.

-- The one from the failed backup: 3.92 GB, around 33,000 files.

I monitored the memory usage of the zip process for each of the /home/ folders: the one from the successful backup peaks at around 100 MB and finishes successfully; the one from the failed backup peaks at 130 MB and then fails.

My guess is that the 'memory limit' is somewhere around 130 MB (not sure why: I still have plenty of available RAM, but maybe it is some limitation of the zip application). I need to investigate why the failed attempt consumes more RAM. Maybe it has something to do with the filenames? I noticed that the memory usage grows faster when processing lots of small files, and stays stable when processing a few bigger files. Maybe the memory usage is related to storing file names and paths.
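To illustrate that guess: archivers like zip typically keep some per-entry metadata in memory (at least the path) for every file they add, so usage grows with the number of files and the length of their paths rather than with file sizes. Below is a minimal C sketch of the idea (illustrative only, not the actual Info-ZIP data structures; the struct and function names are made up):

/* Simplified sketch of per-entry metadata kept in memory by an
 * archiver; names here are illustrative, not Info-ZIP's. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct entry {
    char *name;          /* copy of the path to be stored in the archive */
    struct entry *next;  /* entries form a singly linked list */
};

static struct entry *head = NULL;

/* Every scanned file adds one node plus a copy of its path, so total
 * memory grows with the file count and the path lengths. */
static int add_entry(const char *name)
{
    struct entry *e = malloc(sizeof *e);
    if (e == NULL)
        return -1;              /* this is the out-of-memory case */
    e->name = strdup(name);
    if (e->name == NULL) {
        free(e);
        return -1;
    }
    e->next = head;
    head = e;
    return 0;
}

int main(void)
{
    /* Roughly 38,000 paths of ~60 bytes plus node overhead is only a
     * few MB of metadata, far below the ~130 MB observed here.
     * The list is intentionally never freed in this tiny sketch. */
    for (int i = 0; i < 38000; i++) {
        if (add_entry("home/downloads/fonts/example.ttf") != 0) {
            fprintf(stderr, "allocation failed after %d entries\n", i);
            return 1;
        }
    }
    return 0;
}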

Last edited 10 months ago by un_spacyar (previous) (diff)

comment:6 by madmax, 10 months ago

There's no such low memory limit. I've just downloaded the same hrev57506 you are using and zipped a 7.5 GB tree with 377,238 files with no issue. zip's memory usage went over 250 MB.

zip shows which file it is working on, something like "adding: home/bla/foo" and then "(deflated 37%)" when it has processed it. Please check whether it always fails on the same one, or try to zip a subset of subdirectories or files to see if it is a problem with one specific file. Maybe even run checkfs.

comment:7 by un_spacyar, 9 months ago

Hi. Sorry for my late answer. I tried the suggestion from the last comment. I executed the following command from Terminal:

zip -ry home1.zip home/

I executed it twice, to check whether the process fails at the same file. It did not fail at exactly the same file, but both runs failed inside a folder named /home/downloads/fonts/ (see pictures ZIP_issue_try-1.png and ZIP_issue_try-2.png).

I tried a third time. This time, I omitted the /home/downloads/fonts folder, using this command:

zip -ry home1.zip home/ -x "./home/downloads/fonts/*"

This third time, it failed inside the /home/downloads/icons/ folder (see picture ZIP_issue_try-3_ommit-fonts-folder.png).

by un_spacyar, 9 months ago

Attachment: ZIP_issue_try-1.png added

zip execution (first try)

by un_spacyar, 9 months ago

Attachment: ZIP_issue_try-2.png added

zip execution (second try)

by un_spacyar, 9 months ago

Attachment: ZIP_issue_try-3_ommit-fonts-folder.png added

zip execution (third time - excluding the home/downloads/fonts folder)

comment:8 by madmax, 9 months ago

To recap:

  • That you can compress your old backup points to this not being a regression.
  • The memory used by zip does not seem to skyrocket.
  • The error is indeed caused by malloc returning NULL for a struct which should be less than 200 bytes long, not by some hardcoded limit.
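For reference, the ticket's error text is what zip prints when such an allocation fails. A simplified C sketch of that failure path (not an excerpt from the Info-ZIP sources; the error-code value and helper name are approximations):

#include <stdio.h>
#include <stdlib.h>

#define ZE_MEM 4   /* out-of-memory error code; the value here is illustrative */

/* Print "zip error: <reason> (<context>)" and exit; this sketch
 * hardcodes the out-of-memory reason string. */
static void ziperr(int code, const char *context)
{
    fprintf(stderr, "zip error: Out of memory (%s)\n", context);
    exit(code);
}

int main(void)
{
    /* The failing allocation is for a small per-entry struct
     * (reportedly under 200 bytes), not a huge buffer. */
    void *entry = malloc(200);
    if (entry == NULL)
        ziperr(ZE_MEM, "was adding files to zip file");
    free(entry);
    return 0;
}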