Salutations!
Issue is simple.
I have a ~25GB archive I want to extract.
I start extracting it,
the PC fans start going crazy,
and the computer starts freezing and stuttering everywhere,
despite me having an 8th-gen i5 and 16 GB of DDR4 RAM.
So I go check htop,
and p7zip is using 100% of ONLY ONE CPU core to extract the zip.
I have no idea what to do. Any ideas?
Thank you!
Both depressed, And obsessed with tech!
That's me
-DavidRLTG
Offline
DavidRLTG wrote: I have a ~25GB archive I want to extract.
I start extracting it,
The PC fans start going crazy
I don't know all the ins and outs of file compression, and I have seen some strange behaviour as well. Often creating an archive goes more smoothly than extracting one. That is a large file, even for a zip or tar. Have you tried another archive manager? They are not all created equal. I use both Engrampa and file-roller.
I have gotten away from extremely large archives, other than an ISO; it's easier to transfer smaller portions from one location to another. I'm sure you will get other help, feedback, and support, and surely there are other techniques for what you are attempting.
zephyr
CROWZ
easier to light a candle, yet curse the dark instead / experience life, or simply ...merely exist / ride the serpent / molon labe / III%ers / oath keepers
Offline
What I would do is install cpufrequtils (dep: libcpufreq0), a tiny program,
and run: cpufreq-set -u 800000
Take a look at: cpufreq-info
I've done the simple config files /etc/default/cpufrequtils and /etc/default/loadcpufreq and have it at 80% all the time.
Very handy on the fly is: cpufreq-set -u
It's not difficult to destroy CPU, memory, or storage.
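For what it's worth, a rough sketch of that workflow on a Debian/Devuan-style system (the 2.4 GHz cap is just an example value, check cpufreq-info for the frequencies your i5 actually supports):
$ sudo apt install cpufrequtils
$ cpufreq-info                        # current governor plus available frequency steps, per core
$ sudo cpufreq-set -c 0 -u 2400MHz    # cap core 0; repeat with -c 1, -c 2, -c 3 for the other cores
A lower cap keeps the fans and temperatures down, at the cost of a slower extraction.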
Offline
In a terminal, either of these commands will extract the archive's contents into the current directory:
$ 7z x /path/to/arch.zip
$ unzip /path/to/arch.zip
Multithreaded compression is a trick that zstd does.
I don't use it very often, since the command-line switches and defaults take some getting used to, but (de-)compression works nicely.
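For example, something like this spreads the packing work across all cores (bigdir and the file names are just placeholders; -T0 means "use every available thread", and the threading mostly helps compression, since decompression stays largely single-threaded):
$ tar -cf - bigdir | zstd -T0 -o bigdir.tar.zst   # multithreaded compression
$ zstd -dc bigdir.tar.zst | tar -xf -             # decompress and unpack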
Offline
Not sure if this will help.
Look at the memory usage when extracting a file from an archive.
The only time I saw 16 GB fully used on my computer was when copying a 30 GB file; Linux was using the memory for buffering.
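If you want to watch that live while the archive is being unpacked, something simple like this works (the one-second interval is arbitrary):
$ watch -n 1 free -m   # buff/cache (buffers + cached on older versions) is the memory Linux grabs for file buffering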
Offline
Maybe the answer lies in one of these - https://duckduckgo.com/?t=ftsa&q=extrac … nux&ia=web
Offline
I once had a problem writing an image larger than RAM with the dd command while also having /tmp mounted in memory. The write just froze. When I turned the tmpfs off, the problem disappeared.
Later, with newer kernels, such errors did not occur.
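A quick way to check whether something similar is happening here, i.e. whether /tmp (or wherever the archive is being unpacked to) is actually a RAM-backed tmpfs:
$ findmnt -T /tmp   # if the FSTYPE column says tmpfs, /tmp lives in RAM
$ df -h /tmp        # and this shows how much of it is already used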
Offline
Could the system be overheating? Check for dust in the CPU cooler, vents, and so on. Going over the system with a vacuum cleaner can make quite a difference, as can removing obstructions to airflow.
You could run watch sensors in one window to see how hot the system is getting.
Or free -mt or vmstat to see how much paging it's doing.
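For example (sensors needs the lm-sensors package installed and configured; the intervals are arbitrary):
$ watch -n 2 sensors   # CPU temperatures, refreshed every 2 seconds
$ free -mt             # memory plus swap totals, in MiB
$ vmstat 5             # the si/so columns show swap-in/swap-out per 5-second sample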
Offline