How do I minimize disk space usage?
Here are some points you could start with:
Have a look at the packages installed on your system with
pacman -Q
and remove the ones you don't need. A good start may be to append the -t switch: "Restrict or filter output to packages not required by any currently installed package."
Clean pacman's package cache with
pacman -Sc
Always use
pacman -Rs
to also remove unused package dependencies.
To find "big files" and folders which use large parts of the disk, a nice addition to du is xdiskusage. This little tool lets you quickly browse your filesystem and see a graphical representation of the disk usage of each folder.
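A quick way to act on that query is to combine the listing and removal steps. This is a sketch built on pacman's standard options; review the output of the first command before piping it into the second:
# list orphans: packages installed as dependencies that nothing requires anymore
pacman -Qdtq
# remove those orphans together with their own unneeded dependencies
pacman -Qdtq | pacman -Rns -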
WARNING: These ideas are only for users who are well-versed in both Linux as a whole and Arch Linux.
If you're willing to tread into dangerous territory, you can slim a base Arch install down to less than 500MB installed. This requires doing some very dangerous things:
1. removing all unnecessary locales (already covered)
2. removing any firmware files not needed to run your system (from /usr/lib/firmware)
3. removing any kernel modules not needed to run your system (from /usr/lib/modules/...)
4. removing any .a files in /usr/lib (only if you never use the system to compile software; note: this includes using makepkg)
5. removing everything in /usr/include (only if you never use the system to compile software)
6. removing unneeded documentation from /usr/share/doc and /usr/share/info
7. (VERY BAD IDEA unless maybe for a server) removing man pages from /usr/share/man
8. (also a bad idea) removing unneeded terminal descriptors from /usr/share/terminfo and unneeded timezone files from /usr/share/zoneinfo
9. (DANGEROUS) running strip * on all folders containing executable binaries (/usr/bin and /usr/sbin)
10. (in extreme situations) using a tool such as upx to compress larger binaries (the Samba binaries lend themselves to this well, as they tend to be quite large since they are often compiled statically). Also note that using upx means the entire uncompressed binary must fit in RAM during execution, so be wary on systems with low RAM.
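As a sketch of that last point, assuming upx is installed and using smbd purely as an example of a large binary (keep a copy so you can roll back):
# keep an uncompressed copy somewhere safe first
cp /usr/bin/smbd /root/smbd.orig
# compress the binary in place
upx --best /usr/bin/smbd
# upx -d /usr/bin/smbd would decompress it again if needed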
Before you do ANY of this, MAKE A FULL BACKUP of your system. Linux thankfully makes this relatively easy: if you can attach and mount an external volume (e.g. a USB drive) you can do something like
cd / && tar -cf /mnt/usb/mySystem.tar /
to back up the entire system.
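A slightly safer variant of that backup, still with plain tar (the paths are just examples): --one-file-system keeps pseudo-filesystems and other mounts, including the USB drive itself, out of the archive.
# archive only the root filesystem, preserving permissions
tar -cpf /mnt/usb/mySystem.tar --one-file-system /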
Once again, note that I do not actually recommend doing the above (especially 7 through 9) unless you have in-depth knowledge, experience and understanding of Linux internals and Arch Linux. Playing with just about any of the files I've listed can damage a system in horrible ways, so you've been warned. If you don't know if your system needs a certain firmware file, module, etc. then do your research before you mess with it. (Be warned that removing kernel modules that your system needs can result in an unbootable system, or a system with no keyboard/network card/sound/display/etc. support, or all sorts of other unexplainable behavior.)
Also please note that any package upgrades can and will restore many of the files you remove above. If you do decide to go this route, you may wish to eventually script the removal of unneeded files and run your script after every major package upgrade. (Example: upgrading the kernel will bring back all the kernel modules, and will also pull in the linux-firmware package as a dependency, bringing back all of the firmware in /usr/lib/firmware.)
Finally, keep an eye on /var/log, as the journal files will grow over time. You can remove past journals but keep the current ones by doing something like
rm *@*.journal
in your journal folder.
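If the journals are managed by systemd-journald, its built-in vacuuming is a cleaner way to achieve the same thing (the limits below are only examples):
# keep at most 100M of archived journals
journalctl --vacuum-size=100M
# or drop archived journals older than two weeks
journalctl --vacuum-time=2weeks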
I've successfully run a NAS server off a 512MB Disk-on-Module device for months using these techniques, but they're still not for the faint of heart. (I've also used LinuxFromScratch to build a similar project in only 128MB of storage, but that's another story...)
EDIT/ADD:
Here are a few more methods you can use to try to get some extra space:
- Getting rid of libgo. The libgo library comes with gcc-libs and is, AFAIK, only used by applications written in the Go language. I can't think of any application I use that needs that library. On my system it's 40MB in size. When you're trying to slim way down, that's a lot of space. I've removed it from my "mini" installs with no ill effects on anything I do (but again, that's me, YMMV!)
- Shrinking libicudata.so. It's 27MB on my system. It's basically a ton of unicode/locale data compressed into a library object. There is a tool online that can make smaller versions of this file, but it hasn't been updated for the current version (and you can't use older files in newer releases). I haven't tried doing this by hand, but if you can figure out how, you can shave about 20-22MB off this file.
- If you use Python, you can save 37MB or so by removing the test library from python2 with
rm -r /usr/lib/python2.7/test
and about 66MB by removing it for python3 with
rm -r /usr/lib/python3.6/test
- Again for Python, you can get rid of the .pyo files and the .py files. The .pyo files are "optimized" files, but Python never really uses them. The .py files are the raw source code for the standard library. The only files Python normally reads when running Python code are the .pyc (python compiled) files.
cd /usr/lib/python2.7 && find . -name "*.pyo" -exec rm -v {} \;
and
cd /usr/lib/python3.6 && find . -name "*.pyo" -exec rm -v {} \;
- Removing unnecessary locale data. There's an AUR package called localepurge which automates this. Otherwise, you have to play around in /usr/share/locale. You need to keep your own locale and locale.alias. For me here in the US, keeping en_US and locale.alias and removing everything else shaved off about 80MB (a sketch follows below).
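A rough sketch of that manual locale cleanup, assuming a US English system (swap en_US for your own locale, and run the first command as a dry run before deleting anything):
# dry run: list what would be removed
find /usr/share/locale -mindepth 1 -maxdepth 1 ! -name 'en_US*' ! -name 'locale.alias'
# actually remove it
find /usr/share/locale -mindepth 1 -maxdepth 1 ! -name 'en_US*' ! -name 'locale.alias' -exec rm -rf {} +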
Now, what I want to see is a tool that analyzes your system and determines which kernel modules you need, and also which firmware files you need. That would be a nice way to "safely" clean out those folders...
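Until such a tool exists, a rough starting point is to ask the currently loaded modules which firmware files they reference. This sketch only covers hardware that is active right now, so treat its output as a lower bound rather than a complete list:
# firmware files referenced by currently loaded kernel modules
for m in $(lsmod | tail -n +2 | awk '{print $1}'); do
    modinfo -F firmware "$m" 2>/dev/null
done | sort -u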
Or, maybe someone should put together an Arch "distro" that uses uClibc or diet-libc or something similar. That might be a fun summer project. :-)
TLDR
journalctl --vacuum-size=100M #remove all logs, only retain 100mb
pacman -Scc #remove all package installation files (obsolete and current)
pacman -S bleachbit
bleachbit -c system.*
First, what's big on the system?
du -d1 -h / 2>/dev/null | sort -h
This shows a sorted list of the largest dirs in /
You can do two levels down:
du -d2 -h / 2>/dev/null | sort -h
My result is:
0 /proc
0 /sys
0 /tmp
12K /dev
12K /srv
16K /lost+found
632K /run
4.3M /boot
13M /opt
15M /etc
75M /root
93M /home
2.4G /var
3.2G /usr
221G /mnt
227G /
I ignore /mnt (because that's an external drive).
Two dirs stand out: /var and /usr.
Let's see what's inside:
du -d1 -h /var /usr 2>/dev/null | sort -h
Then, a little bit deeper:
du -d1 -h /var/log /usr/share /usr/lib /var/cache 2>/dev/null | sort -h
Let's start with the logs
I have 717MB in /var/log.
I'm not a fan of deleting directories randomly, so let's do it the clean way:
$ journalctl --disk-usage
Archived and active journals take up 728.7M on disk.
Let's leave only 100mb of logs:
journalctl --vacuum-size=100M
...
Deleted archived journal /var/log/journal/ba5391...b.journal (8.0M).
...
Vacuuming done, freed 616.6M of archived journals on disk.
More info on how to configure this is in the journald.conf man page.
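To make the limit permanent rather than vacuuming by hand, you can cap the journal size in journald's configuration (the 100M value is just an example):
# in /etc/systemd/journald.conf
[Journal]
SystemMaxUse=100M
Then apply it with systemctl restart systemd-journald.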
Packages
I have 660M in /var/cache/pacman. It was 1.8GB, but I ran pacman -Sc to remove cached packages that are no longer installed. Let's remove the rest:
pacman -Scc
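If you'd rather keep a small safety net than wipe the cache completely, the paccache script from the pacman-contrib package can do a partial cleanup (the numbers are examples):
# keep only the most recent cached version of each installed package
paccache -rk1
# remove every cached version of packages that are no longer installed
paccache -ruk0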
/usr/share/locale
A lot of users do remove it, or at least clean it up, but that might be a problem. Check out bleachbit (next paragraph).
Bleachbit
An automatic cleaner. It will delete a lot of stuff, but for me it was mostly locales.
$ pacman -S bleachbit
$ bleachbit -p system.*
Disk space to be recovered: 488.8MB
$ bleachbit -c system.*
You can look for more stuff to delete:
bleachbit --list
bleachbit -p thunderbird.*