How to find all files exceeding a certain size and delete them

Similar to the exec rm answer, but it doesn't need a separate process for each found file:

find . -size +100k -delete

One-liner:

find . -size +100k -exec rm {} \;

The first part (find . -size +100k) looks for all files, starting from the current directory (.), that exceed (+) 100 kilobytes (100k).

The second part (-exec rm {} \;) invokes the given command on every file that is found. {} is a placeholder for the current file name, including its path, and \; marks the end of the command.

Remember to always check that your filtering criteria are correct by running the raw find first:

find . -size +100k

Or, you might even make a backup copy before deleting:

find . -size +100k -exec cp --parents {} ~/backup \;

Python is installed on most Unix-based systems, so why not use it instead of Bash?

I always find Python more readable than awk and sed magic.

This is the Python code I would write:

import os

Kb = 1024          # 1 kilobyte (KiB) is 1024 bytes
Mb = Kb * Kb       # larger units, handy for other thresholds
Gb = Kb * Kb * Kb

# Only consider regular files; os.remove() would fail on directories.
for f in os.listdir("."):
    if os.path.isfile(f) and os.stat(f).st_size > 100 * Kb:
        os.remove(f)
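
In the same spirit as the advice above about running the raw find first, here is a minimal dry-run sketch (not part of the original answer) that only prints the candidates, so you can verify the threshold before deleting anything:

import os

Kb = 1024
threshold = 100 * Kb  # same 100 kB limit as above; adjust to taste

# Dry run: print each oversized regular file and its size instead of removing it.
for f in os.listdir("."):
    if os.path.isfile(f) and os.stat(f).st_size > threshold:
        print(f, os.stat(f).st_size, "bytes")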

And this is the one-liner version with python -c:

python -c "import os; [os.remove(f) for f in os.listdir('.') if os.path.isfile(f) and os.stat(f).st_size > 100*1024]"

And if you want to apply the search recursively, os.walk can walk subdirectories for you; see the sketch below.
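
Here is a minimal sketch of the recursive variant using os.walk (my own illustration of the idea, not part of the original answer):

import os

Kb = 1024

# Walk the tree rooted at "." and remove every regular file larger than 100 kB.
for dirpath, dirnames, filenames in os.walk("."):
    for name in filenames:
        path = os.path.join(dirpath, name)
        if os.path.isfile(path) and os.stat(path).st_size > 100 * Kb:
            os.remove(path)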