Bash script to delete files older than x days with subdirectories
`-type` option for filtering results

`find` accepts the `-type` option for selecting, for example, only files:
```
find /path/to/files -type f -mtime +10 -delete
```
Leave out `-delete` to show what it would delete, and once you have verified the list, go ahead and run the full command.
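A minimal dry-run sketch of that workflow, using a hypothetical throwaway directory (GNU `find` and `touch` assumed, for the `-d '15 days ago'` backdating):

```shell
# Hypothetical scratch tree with one "old" and one "new" file.
dir=$(mktemp -d)
touch "$dir/new.log"
touch -d '15 days ago' "$dir/old.log"   # GNU touch: backdate the mtime

# Dry run: print what would be deleted (no -delete yet)
find "$dir" -type f -mtime +10

# Verified the list? Now run the real thing.
find "$dir" -type f -mtime +10 -delete
```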
That would only act on files, not directories. Use `-type d` for the inverse, listing only directories that match your arguments.
Additional options
You might want to read `man find`, as there are more options you could need in the future. For example, `-maxdepth` lets you restrict results to a given depth; e.g. `-maxdepth 1` keeps `find` from descending into subdirectories.
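A small sketch of the depth restriction, with hypothetical file names:

```shell
# Hypothetical layout: one top-level file plus one file in a subdirectory.
dir=$(mktemp -d)
mkdir "$dir/sub"
touch "$dir/top.txt" "$dir/sub/nested.txt"

# Default: recurses, so both files are found
find "$dir" -type f

# -maxdepth 1: stays in the top directory, so only top.txt is found
find "$dir" -maxdepth 1 -type f
```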
Some remarks
I wonder how the command would have removed a folder, since you can't remove a folder with `rm` alone; you'd need `rm -r` for that.

Also, `/path/to/files*` is confusing. Did you mean `/path/to/files/`, or are you expecting the wildcard to expand to several file and folder names?

Put the `{}` in single quotes, i.e. `'{}'`, so the substituted file/directory name is not interpreted by the shell, just as we protect the semicolon with a backslash.
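The quoting advice can be sketched like this (hypothetical temp directory; note that `find` itself always hands each path to `rm` as a single argument, so the quotes only shield `{}` from the invoking shell):

```shell
# Hypothetical file whose name contains a space.
dir=$(mktemp -d)
touch "$dir/two words.tmp"

# '{}' is quoted so the shell passes it to find untouched; find then
# substitutes each matched path as one argument, spaces and all.
find "$dir" -type f -name '*.tmp' -exec rm '{}' \;
```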
As in the previous answers (+1 for both), the trick is to use the `-type f` predicate.

Note that instead of `-exec rm '{}'` you can also use the `-delete` predicate. But don't do that. With `-exec rm '{}'` you can (and should) first run `-exec echo rm '{}'` to verify that this is really what you want. After that, rerun the command without the `echo`.
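A sketch of that echo-first workflow, with hypothetical file names:

```shell
dir=$(mktemp -d)
touch "$dir/a.tmp" "$dir/b.tmp"

# Step 1: prepend echo — prints the rm commands instead of running them
find "$dir" -type f -name '*.tmp' -exec echo rm '{}' \;

# Step 2: output looks right? Drop the echo and delete for real
find "$dir" -type f -name '*.tmp' -exec rm '{}' \;
```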
Using `-delete` is faster (no extra `fork()` and `execve()` for each file), but it is risky, because `-delete` also works as a condition, so:
```
# delete *.tmp files
find . -type f -name '*.tmp' -delete
```
but if you merely swap the arguments:

```
# delete ALL files
find . -type f -delete -name '*.tmp'
```
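To see the gotcha safely, it can be reproduced in a throwaway directory (hypothetical names; never run this on real data). Because `-delete` is an action that always returns true and the expression is evaluated left to right, placing it before `-name` deletes every file that passes `-type f`:

```shell
# DANGER demo, confined to a temp directory.
dir=$(mktemp -d)
touch "$dir/keep.txt" "$dir/junk.tmp"

# -delete runs before -name filters anything: keep.txt is deleted too.
find "$dir" -type f -delete -name '*.tmp'

ls "$dir"   # empty: both files are gone
```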
If you ever need `find` and `rm` to work faster for tons of files, check out the `find ... | xargs ... rm` UNIX idiom.
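One sketch of that idiom, assuming GNU `find` and `xargs` for the null-delimited `-print0`/`-0` pair (hypothetical file names):

```shell
# xargs batches many paths into each rm invocation (far fewer fork/exec
# calls than -exec rm '{}' \;), and -print0/-0 makes names with spaces
# or newlines safe to pass through the pipe.
dir=$(mktemp -d)
touch "$dir/a.tmp" "$dir/b.tmp" "$dir/with space.tmp"

find "$dir" -type f -name '*.tmp' -print0 | xargs -0 rm -f
```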