Create many files with random content
Since you don't have any other requirements, something like this should work:
#! /bin/bash
for n in {1..1000}; do
dd if=/dev/urandom of=file$( printf %03d "$n" ).bin bs=1 count=$(( RANDOM + 1024 ))
done
(This needs bash, at least for {1..1000}.)
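A minimal sketch of running that loop in a throwaway directory and checking the results (assumes bash for {1..N} and $RANDOM; the count is reduced to 20 files for speed, and status=none is GNU dd's way of silencing the transfer summary):

```shell
# Run a scaled-down version of the loop in a temporary directory.
tmp=$(mktemp -d)
cd "$tmp"
for n in {1..20}; do
  # Each file gets a random size: at least 1024 bytes,
  # at most 1024 + 32767 (the maximum of bash's $RANDOM).
  dd if=/dev/urandom of=file$( printf %03d "$n" ).bin \
     bs=1 count=$(( RANDOM + 1024 )) status=none
done
ls file*.bin | wc -l     # should report 20 files
wc -c < file001.bin      # size should be at least 1024 bytes
```

Note that bs=1 makes dd copy one byte at a time, which is slow; it is only tolerable here because the files are small.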
A variation with seq, xargs, dd and shuf:
seq -w 1 10 | xargs -n1 -I% sh -c 'dd if=/dev/urandom of=file.% bs=$(shuf -i1-10 -n1) count=1024'
Explanation, as requested in the comments:
seq -w 1 10
prints a sequence of numbers from 01 to 10, zero-padded to equal width
xargs -n1 -I%
runs the command sh -c 'dd ... % ...' once for each number, replacing the % with it
dd if=/dev/urandom of=file.% bs=$(shuf ...) count=1024
creates each file, fed from /dev/urandom, with 1024 blocks whose blocksize is given by
shuf -i1-10 -n1
a random value from 1 to 10
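Since each file's size is blocksize × count, with count=1024 and bs drawn from shuf -i1-10 -n1 every file ends up between 1024 and 10240 bytes, always a multiple of 1024. A scaled-down sketch (3 files instead of 10, in a temporary directory; status=none assumes GNU dd):

```shell
# Scaled-down run of the same pipeline in a temporary directory.
tmp=$(mktemp -d)
cd "$tmp"
seq -w 1 3 | xargs -n1 -I% sh -c \
  'dd if=/dev/urandom of=file.% bs=$(shuf -i1-10 -n1) count=1024 status=none'
# Every file is a multiple of 1024 bytes, between 1024 and 10240.
wc -c file.*
```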
This uses a single pipeline and seems fairly fast, but has the limitation that all of the files are the same size:
dd if=/dev/urandom bs=1024 count=10240 | split -a 4 -b 1k - file.
Explanation: Use dd to create 10240*1024 bytes of data; split that into 10240 separate files of 1 KiB each (names run from 'file.aaaa' upward through split's alphabetic suffixes).
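A quick way to convince yourself of what the pipeline produces is a smaller run: 16 KiB split into sixteen 1 KiB files (temporary directory and status=none are my additions; status=none assumes GNU dd):

```shell
# Scaled-down run: 16 blocks of 1024 bytes, split into 1 KiB files.
tmp=$(mktemp -d)
cd "$tmp"
dd if=/dev/urandom bs=1024 count=16 status=none | split -a 4 -b 1k - file.
ls file.* | wc -l    # 16 files, named file.aaaa, file.aaab, ...
wc -c < file.aaaa    # 1024 bytes each
```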