Break a large file into smaller pieces
You can use split to cut the file into pieces and cat to reassemble them. For example:
$ split --bytes 500M --numeric-suffixes --suffix-length=3 foo foo.
(where the input filename is foo and the last argument, foo., is the output prefix). This will create files like foo.000, foo.001, ...
The same command with short options:
$ split -b 500M -d -a 3 foo foo.
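A self-contained sketch of the splitting step, assuming GNU coreutils; the file name demo_input and the sizes are arbitrary stand-ins, not anything from the commands above:

```shell
# Create 1 MiB of test data (GNU head accepts size suffixes like 1M).
head -c 1M /dev/urandom > demo_input
# Split it into pieces of at most 300 KiB with 3-digit numeric suffixes.
split -b 300k -d -a 3 demo_input demo_input.
# 1 MiB / 300 KiB yields four pieces: .000 .001 .002 .003
ls demo_input.0*
```

The first three pieces are exactly 300 KiB each; the last holds the remainder.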
You can also use "--line-bytes" instead of "--bytes" if you want split to cut at line boundaries rather than after an exact number of bytes; each piece then holds at most the given number of bytes of complete lines.
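A small sketch of the --line-bytes behaviour (GNU split; the file name lines.txt is illustrative):

```shell
# 500 short numbered lines, roughly 1.9 kB in total.
seq 1 500 > lines.txt
# Pieces hold at most 1 KiB each, but lines are never cut in half
# (assuming no single line exceeds the 1 KiB limit).
split --line-bytes=1k -d -a 3 lines.txt lines.
# Each piece therefore ends on a newline character:
tail -c 1 lines.000 | od -An -c
```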
For re-assembling the generated pieces again you can use e.g.:
$ cat foo.* > foo_2
(assuming that the shell sorts the results of globbing, which POSIX shells do, and that the number of parts does not exceed the system-dependent limit on argument length)
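On POSIX systems you can inspect that argument-length limit yourself:

```shell
# ARG_MAX is the kernel's cap on the total size of arguments plus
# environment passed to a new process; the value varies per system.
getconf ARG_MAX
```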
You can compare the result via:
$ cmp foo foo_2
$ echo $?
(which should output 0)
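If you also want a fingerprint you can write down or send alongside the pieces, checksums work too. A sketch assuming sha256sum from GNU coreutils; orig and rejoined here are stand-ins for foo and foo_2:

```shell
# Two files with identical content, standing in for foo and foo_2.
printf 'some file content\n' > orig
cp orig rejoined
# Identical content yields identical hashes.
sha256sum orig rejoined
cmp -s orig rejoined && echo identical
```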
Alternatively, you can use a combination of find/sort/xargs to re-assemble the pieces:
$ find . -maxdepth 1 -type f -name 'foo.*' | sort | xargs cat > foo_3
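A more robust variant of that pipeline, assuming GNU find, sort, and xargs: NUL-separated names survive whitespace or other unusual characters in filenames. The foo.* pieces are recreated here so the sketch is self-contained:

```shell
# Recreate a couple of pieces so the pipeline below has input.
printf 'part-a' > foo.000
printf 'part-b' > foo.001
# -print0 / -z / -0 pass NUL-terminated names through the pipeline.
find . -maxdepth 1 -type f -name 'foo.*' -print0 | sort -z | xargs -0 cat > foo_3
cat foo_3
```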
You can also do this with Archive Manager if you prefer a GUI. Look under 'Save->Other Options->Split into volumes of'.