mysqldump to a tar.gz
Solution 1:
mysqldump --opt <database> | gzip -c | ssh user@wherever 'cat > /tmp/yourfile.sql.gz'
You can't feed tar a stream on a pipe like this, and you don't need to anyway: mysqldump produces a single stream, and tar is only useful when you have multiple files to bundle.
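To illustrate the point: gzip compresses a single stream end to end, which is all mysqldump emits. A minimal round-trip with a stand-in stream (printf here is just a placeholder for mysqldump):

```shell
# gzip handles one stream end to end; no tar needed.
printf 'CREATE TABLE t (id INT);\n' | gzip -c > dump.sql.gz
gunzip -c dump.sql.gz        # prints the original SQL back
rm dump.sql.gz
```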
Solution 2:
If you are running this locally, just use the following command to back up your database and compress it with gzip (`-p` with no value makes mysqldump prompt for the password):
mysqldump -u userName -p yourDatabaseName | gzip -c > output.gz
(Edit: fixed the -c flag)
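Not part of the original answer, but worth doing before you trust the backup: `gzip -t` checks the archive for corruption or truncation without extracting it. A sketch using a stand-in dump in place of real mysqldump output:

```shell
# Create a sample archive the same way the answer does, then verify it.
printf 'SELECT 1;\n' | gzip -c > output.gz
gzip -t output.gz && echo "output.gz OK"   # -t tests integrity without extracting
rm output.gz
```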
Solution 3:
Use a named pipe.
mkfifo mysql_pipe
gzip -9 -c < mysql_pipe > name_of_dump.gz &
mysqldump database > mysql_pipe
wait    # let the background gzip finish before removing the pipe
rm mysql_pipe
I use it all the time; it's awesome.
http://en.wikipedia.org/wiki/Named_pipe
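The mechanism can be sketched with a stand-in producer (printf in place of mysqldump), including a `wait` so the background gzip is done before the pipe is removed:

```shell
# Demo of the named-pipe pattern with a stand-in producer.
mkfifo demo_pipe
gzip -9 -c < demo_pipe > demo.gz &   # reader: compresses whatever arrives
printf 'SELECT 1;\n' > demo_pipe     # writer: stands in for mysqldump
wait                                 # let the background gzip finish
rm demo_pipe
gunzip -c demo.gz                    # prints: SELECT 1;
rm demo.gz
```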
Solution 4:
I wrote a quick one-liner to pull down a remote MySQL database. It uses MySQL protocol compression, gzip, and ssh compression, and pulled a multi-GB database at an incredible rate.
ssh -C user@host "mysqldump --opt --compress database <table> | gzip -9 -c" > outputfile.sql.gz
A side benefit is that it requires no free space on the source database server, so you can use it to back up a database on a server with zero free disk space before going in and pruning your data.
Hope it helps somebody.
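The trick in this one-liner is that all the compression runs on the remote side and only the final redirect touches the local disk. A local simulation of that shape, with `sh -c` standing in for `ssh user@host` and printf standing in for mysqldump:

```shell
# Local simulation of the pull: the quoted command runs "remotely",
# and only the redirected output lands on the local disk.
sh -c "printf 'INSERT INTO t VALUES (1);\n' | gzip -9 -c" > outputfile.sql.gz
gunzip -c outputfile.sql.gz   # prints the stand-in dump
rm outputfile.sql.gz
```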
Solution 5:
Use pv and monitor the dump rate!
mysqldump prod_db -h dbslave | pv | gzip -c > prod_2012_08_20.dump.sql.gz
Or, if you know the size (3GB), get an accurate estimate:
mysqldump prod_db -h dbslave | pv -s 3g | gzip -c > prod_2012_08_20.dump.sql.gz
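A stand-in demo of the same pipeline (assumes pv is installed; /dev/zero replaces mysqldump so it runs anywhere): pv passes the data through on stdout and prints the progress bar on stderr, and `-s` gives it the expected total so the ETA is meaningful.

```shell
# 1 MiB of stand-in data through pv, then gzip, exactly like the dump pipe.
head -c 1048576 /dev/zero | pv -s 1m | gzip -c > demo.dump.sql.gz
gzip -t demo.dump.sql.gz && echo "demo.dump.sql.gz OK"
rm demo.dump.sql.gz
```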