Best way to transfer files over a LAN between two Linux computers
In a Linux environment, SSH is the best way to go, for both security and ease of use. SSH, SSHFS, SCP, and SFTP, as you list them, are all just different services built on top of the SSH protocol. SCP is very easy to use: it works just like cp, but you can include user and machine names in the path (note that both need -r to copy a directory). So, where we might run cp -r ~/music/ ~/newmusic/
locally, we could just as easily run scp -r ~/music/ user@host:~/newmusic
to send it to the computer named host. That's it - we don't need to set anything up. You'll be prompted for the account password on the other machine if you don't have key-based or some other authentication set up (scp shares those settings with ssh, of course).
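If you copy files often, key-based authentication gets rid of the password prompt. A minimal sketch, assuming user@host is a placeholder for your remote account and using an example key path (the usual default is ~/.ssh/id_ed25519):

```shell
# Generate a dedicated key pair with no passphrase (example path; remove
# any leftover demo key first so ssh-keygen doesn't prompt to overwrite).
rm -f /tmp/lan_demo_key /tmp/lan_demo_key.pub
ssh-keygen -t ed25519 -f /tmp/lan_demo_key -N "" -q

ls /tmp/lan_demo_key /tmp/lan_demo_key.pub

# Install the public key on the remote machine (user@host is a placeholder):
#   ssh-copy-id -i /tmp/lan_demo_key.pub user@host
# After that, point scp at the key and there is no password prompt:
#   scp -i /tmp/lan_demo_key -r ~/music/ user@host:~/newmusic
```

With the default key path you can drop the -i flags, since ssh and scp pick up ~/.ssh/id_ed25519 automatically.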
SFTP is a tool that makes it easy to do a lot of operations on a remote file system - it works much like FTP, but it runs over SSH, so it's secure and requires only an SSH server. man sftp
will tell you all about how to use it. I wouldn't use SFTP just to move a folder between two machines; it's more useful when you have a lot of operations to do, like rearranging files on another computer.
SSHFS just extends SFTP into a file system: it allows you to mount a remote host into your local file system, so the network activity happens totally transparently. SSHFS is for semi-permanent setups, not one-time file transfers. It takes some more effort to set up, which you can read about on the project website.
If you need to work in a mixed-OS environment, Samba is your next best bet. Windows and OS X support Samba out of the box, and Linux does as well, although it can be rough to use there.
My personal favorite for cases where security doesn't matter is netcat + tar:
To send a directory, on the sending computer, cd into the directory whose contents you want to send and run:
tar -cz . | nc -q 10 -l -p 45454
On the computer receiving the contents, cd to where you want the contents to appear and do:
nc -w 10 $REMOTE_HOST 45454 | tar -xz
Replace $REMOTE_HOST with the IP or hostname of the computer doing the sending. You can also use a different port instead of 45454.
What's actually happening here is that the 'receiving' computer is connecting to the sending computer on port 45454 and receiving the tar'd and gzip'd contents of the directory, and is passing that directly to tar (and gzip) to extract it into the current directory.
Quick example (using localhost as a remote host)
Computer 1
caspar@jumpy:~/nctest/a/mydir$ ls
file_a.txt file_b.log
caspar@jumpy:~/nctest/a/mydir$ tar -cz . | nc -q 10 -l -p 45454
Computer 2
caspar@jumpy:~/nctest/b$ ls
caspar@jumpy:~/nctest/b$ nc -w 10 localhost 45454 | tar -xz
caspar@jumpy:~/nctest/b$ ls
file_a.txt file_b.log
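Stripped of the network hop, the pipeline above is just tar streaming a gzipped archive to stdout on one end and extracting it from stdin on the other, which you can try locally. A minimal sketch (the /tmp paths are made up for the demo):

```shell
# Set up a throwaway source directory with one file.
rm -rf /tmp/nctest
mkdir -p /tmp/nctest/src /tmp/nctest/dst
echo hello > /tmp/nctest/src/file_a.txt

# Stream the directory as a gzipped tar and extract it elsewhere -
# exactly what the nc pair does, minus the network hop.
tar -cz -C /tmp/nctest/src . | tar -xz -C /tmp/nctest/dst

ls /tmp/nctest/dst   # -> file_a.txt
```

Swapping nc in on either side of that pipe is all the network version adds.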
For one-time moves, scp is recommended.
But if the directory changes over time and you need to move it repeatedly to keep the other side updated, then you can use rsync (with ssh).
Since rsync has a lot of arguments, I usually put it in a small shell script so I get it right every time. The idea is to only send things that have changed since the last time it ran.
#!/bin/bash
user="nisse"
host="192.168.0.33"
echo "Sync: /home/media/music/"
# --archive preserves permissions and timestamps, --delete removes anything
# on the target that no longer exists locally, and -e runs rsync over ssh.
rsync --archive --delete -v --progress -e "ssh -l $user" /home/media/music/ "$host":/home/media/music/
This will copy the directory /home/media/music/ from the local computer to the PC at 192.168.0.33, logging in as user "nisse", and delete anything on the target that doesn't exist on the local machine.