Copying files with ssh
My partner and I now code our CS 452 assignments exclusively in Eclipse. Its Auto Build feature lets us write code, save, and see all of our errors almost instantly. I have a very low pain threshold for build times, so when I’m working from home, this is pretty much a perfect coding environment.
Working from school is different, since I typically run our code on the fancy servers, which means I have to get the code off my laptop. We started with a two-line script:
scp $FILES $LOGIN@$SERVER:
ssh $LOGIN@$SERVER useThoseFiles
I’m transferring more files than I used to, and posting now takes an unreasonable 3-4 seconds, well above my pain threshold. I connect at 1 Gbit/s in the labs, so bandwidth isn’t the bottleneck. I followed Alexei’s lead and compressed our files:
tar -cjf files.tar.bz2 $FILES
scp files.tar.bz2 $LOGIN@$SERVER:
ssh $LOGIN@$SERVER tar -xjf files.tar.bz2 \&\& useThoseFiles
Alexei pointed out that posting from off-campus was much faster using compression, but I didn’t notice any speedups from compression in the labs.
My next suspect was ssh itself. When we first wrote this script, we noticed huge speedups when we went from one scp per file to a single giant scp. Some googling told me how to combine the remaining two connections (the scp and the ssh) into one:
tar cf - $FILES | gzip -c -1 | ssh $LOGIN@$SERVER cat ">" files.tar.gz \&\& gzip -d files.tar.gz \&\& tar -xf files.tar \&\& rm files.tar \&\& useThoseFiles
It’s not easy on the eyes, but it got us down to a cool 2 seconds. Switching to a faster server cut the decompression time and brought us down to 1.3 seconds. For bonus points, we cut out the temporary files:
tar cf - $FILES | gzip -c -1 | ssh $LOGIN@$SERVER gunzip "|" tar xf - \&\& useThoseFiles
Now that’s a script I’m proud to run.
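If you want to check timings like these on your own setup, bash’s time builtin can wrap the whole pipeline:
time tar cf - $FILES | gzip -c -1 | ssh $LOGIN@$SERVER gunzip "|" tar xf - \&\& useThoseFiles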
3 Comments
Ooh, very nice! I forgot how powerful piping is in Unix!
Hi Alex, good to hear from you.
You could also mount the remote filesystem locally on your computer with sshfs $LOGIN@$SERVER:/$REMOTEDIR $LOCALDIR
Then you could also play with rsync to only transfer the files that changed (sketch below).
However, the script looks pretty nice 🙂
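For the rsync route, a rough sketch (reusing $REMOTEDIR for wherever the code lives on the server; nothing tuned, just the usual flags):
# only re-sends files that changed since the last run; -a preserves times/permissions, -z compresses in transit
rsync -az $FILES $LOGIN@$SERVER:$REMOTEDIR/ && ssh $LOGIN@$SERVER useThoseFiles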
Hmm… we didn’t try that, so I don’t know how well it would’ve worked. Maybe it could’ve saved us a lot of effort 😉
The problem for us was making multiple connections, because setting up the connection proved to be way slower than doing the actual file transfer :/
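If we ever go back to making multiple connections, OpenSSH’s connection sharing (ControlMaster) looks like the way to pay that setup cost only once; a sketch for ~/.ssh/config, assuming a reasonably recent OpenSSH:
# reuse one master connection per host instead of doing the handshake every time
# (create the socket directory first: mkdir -p ~/.ssh/sockets)
Host *
    ControlMaster auto
    ControlPath ~/.ssh/sockets/%r@%h-%p
    ControlPersist 10m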