Linux – how to copy & gzip files individually over ssh

Tags: gzip, linux, ssh

I'd like to transfer a directory of uncompressed files over ssh, gzipping each one individually along the way. Can anyone recommend a simple one-liner to achieve this?

e.g.

fileA.ext -> ssh/pipe -> fileA.ext.gz

I've been piping tar over ssh with compression, but the files then arrive decompressed at the far end of the pipe. In this case, I'd like them to stay compressed.

Compressing beforehand would be possible, but it would require extra space locally, or a connection per file(?)

There are 6000+ files, and I'd prefer a solution where all of them could be transferred over a single connection (although I do use keys for authentication!)

Best Answer

I suppose you'd rather gzip them first, then send them across.

gzip -c dafile | ssh remote "cat > dafile.gz"

Repeat with a relevant loop construct, e.g.

find . -type f | while read -r fname ; do gzip -c "$fname" | ssh remote "cat > '$fname.gz'" ; done

... or something to that effect. Note this assumes the remote directory tree already exists. You'll also need to set up public-key access, or this will be one massive password-speed-typing exercise.
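
A minimal one-time key setup, assuming stock OpenSSH tooling and the default key location:

ssh-keygen -t ed25519   # generate a key pair; accept the defaults
ssh-copy-id remote      # install the public key on the remote host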

EDIT:

On the subject of a single connection, see the ControlMaster option in man ssh_config. It will save you the overhead of negotiating 5999 of those 6000 SSH sessions.
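
A minimal ~/.ssh/config sketch; the Host alias and socket path here are assumptions:

Host remote
    ControlMaster auto
    ControlPath ~/.ssh/cm-%r@%h:%p
    ControlPersist 10m

With that in place, the first ssh invocation opens a master connection and every later one in the loop multiplexes over it.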

EDIT2:

Haha! I win!

tar zcf - /dir/of/files | ssh remote "tar zxf - --to-command=mygzip.sh"

mygzip.sh, present on the remote machine (executable, and on the remote $PATH or given by full path; --to-command is a GNU tar feature), looks like this:

#!/bin/sh
# GNU tar sets $TAR_FILENAME for each file it pipes to --to-command.
mkdir -p "$(dirname "$TAR_FILENAME")"
gzip -c > "${TAR_FILENAME}.gz"
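
Once the transfer finishes, gzip itself can sanity-check the per-file archives on the remote side; fileA.ext.gz is a placeholder name:

gzip -t fileA.ext.gz && echo OK   # -t tests integrity; exit status 0 means intact
gzip -l fileA.ext.gz              # list compressed vs. uncompressed sizes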