Moving a very large (~100 GB) file from one server to another

file-transfer, large-data

We're moving servers, and I need to transfer all the data from Server A to Server B.

I have a tar.gz of about 100 GB that contains all the Server A files.

I'd really like to avoid downloading the file to my local computer and then uploading it to Server B.

I only have FTP access to Server A, which means no SSH. However, I do have SSH access to Server B.

What's the best way to transfer the file? I was thinking of moving my tar.gz file to public_html temporarily and downloading it with wget. Would that work?
Otherwise, I could use FTP through an SSH session on Server B.
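
For the second option, I mean something like this with lftp on Server B (a sketch only; it assumes lftp is installed, and the user, host, and path are placeholders):

lftp -u user -e 'pget -c /path/largefile.tar.gz; quit' ftp://serverA

pget -c resumes a partial transfer, much like wget -c.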

Best Answer

Something like:

ssh user@serverB
nohup wget -bqc ftp://path/largefile.tar.gz

wget options:

-b : go to background immediately after startup
-q : quiet (no output)
-c : continue a partially downloaded file (so you can resume if the transfer breaks)

This runs wget in the background, so it should keep going after you exit the SSH shell; nohup makes sure of that by shielding the process from the hangup signal sent when you log out.
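
By default, wget -b writes its progress to a file named wget-log in the directory it was started from (wget-log.1 and so on if that name is already taken), so after logging back in you can check on the transfer:

tail -f wget-log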

Because you're initiating the download from Server B, your desktop machine isn't involved in the file transfer except to set it up.
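
Once the download finishes, it's worth testing the archive on Server B before decommissioning Server A. Either of these reads the whole file and fails loudly on corruption (the filename follows the example above):

gzip -t largefile.tar.gz
tar -tzf largefile.tar.gz > /dev/null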