FTP to GCS Transfer

backup · ftp · google-cloud-storage · sftp

I'm wondering if there is an efficient way to back up files from an (S)FTP server to Google Cloud Storage (GCS). We are using Exavault, a managed FTP service, and want to regularly back up files to GCS, then purge files older than 30 days from Exavault.

Current Implementation (Bloody Slow):

  1. Mount FTP to filesystem on Google Compute Instance using curlftpfs
  2. Run gsutil rsync to sync files from mounted storage to GCS Bucket
  3. Delete files based on mtime +30
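The three steps above can be sketched as a single script. This is a hedged illustration, not the exact setup: the FTP hostname, credentials, mount point, and bucket name are all placeholders, and the delete step is commented out so it can't fire accidentally.

```shell
#!/bin/sh
set -e

# Placeholder values -- substitute your real Exavault host and GCS bucket.
FTP_URL="ftp://user:pass@ftp.example.com/"
MOUNT="/mnt/exavault"
BUCKET="gs://example-backup-bucket"

# 1. Mount the FTP server onto the local filesystem via FUSE.
mkdir -p "$MOUNT"
curlftpfs "$FTP_URL" "$MOUNT"

# 2. Sync the mounted tree to the GCS bucket.
#    -m enables parallel (multi-threaded/multi-process) transfers.
gsutil -m rsync -r "$MOUNT" "$BUCKET"

# 3. Purge files not modified in the last 30 days from the mount
#    (and therefore from Exavault). Left commented out as a safeguard.
# find "$MOUNT" -type f -mtime +30 -delete

# Unmount when done.
fusermount -u "$MOUNT"
```

The bottleneck in this shape is usually the single FUSE mount: every `stat` and read that `gsutil rsync` issues becomes a round trip over one FTP connection.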

This method is running really slowly, and I don't think it will be a reasonable solution going forward.

Are there any solutions that can do something like this more efficiently, given ~500 GB of data?

Best Answer

Multiple FTP clients: mount each top-level directory of the server as a separate curlftpfs mount, so several FTP connections transfer in parallel. If a single machine maxes out its bandwidth, spread the mounts across multiple cloud servers.
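A minimal sketch of that parallel approach, assuming three top-level directories; the server URL, credential, directory, and bucket names are hypothetical placeholders:

```shell
#!/bin/sh
# Placeholder FTP endpoint and bucket -- substitute real values.
FTP_URL="ftp://user:pass@ftp.example.com"
BUCKET="gs://example-backup-bucket"

# One mount and one rsync per top-level directory, run concurrently,
# so each directory gets its own FTP connection.
for dir in projects media archive; do
    mkdir -p "/mnt/exavault/$dir"
    curlftpfs "$FTP_URL/$dir" "/mnt/exavault/$dir"
    # Background each rsync so the transfers overlap.
    gsutil -m rsync -r "/mnt/exavault/$dir" "$BUCKET/$dir" &
done

wait  # block until every background rsync has finished

for dir in projects media archive; do
    fusermount -u "/mnt/exavault/$dir"
done
```

To spread the load across machines, each server would simply loop over a different subset of directories.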

Offline Media Import is another option, assuming you can get everyone involved to deal with physical media.
