Ubuntu – FTP Directory with over 3 million files. Suggestions

backup, ftp, Ubuntu

I've inherited an Ubuntu server running an instance of Red5 that stores its streams in a single directory. The streams directory has grown to over 3 million files, and the drive is reaching its capacity. I've been asked to delete files older than a certain date, based on a database record lookup, and I've already written the script for this. Before running it, though, I realized there is absolutely no backup of these streams, and I am of course hesitant to run my script before making a backup.

Unfortunately, I do not have physical access to this server, so I need to transfer the files over FTP. From another system, I connected and attempted to download the files using the FileZilla client. The problem is that FileZilla's last response is "150 Here comes the directory listing", and it hangs there for hours until FileZilla finally crashes. It appears to be waiting on the LIST response so it can build up its transfer queue.

I also tried SSHing into the server and running a simple mput * from the plain command-line ftp client, which works, but it uploads only one file at a time, which is going to take forever… I was hoping to use something like FileZilla so I could open multiple connections and transfer several files at once.

Is there an alternative I can use to send these files to another server? The other server isn't on the same network, by the way, but the connection between them is very fast.

Any help would be greatly appreciated. Thanks!

Best Answer

Some suggestions:

  • Try rsync over SSH. It copes far better with huge directories and can resume interrupted transfers (see the sketch after this list).
  • Try lftp instead of plain ftp. It can handle multiple parallel connections (example below).
  • In your deletion script, replace "rm" with "echo" first and check the output before deleting anything (dry-run example below).
  • You could possibly use FXP, i.e. transferring the files directly between FTP servers. However, every server I know of has this feature disabled for security reasons.
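For the rsync route, here is a minimal sketch, assuming you can SSH from the Red5 box to the destination server; the paths and the name backuphost are placeholders, not your real values:

    # Push the streams directory to the backup host over SSH.
    #   -a        archive mode: recurse, preserve timestamps/permissions
    #   --partial keep partially transferred files so a dropped
    #             connection can resume where it left off
    # nohup + & lets the copy keep running after your SSH session ends.
    nohup rsync -a --partial /path/to/streams/ \
        user@backuphost:/backup/streams/ > rsync.log 2>&1 &

Because rsync only sends files that are missing or changed on the destination, you can simply rerun the same command after any interruption and it will pick up where it left off.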
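If FTP is the only protocol the two machines share, lftp's mirror command can run several downloads in parallel. A sketch, with the host, user, and paths as assumed placeholders:

    # Connect and mirror the remote streams directory to a local
    # copy, fetching up to 8 files at a time.
    lftp -u ftpuser ftp.example.com -e '
        mirror --parallel=8 --verbose /path/to/streams /backup/streams
        quit
    '

The --parallel option gives you the multiple simultaneous connections you were hoping to get from FileZilla.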
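For the dry-run suggestion, the point is to print each delete command instead of executing it. A sketch of what the change looks like, assuming your script loops over a file list produced by the database lookup (files_to_delete.txt is a hypothetical name):

    #!/bin/sh
    # Dry run: echo prints the rm command instead of running it.
    # Inspect the output, and only then remove the echo.
    while IFS= read -r stream_file; do
        echo rm -- "$stream_file"   # was: rm -- "$stream_file"
    done < files_to_delete.txt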