I'm working on a tool to extract some data buried within a very large file on a remote system. It would be impractical to copy the entire file over, and all of the data I need exists within the first 1000 or so bytes. I know that I can start a get and cancel it with ^C to get a partial file, but that would be difficult (if not impossible) to automate with any consistency.
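For what it's worth, the cancel-the-transfer-early approach can be scripted rather than done by hand. A minimal sketch using Python's ftplib (host, credentials, and file name below are placeholders): it opens the data connection for RETR, reads only the bytes it needs, then drops the connection and aborts the transfer.

```python
from ftplib import FTP

def read_limited(conn, nbytes):
    """Read at most nbytes from a socket-like object exposing recv()."""
    chunks, got = [], 0
    while got < nbytes:
        chunk = conn.recv(min(8192, nbytes - got))
        if not chunk:          # remote side finished early
            break
        chunks.append(chunk)
        got += len(chunk)
    return b''.join(chunks)

def fetch_head(host, user, password, path, nbytes=1000):
    """Grab the first nbytes of a remote file, then abandon the transfer."""
    ftp = FTP(host)
    ftp.login(user, password)
    conn = ftp.transfercmd('RETR ' + path)  # open the data connection
    data = read_limited(conn, nbytes)
    conn.close()                            # drop the transfer mid-stream
    try:
        ftp.abort()                         # tell the server we gave up
    finally:
        ftp.close()
    return data
```

How gracefully the server handles the aborted transfer will vary, but since the data connection is torn down after nbytes, only that much of the file ever crosses the wire.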
I would like to tell my FTP client to grab only the first x bytes of the remote file and quit as soon as it has them. I've found a few Windows clients that do partial downloads, but there is nothing in the ftp man page, and documentation online is sparse.
I found this HowTo: http://cdsarc.u-strasbg.fr/doc/ftp.htx that suggests the following syntax:
ftp> get bigfile.dat:0-5000 bigfile.nxt
It is unclear to me whether this is supposed to be implemented in the client or the server, but either way it doesn't work in my environment (a standard Linux ftp client connecting to an FTP server running on z/OS).
Even when trying between the standard Linux ftp client and a FileZilla server on Windows, my attempts fail in the following way:
ftp> get green.gif:0-10c
local: green.gif:0-10c remote: green.gif:0-10c
227 Entering Passive Mode (9,42,91,226,4,105)
550 File not found
So it seems the :0-10c is interpreted as part of the filename. Fail. Any thoughts?
Best Answer
Use curl. Its -r/--range option retrieves just a byte range of the remote file, and it works over FTP as well as HTTP. However, note that the SIZE extension must be supported by the server for this to work.
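A minimal sketch; the host, credentials, and file name are placeholders for your environment:

```shell
# Fetch only bytes 0-999 (the first 1000 bytes) of the remote file.
# ftp.example.com, user, password and bigfile.dat are hypothetical.
curl --range 0-999 --output header.bin \
    "ftp://user:password@ftp.example.com/path/bigfile.dat"
```

The same --range syntax also works over HTTP and, for quick local testing, file:// URLs; note that the range is inclusive, so 0-999 is exactly 1000 bytes.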