Nginx – Force download of zip file stopping a few thousand bytes short

Tags: compression, nginx, PHP

I'm trying to force the download of a zip file containing a large number of images. I'm using PHP to force the download, and it's running on nginx. The weird thing is that when I use wget to download the file it usually works, but when I download it through a browser the file ends up a few thousand bytes too small. I have tried multiple file sizes with similar results. I don't think it's a timeout, because I experimented with slowing down my download speed and it always stops at the same place (for a given file). I have no idea why it would work with wget but not a browser.

I have tried multiple computers (with different browsers) and multiple networks and got the same results. I have played around with the PHP code that serves the file, with php.ini, and with the nginx settings, all to no avail. I would really appreciate any help or suggestions!

Here is the PHP code that forces the download:

    ob_end_flush();  // send and end the current output buffer
    header('Content-Description: File Transfer');
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="' . $name . '.zip"');
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($archive));  // advertise the on-disk size
    flush();
    readfile($archive);  // stream the zip to the client

where $archive is the path to the zip file the user is trying to download.

Best Answer

Could your web server be applying gzip Content-Encoding to the file you're outputting from PHP (or could you have enabled compression in PHP itself, e.g. zlib.output_compression)? Gzipping an already-zipped file tends to make it slightly larger, so if the client trusts the script's Content-Length header (which states the original, uncompressed size), it will stop reading at that byte count before the larger gzipped stream has fully arrived, leaving you with a truncated file.
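If that turns out to be the cause, one fix is to stop nginx from compressing that response. A minimal sketch, assuming the download script is served at /download.php and handled over FastCGI (the location path and fastcgi_pass address are placeholders; adjust them to your setup):

    # Don't apply gzip to the already-compressed zip download
    location = /download.php {
        gzip off;
        # your existing PHP handler config goes here (placeholder values below)
        include        fastcgi_params;
        fastcgi_pass   127.0.0.1:9000;
    }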

wget does not request compression, so the server would not apply Content-Encoding when you connect with it. Try it with curl and see whether you get the broken file. If you use Chrome, press Ctrl-Shift-J to open the Developer Tools and go to the Network tab. With that open, go back to the main window and click the link to download the file. An entry should appear in the request list; clicking it opens a panel with a "Headers" tab, where you can check whether the Response Headers section includes a Content-Encoding: line.
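For the curl test, something along these lines should mimic a browser by advertising gzip support (the URL is a placeholder for wherever your script lives):

    # --compressed sends Accept-Encoding: gzip and decompresses the reply;
    # -v prints the response headers so you can spot Content-Encoding
    curl -v --compressed -o test.zip 'http://example.com/download.php'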

BTW, I think you want to do ob_end_clean() there rather than ob_end_flush(), since _flush() sends the client whatever is already in the buffer, while _clean() discards the buffered output so it never reaches the client.
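Putting both suggestions together, a minimal sketch of the adjusted script (assuming $name and $archive are set as in the question; the zlib line only matters if PHP-level compression is enabled):

    // Discard any buffered output so stray bytes don't corrupt the zip
    ob_end_clean();
    // Make sure PHP itself isn't gzipping the response
    ini_set('zlib.output_compression', 'Off');
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="' . $name . '.zip"');
    header('Content-Length: ' . filesize($archive));
    readfile($archive);
    exit; // stop here so nothing else is appended to the download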