Downloadable files cut off/corrupt at just over 1GB mark

Hello everyone,


We've had a problem with our Magento store for a long time now (it used to be Magento 1.4, recently upgraded to 1.9.2), and after a few days of pulling my hair out trying to find a solution without success, I thought I'd post here in case anyone has had similar problems or has any ideas about what else to try.


Our Magento store sells some digital products - ranging in size from a few hundred MB to a couple of GB.


A lot of customers complain that, on the larger products, their downloads cut off at around the 1 gigabyte mark, and despite lots of testing ourselves over the last few days I've been unable to figure out the cause or a solution.  I've been testing with a 1.9GB file that has caused problems for many of our customers.  (Note however that the download does not fail *every* time -- just more often than not!)


When the file is downloaded directly -- via the filesystem or a custom test .php script -- the download almost always finishes in its entirety.  

When downloaded via a Magento downloadable product link (testing via the 'sample' link), the download almost always stalls and then cuts off at around 1.08GB - 1055MB (a different byte count each time -- I've hex-inspected the end of the downloaded files and can't see any incorrect headers etc.).  This is the same thing that customers report, and I can see it in the log files too -- but not everyone experiences it, and not *every* time -- just the vast majority of times!


Our Magento 1.9.2 is running on a dedicated server, running CentOS with nginx.


Things that I have tried changing in order to solve this problem are:

  - disabling CloudFlare (but problem occurred before I ever used CloudFlare, and persists after it is disabled)

  - disabling nginx gzip support

  - increasing the php.ini session.gc_maxlifetime setting

  - increasing max_execution_time setting

  - setting the 'sample link' to use a custom PHP script that uses 'X-Accel-Redirect' (the script downloads OK when accessed directly, but commonly fails at ~50% when used through Magento)
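For context, the kind of direct-download script referred to above might look something like this -- a sketch only, where the filename and the '/protected/' nginx location are assumptions, not our real setup:

```php
<?php
// Sketch of a direct-download script that hands the transfer to nginx
// via X-Accel-Redirect instead of streaming the file through PHP.
// '/protected/' is an assumed nginx location marked "internal".
function xAccelHeaders($file)
{
    return array(
        'Content-Type'        => 'application/octet-stream',
        'Content-Disposition' => 'attachment; filename="' . basename($file) . '"',
        // nginx intercepts this header and serves the file itself
        'X-Accel-Redirect'    => '/protected/' . $file,
    );
}

foreach (xAccelHeaders('big-sample.zip') as $name => $value) {
    header($name . ': ' . $value);
}
// Deliberately no readfile()/echo here: PHP stays out of the data path.
```
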


I've attempted to inspect the HTTP traffic to find out exactly what's going on, but have been unable to.  I also haven't been able to spot any pattern around when the failures occur.


When comparing the request and response headers between the 'working' direct-download requests and the non-working Magento download requests, I can see some differences, for example:

  - Magento sends Content-Type as application/zip; the custom scripts send Content-Type as application/octet-stream;

  - Magento sends an Expires: header; the custom scripts do not (the Expires: header says 1981, if that makes any difference!);

  - Magento sets cookies; the custom scripts do not;

  - Magento sends an X-Powered-By: PHP/5.4.27 header; the custom scripts do not;

  - The custom scripts send an ETag: header; Magento does not.


Short of attempting to implement a replacement Downloadable class that hands file downloads off to nginx via X-Accel-Redirect -- and I don't know if that would even help or just be a waste of time? -- I don't know what to try next.  


If anyone has any ideas where to look, light to cast on the technologies involved - or even better if you've had this problem yourself and know how to fix it! -- please get in touch!  Thanks!


Re: Downloadable files cut off/corrupt at just over 1GB mark

Could it be that you somehow have global output buffering enabled in your Magento installation with a PHP memory limit of 1GB? If so, that could explain why your downloads get corrupted at roughly the 1GB mark -- that would be the point at which PHP runs out of memory to buffer any further output.
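A quick way to check both is a throwaway diagnostic script (not Magento code) like the one below -- though note the CLI SAPI forces output_buffering off, so for the web values check a phpinfo() page served by FPM instead:

```php
<?php
// Print the settings that decide whether output is buffered, and how much
// memory PHP may use while doing so.
echo 'memory_limit:     ' . ini_get('memory_limit') . PHP_EOL;
echo 'output_buffering: ' . var_export(ini_get('output_buffering'), true) . PHP_EOL;
echo 'ob_get_level():   ' . ob_get_level() . PHP_EOL;
```
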


If you search your entire project for calls to ob_start(), on a clean install only 4 instances should show up:


Line 219 in app/code/core/Mage/Core/Block/Template.php

Line 260 in app/code/core/Mage/Core/Model/Translate/Inline.php

Line 82 in app/code/core/Mage/Page/Block/Html/Topmenu/Renderer.php

Line 37 in app/code/core/Mage/Sales/Model/Email/Template.php


Any other instances of a call to ob_start() could point at output buffering being enabled outside of your standard Magento installation.


Another approach you can try is to add the following snippet to the top of _processDownload() in app/code/core/Mage/Downloadable/controllers/DownloadController.php

while (ob_get_level()) {
    ob_end_clean();
}

All this does is explicitly disable any output buffering that may be active.


On a more general note: Magento serves all its downloadable products via PHP, which in the worst case, depending on your setup, means that a PHP worker process is tied up for the duration of the download. If a user downloads a large product over a particularly slow connection, this can impact your site's capacity.


At iWeb, we've solved this problem in the past by implementing a custom module which creates timebombed URLs using symlinks, then 301-redirects the customer to a temporary URL pointing at the symlink, which in turn points at the downloadable file. A cron process then cleans up expired download links. The advantage of this approach is that after the initial download request, PHP is bypassed entirely for the duration of the actual file transfer, therefore leaving precious resources available to serve actual dynamic pages.
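A rough sketch of that idea, purely illustrative -- the function and directory names below are made up and are not the actual iWeb module:

```php
<?php
// Illustrative time-bombed download link: an unguessable directory holding
// a symlink to the real file, served by the web server without PHP.
function createTimebombedLink($sourceFile, $publicDir, $baseUrl)
{
    $token   = md5(uniqid(mt_rand(), true)); // unguessable URL component
    $linkDir = $publicDir . '/' . $token;
    mkdir($linkDir, 0755, true);
    symlink($sourceFile, $linkDir . '/' . basename($sourceFile));
    // The controller 301-redirects the customer to the returned URL;
    // from then on nginx serves the file directly, bypassing PHP.
    return $baseUrl . '/' . $token . '/' . basename($sourceFile);
}

// A cron task would later delete token directories whose filemtime() is
// older than the chosen expiry window.
```
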

Problem solved? Click Accept as Solution! | Magento Small Business Partner

Re: Downloadable files cut off/corrupt at just over 1GB mark

Thanks for your response iweb_bas!  Definitely some things for me to try there.


My PHP memory limits do not appear to be set to 1024MB, so unfortunately I don't think that is the problem...


Doing a grep over my codebase finds a few more than 4 ob_start()s in my code and templates -- I'll have to dig into that further and find out if they are still used, and if they can be removed...


I've just tested _processDownload() with the while...ob_end code you suggested but have seen no improvement. I'm under the impression that disabling output buffering is not as straightforward in nginx/PHP-FPM as it is in Apache/PHP, so I'm unsure if those calls actually 'work' to disable output buffering on my setup.  
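From what I've read, with nginx in front of PHP-FPM there's also buffering on the nginx side of the FastCGI connection, separate from PHP's own output buffers -- so clearing PHP's buffers alone may not be enough. The directives below are real nginx ones, but the values and socket path are illustrative only:

```nginx
location ~ \.php$ {
    fastcgi_pass unix:/var/run/php-fpm.sock;   # illustrative socket path

    # nginx buffers the FastCGI response by default; this turns it off
    # for this location (directive available since nginx 1.5.6):
    fastcgi_buffering off;
}
```

Alternatively, a script can opt out per-response by sending an `X-Accel-Buffering: no` header, which nginx also honours.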


Your workaround of creating timebombed symlinks is an interesting approach and I may explore options like that, although I think I'll attempt a modification to use X-Accel-Redirect instead when I get a chance as I believe that will have the same benefits you describe.


Thanks again for your assistance, hope to get to the bottom of this soon!

Re: Downloadable files cut off/corrupt at just over 1GB mark

X-Accel-Redirect would indeed achieve the same result, threedtotal, and is probably your best bet at resolving this issue.


If nothing else, it takes the entire PHP stack (and all its peculiarities) out of the equation, which in turn means that if the problem persists, it is most likely some other layer in the stack behaving incorrectly.


The easiest way to add support for X-Accel-Redirect would be to "hijack" the download url in the DownloadController by injecting a custom router in some custom module's config.xml, like so:



<config>
    <frontend>
        <routers>
            <downloadable>
                <args>
                    <modules>
                        <Custom_Downloadables before="Mage_Downloadable">Custom_Downloadables</Custom_Downloadables>
                    </modules>
                </args>
            </downloadable>
        </routers>
    </frontend>
</config>

You can then write your own DownloadController, which extends the original and overrides the _processDownload() method, like so:


require_once Mage::getModuleDir('controllers', 'Mage_Downloadable').DS.'DownloadController.php';

class Custom_Downloadables_DownloadController extends Mage_Downloadable_DownloadController
{
    protected function _processDownload($resource, $resourceType)
    {
        // Set the X-Accel-Redirect header correctly here and send the response
    }
}
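A minimal sketch of what the overridden method might then do -- the '/protected_downloads/' internal location is an assumption, and a real implementation would still need to resolve $resource the way Mage_Downloadable_Helper_Download does:

```php
protected function _processDownload($resource, $resourceType)
{
    if ($resourceType == Mage_Downloadable_Helper_Download::LINK_TYPE_FILE) {
        $this->getResponse()
            ->setHeader('Content-Type', 'application/octet-stream', true)
            ->setHeader('Content-Disposition',
                'attachment; filename="' . basename($resource) . '"', true)
            // nginx intercepts this and streams the file itself; the
            // '/protected_downloads/' location must be marked "internal"
            ->setHeader('X-Accel-Redirect', '/protected_downloads/' . $resource, true);
        return;
    }
    // Fall back to the stock streaming behaviour for URL-type resources
    parent::_processDownload($resource, $resourceType);
}
```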

Good luck and do keep us updated when you end up with a working implementation!



Re: Downloadable files cut off/corrupt at just over 1GB mark

Thanks again iweb_bas for pointing me in the right direction and providing the quickstart.


I've written an extension that largely does what we discussed, passing the connection over to nginx via X-Accel-Redirect, and all my tests since deploying it have seemed to confirm that the problem is largely solved!
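For anyone landing on this thread later: on the nginx side this needs an internal location mapping the X-Accel-Redirect path onto the downloadable files directory. An illustrative fragment -- the location name and alias path are assumptions, not my exact config:

```nginx
# Only reachable via an X-Accel-Redirect header from PHP, never directly
location /protected_downloads/ {
    internal;
    alias /var/www/magento/media/downloadable/files/;   # assumed path
}
```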


The only problem I've found so far is that it seems to have disabled the 'Pause/Resume' behaviour in Internet Explorer.


Just a case of waiting now to see if any customers complain or have any further complications!


Thanks again -- will report back with any extra info I can.

Re: Downloadable files cut off/corrupt at just over 1GB mark

It's working for me.

Thanks for a simple and good solution.