Electronics – Why do fans keep running after power-off


Why does the fan in many electronic devices, such as video projectors, laser printers, or kitchen ovens, (have to) keep running after the device has been powered off?

It is my understanding that fans are required to bring fresh air to the device for cooling. Without it, the device might overheat. But once the device is powered off, there is no longer a source of heat, and therefore no more risk of overheating.

Best Answer

Some devices have hot spots, so heat needs to keep being removed for a while after it stops being produced. There is a lag as the heat energy conducts to where the moving air can take it away. If the air no longer takes the heat away while the original source stays hot, the parts in the airflow can get hotter than during normal operation. This could be bad, especially for devices that are designed at the edge of their ability to remove heat without damage.
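This lag can be illustrated with a toy two-node thermal model: a hotspot that conducts heat into a part sitting in the airflow, which in turn dumps heat to the air. All the component values below are made up for illustration, not taken from any real device. The point is only the qualitative behavior: if the fan stops the instant power is removed, the part in the airflow peaks hotter after shutdown than it ever was during operation.

```python
# Toy two-node lumped thermal model (illustrative values only).
# Node 1: internal hotspot. Node 2: part sitting in the airflow.
# Power off happens at t = 10 min; we track node 2's peak after that.

def simulate(fan_after_off):
    T_amb = 25.0
    T1, T2 = 25.0, 25.0       # temperatures, degrees C
    C1, C2 = 50.0, 20.0       # thermal capacitances, J/degC
    G12 = 0.5                 # conduction hotspot -> part, W/degC
    G_fan = 2.0               # part -> air with fan running, W/degC
    G_still = 0.1             # part -> air with fan stopped, W/degC
    dt = 0.1                  # time step, s
    peak_T2_after_off = 0.0
    for step in range(int(1200 / dt)):    # 20 minutes total
        t = step * dt
        on = t < 600.0                    # device powered for first 10 min
        P = 100.0 if on else 0.0          # heat produced at the hotspot, W
        fan_running = on or fan_after_off
        G_air = G_fan if fan_running else G_still
        q12 = G12 * (T1 - T2)             # heat flow hotspot -> part, W
        T1 += dt * (P - q12) / C1
        T2 += dt * (q12 - G_air * (T2 - T_amb)) / C2
        if not on:
            peak_T2_after_off = max(peak_T2_after_off, T2)
    return peak_T2_after_off

print(simulate(fan_after_off=True))   # fan keeps running: modest peak
print(simulate(fan_after_off=False))  # fan stops at power-off: much hotter peak
```

With these numbers the part in the airflow sits around 75 °C during operation; stop the fan at power-off and the stored heat conducting out of the hotspot pushes it well past that before everything cools down.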

Added: about energy storage to run the fan

Some are suggesting that devices that run the fan for a while after a soft power-off should include energy storage to guarantee this even if power is suddenly removed. This doesn't make sense except in extremely critical applications, which does not describe ordinary consumer products like projectors.

I used a small fan in a product recently to provide forced air flow over a 150 W power supply. It was a Qualtek FAD1-06025BBHW12. The airflow (24 CFM) feels rather wimpy to me compared to what I've felt coming out of projectors, so we can consider keeping this fan powered for a minute to be a conservative estimate of the energy required.

This fan is rated for 1.75 W. That times one minute is 105 J. Let's see how big a capacitor would have to be to deliver that much energy. Let's say it would be a 20 V cap, and we'd have a buck converter running the fan until the cap gets down to 5 V. Let's say the buck converter would be 85% efficient, so the cap has to store 124 J.

The energy stored in a capacitor is:

E = ½ C V²

Working this equation for a difference of 124 J between 20 V and 5 V shows that it would take 660 mF. That's over half a farad, which is huge. Not only would that be expensive, it would also be quite large relative to the size of an ordinary consumer projector. These kinds of capacitors usually have at least 20% tolerance, and you don't want to run them at their full rated voltage if you want reasonable lifetime. So you'd need to spec around 825 mF and 25 V. A quick check on Mouser shows that any such cap is going to cost around $100, even in production quantities. That would probably add at least $300 to the end-user price and make the unit bigger.
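As a sanity check, the arithmetic above can be reproduced in a few lines (an illustrative script, using only the figures already given: a 1.75 W fan for one minute, an 85% efficient buck converter, and a cap discharged from 20 V down to 5 V):

```python
# Reproduce the worked numbers: fan energy, cap energy, required capacitance.
fan_power = 1.75          # W, rated fan power
run_time = 60.0           # s, one minute of fan run time
efficiency = 0.85         # assumed buck converter efficiency

fan_energy = fan_power * run_time         # energy delivered to the fan, J
cap_energy = fan_energy / efficiency      # energy the cap must supply, J

# Usable energy between two voltages: E = 1/2 * C * (V_hi^2 - V_lo^2)
v_hi, v_lo = 20.0, 5.0
C = 2 * cap_energy / (v_hi**2 - v_lo**2)  # farads

print(round(fan_energy))    # 105
print(round(cap_energy))    # 124
print(round(C * 1000))      # 659, i.e. the ~660 mF quoted above
```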

Would you pay $300 more for a projector that is a bit more robust in the event of a power failure, or if you do something careless like pulling the plug out? Keep in mind that an ordinary projector isn't guaranteed to be damaged this way. It's not going to be good for it, and the lifetime of the bulb and possibly the projector will be reduced, but it's probably not going to break outright if this is done once or twice. Most consumers aren't even going to know about this issue, and will buy the projector that does the same thing but costs $300 less. Even those who are aware of the issue and actually believe the manufacturer's claims will probably figure they'll be careful not to pull the plug until the fan stops, and take their chances with a sudden power failure.

So leaving out energy storage to run the fan isn't bad design at all. Burdening the product with expensive features that few understand and even fewer care about, especially in a very cost competitive market, would be bad design. Good design is looking at the whole product, not knee-jerking about a particular issue in isolation.