Electrical – Understanding the resistor method of slowing server fan speeds

Tags: fan, voltage divider

While researching how to compensate for the overly aggressive fan speed control in our HP DL280e Gen8 server, I have seen several recommendations for slowing the system fans by putting a resistor in series with the 12 V power input, for instance over on Server Fault.

What I don't understand is how their suggested solutions work.

1W 10Ω resistor in series with a 12V 2.5A fan

Naively applying Ohm's law suggests that at full speed this fan has a resistance of 4.8Ω, so adding a 10Ω resistor in series should drop the overall current to 0.81A (9.7W rather than 30W), with 6.6W dissipated in / 8.1V across the resistor and 3.2W dissipated in / 3.9V across the fan.
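The arithmetic above can be sanity-checked with a short Python sketch of this naive model (the fan treated as a fixed resistor; the function name is my own):

```python
# Naive model: treat the fan as a fixed resistor and apply Ohm's law to the
# series pair. The 4.8 ohm figure comes from the fan's 12 V / 2.5 A rating.
def series_divider(v_supply, r_fan, r_series):
    """Return (current, v_fan, p_fan, v_res, p_res) for a resistive fan model."""
    i = v_supply / (r_fan + r_series)
    return i, i * r_fan, i**2 * r_fan, i * r_series, i**2 * r_series

i, v_fan, p_fan, v_res, p_res = series_divider(12.0, 12.0 / 2.5, 10.0)
print(round(i, 2), round(v_fan, 1), round(p_fan, 1), round(v_res, 1), round(p_res, 1))
# 0.81 3.9 3.2 8.1 6.6
```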

What I don't understand is how this works without blowing the resistor: dissipating 6.6W in a 1W resistor doesn't seem like a great idea, so I wonder what I'm missing. Also, although I can't find a datasheet for the V60E12BS2CB5-08 fan shown, I can't imagine it being happy with only 3.9V on its 12V line.

10W 11Ω resistor network in series with a 12V 2.3A fan

Again, Ohm's law suggests that at full speed this fan has a resistance of 5.2Ω, so adding an 11Ω resistor in series should drop the overall current to 0.74A (8.9W rather than 28W), with 6W dissipated in / 8.1V across the resistor and 2.9W dissipated in / 3.852V across the fan.

The power requirements are now within tolerance, but the datasheet for the fan (Delta PFR0612XHE) gives an operating voltage of 7.0 to 13.2V, so it seems unlikely to me that it would work at 3.852V.

7W 3.7Ω resistor network in series with a 12V 2.3A fan

Am I right in thinking that the ideal resistor for minimum speed (7V) with the Delta PFR0612XHE at 100% PWM would be 3.7Ω?

At full speed with a 5.2Ω fan, a 3.7Ω resistor in series should, according to my naive assumptions, drop the overall current to 1.35A (16.1W rather than 28W), with 6.75W dissipated in / 5.0V across the resistor and 9.45W dissipated in / 7.0V across the fan. That's 34% of the original power to the fan.
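That "ideal" value can be derived by back-solving the same naive fixed-resistance model for a target fan voltage. A minimal sketch (function name my own):

```python
# Back out the series resistor that leaves a target voltage across the fan,
# still assuming the fan behaves as a fixed resistor (R_fan taken from its
# 12 V / 2.3 A rating).
def series_resistor_for(v_supply, v_fan_target, r_fan):
    i = v_fan_target / r_fan              # current the fan draws at the target voltage
    return (v_supply - v_fan_target) / i  # resistance needed to drop the rest

r_fan = 12.0 / 2.3                        # ~5.2 ohm
r = series_resistor_for(12.0, 7.0, r_fan)
print(round(r, 1))                        # 3.7
```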

The power requirements and supply voltage would now both be within tolerance.

Is my naive understanding even vaguely correct?

What about driving at less than 100% PWM?

So, if a 100% demand now results in a 34% power draw, I assume that the power draw would scale proportionately with PWM demand. If the fan is driven at less than 100% PWM, its effective resistance will go up as the current draw goes down. Re-running the above calculations, I get:

Running the fan at 43%, i.e. 1A, it would appear to be 12Ω, so adding a 3.7Ω resistor the overall current would drop to 0.76A (9.2W rather than 28W), with 2W dissipated in / 2.8V across the resistor and 7W dissipated in / 9.2V across the fan. Thus a demand of 43% results in a 25% power draw.

Finally, running the fan at 15%, i.e. 0.345A, it would appear to be 34.8Ω, so adding a 3.7Ω resistor the overall current would drop to 0.31A (3.8W rather than 28W), with 0.4W dissipated in / 1.2V across the resistor and 3.4W dissipated in / 10.8V across the fan. Thus a demand of 15% results in a 12% power draw.
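The three scenarios above can be re-run in one loop, under the same assumption that the fan's effective resistance at a given duty cycle is 12 V divided by the current it would draw at that duty without the resistor:

```python
# Naive model only: effective fan resistance scales inversely with PWM demand
# (2.3 A full-speed draw assumed from the PFR0612XHE rating).
def fan_power_with_resistor(v_supply, i_full, duty, r_series):
    r_fan = v_supply / (i_full * duty)    # effective fan resistance at this duty
    i = v_supply / (r_fan + r_series)     # series current with the resistor added
    return i**2 * r_fan                   # power delivered to the fan

p_full = 12.0 * 2.3                       # 27.6 W unrestricted
for duty in (1.0, 0.43, 0.15):
    p = fan_power_with_resistor(12.0, 2.3, duty, 3.7)
    print(f"{duty:.0%} demand -> {p / p_full:.0%} of full power")
# 100% demand -> 34% of full power
# 43% demand -> 25% of full power
# 15% demand -> 12% of full power
```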

If this is correct, I assume this is why I've seen complaints that the resistor method results in fans shutting down when the PWM demand gets too high (presumably because the rise in current drops the voltage across the fan below the 7V lower threshold), and that going no higher than a 3.7Ω resistor would prevent this issue.

Other research

My specific fans

Note that these fans don't have a rotation pulse output like consumer PWM fans; they only have a locked-fan signal. This signal isn't even mentioned in the datasheet, but people have found that it is 0V when the fan is spinning and non-zero when it is not (people ground it to bypass the failed-fan detection).

The fan takes a 3.3V inverted PWM input, so a constant 0V or 3.3V signal will fail safe with the fans running at 100%, while an increased duty cycle on the PWM slows the fan down.

Combined, these mean that the motherboard only knows whether a fan is spinning or not; if it asks for 50% and the fan is spinning, it assumes the fan is spinning at 50%.

Evidence that naive voltage divider calculations aren't useful.

Most of the videos and forum posts about this suggest resistor values without giving any details of the fans they apply to or the precise effects they have. How to make a rack server quieter by putting resistors on the fans, by ElectronicsWizardry, does explicitly test the voltages across the fan though, and mentions them in the video.

Starting with a 12V 1.54A fan, a 22Ω resistor resulted in 4.9V across the fan (but a resistor running at 150°C!), while 2x 51Ω resistors in parallel (25.5Ω) resulted in 4.7V measured across the fan.

These are obviously much higher voltages than the naive voltage divider calculations suggest: 22Ω suggests 3.139V not 4.9V, while 25.5Ω suggests 2.8V not 4.7V. That means that the effective resistance of the fan must also be much higher.
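Those measurements let you back-solve what resistance the fan actually presented: the series resistor carries the same current as the fan, so the current follows from the resistor's voltage drop. A short sketch (function name my own):

```python
# Given a measured voltage across the fan, infer the effective resistance it
# presented. The series resistor drops (v_supply - v_fan) at the same current.
def implied_fan_resistance(v_supply, v_fan, r_series):
    i = (v_supply - v_fan) / r_series   # series current from the resistor drop
    return v_fan / i                    # Ohm's law back at the fan

# ElectronicsWizardry's measurements vs the 12 V / 1.54 A = 7.8 ohm nameplate:
print(round(implied_fan_resistance(12.0, 4.9, 22.0), 1))   # 15.2
print(round(implied_fan_resistance(12.0, 4.7, 25.5), 1))   # 16.4
```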

Summary & specific questions

I've done a lot of calculations to try and work out whether this is a viable method to reduce the aggressiveness of our server fans.
* Are my calculations even vaguely correct?
* What am I missing?
* Can I calculate the resistance required for a motor given its specification, assuming it will sometimes need to run at 100% PWM?


At power up, iLO (the embedded management computer) runs the fans at 100% during self-test (5x 68dB-A fans are really loud) and then drops them to 45% PWM when idle (still too loud in an adjacent room). We know that it should be running the fans at around 15% with only two drives populated, but iLO intentionally ramps up fan speeds with non-HP drives on Gen8 servers.

Best Answer

The important aspect that you're missing is that fans are not resistors, so a 12V 2A fan will not sink 1A at 6V. In fact, if the motor is able to achieve the target RPM at 6V and has a controller which generates the PWM duty cycle to achieve the target speed, the motor will still consume around 24 W, sinking 4A of current.
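The difference between the two models can be made concrete: a resistor's current falls with voltage, while a speed-controlled fan holding its target RPM behaves more like a constant-power load. A minimal sketch of the comparison (function names my own):

```python
# A fan with an internal speed controller behaves closer to a constant-power
# load than a resistor: if it can still hit its target RPM at a reduced
# voltage, it draws MORE current, not less.
def current_resistive(v, v_rated, i_rated):
    return v * i_rated / v_rated        # resistor: I scales down with V

def current_constant_power(v, v_rated, i_rated):
    return v_rated * i_rated / v        # P = V * I stays fixed, so I scales up

print(current_resistive(6.0, 12.0, 2.0))        # 1.0 A, the naive guess
print(current_constant_power(6.0, 12.0, 2.0))   # 4.0 A, still the full 24 W
```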

Are you sure the PWM duty cycle remains constant when a fan gets a series resistor? If the fan has an RPM output, I would expect the speed controller to detect the slowdown and compensate by increasing the duty cycle, which would explain why larger than expected resistor values are needed. The same would happen if the motor controller generates the actual duty cycle internally.

A motor with 7-13V operation voltage should be able to reach the specified RPM in the specified voltage range. If the voltage goes below 7V, it will not immediately stall but gradually slow down. At 3.5V most 12V fans will still rotate, albeit very slowly, and some may require a push to start.

It's really hard to say how much power a particular fan is getting with a series resistor without actual measurements (e.g. the actual voltage drop on the resistor).

Regarding 1W resistors dissipating 6W, it's indeed a bad idea, but if the resistor is getting its share of the airflow from the fan, it may well be able to dissipate 6 times its nominal power.

PS. I had a very loud fan in an appliance once. Getting one of those tunable DC-DC buck converters worked like a charm: there's very little waste heat from it (running the fan at half the supply voltage results in ~10% losses in a buck converter relative to motor power, vs. 100% losses in a series resistor), the dependency between output voltage and RPM is much more linear than with a series resistor, there's no stall even at very low voltages, and it can be conveniently tuned with a screwdriver, no re-soldering required.
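The loss comparison in that last point can be sketched numerically. The figures below assume a hypothetical fan drawing 1 A at 6 V and a 90%-efficient buck converter (both assumptions, not measurements):

```python
# Rough loss comparison at half supply voltage. A series resistor must drop
# the other 6 V at the fan's full current, so it wastes as much power as the
# fan itself uses; a ~90%-efficient buck wastes roughly a tenth of that.
def resistor_loss(v_supply, v_fan, i_fan):
    return (v_supply - v_fan) * i_fan           # dropped voltage * fan current

def buck_loss(p_fan, efficiency=0.9):           # 90% efficiency is an assumption
    return p_fan * (1 - efficiency) / efficiency

p_fan = 6.0 * 1.0                               # hypothetical 6 W fan load
print(resistor_loss(12.0, 6.0, 1.0))            # 6.0 W wasted (100% of fan power)
print(round(buck_loss(p_fan), 2))               # 0.67 W wasted (~11%)
```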