Electronic – Voltage regulator minimum input voltage

voltage-regulator

I have a device (a Raspberry Pi) with an SE8117T33 voltage regulator for 3.3 V. The datasheet says the dropout voltage is 1.1 V, so the minimum input voltage should be 4.4 V, right? I'm a beginner in electronics, and I want to understand which variables are considered when calculating the ideal input voltage for a voltage regulator. A tutorial I'm following specifies an input voltage between 4.75 V and 5.25 V, and I want to understand where those values came from.

Best Answer

Yes, the dropout voltage is the headroom the regulator needs to work with. This is the minimum amount the input voltage must be above the output voltage for the regulator to work properly. So as you say, a 3.3 V regulator with 1.1 V dropout requires at least 4.4 V input.
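If it helps to see that arithmetic spelled out, here is a minimal Python sketch of the calculation (the variable names are my own; the 1.1 V dropout is the figure you quote from the SE8117T33 datasheet):

    # Minimum input voltage for a linear regulator:
    #   V_in(min) = V_out + V_dropout
    v_out = 3.3      # regulator output voltage, volts
    v_dropout = 1.1  # worst-case dropout voltage from the datasheet, volts

    v_in_min = v_out + v_dropout
    print(f"Minimum input voltage: {v_in_min:.1f} V")  # prints 4.4 V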

Stuff happens, so it's usually a good idea to give the regulator a little more room. However, going too far wastes more power, so there is a tradeoff. The range you quote of 4.75 to 5.25 volts could well be a reasonable tradeoff. Basically, that's a "5 V" supply with the common ±5 % tolerance: 5 V × 0.95 = 4.75 V and 5 V × 1.05 = 5.25 V.

The output current of a linear regulator is essentially its input current (minus the small ground-pin current, which is small enough to ignore for the point I'm trying to make). Since the input voltage is higher than the output but the currents are the same, the output has less power. The difference between input power and output power is wasted in the regulator as heat. Another way to look at this is that the regulator dissipates the input-to-output voltage difference times the current as heat. For example, consider a case where the input is 5.0 V, the output 3.3 V, and the current 500 mA. The voltage across the regulator is 5.0 V - 3.3 V = 1.7 V. That times 500 mA is 850 mW, which goes into heating the regulator. That would be fine for something like a TO-220 package, but a SOT-23, for example, would burn up at 850 mW.
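To make the power math concrete, here is a small Python sketch of that dissipation calculation (numbers taken from the example above; the ground-pin current is ignored as noted):

    # Heat dissipated in a linear regulator:
    #   P_diss = (V_in - V_out) * I_out
    v_in = 5.0   # input voltage, volts
    v_out = 3.3  # regulator output voltage, volts
    i_out = 0.5  # load current, amps

    p_diss = (v_in - v_out) * i_out
    print(f"Dissipation: {p_diss * 1000:.0f} mW")  # prints 850 mW

For scale, a SOT-23 typically has a junction-to-ambient thermal resistance on the order of 250 °C/W, so 850 mW would imply a junction temperature rise of over 200 °C, far past any safe operating limit. That's why the package choice matters as much as the math.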