Electrical – How to take into account dropout voltage of a buck converter to find minimum input voltage required

buck, dc/dc converter, power

In a DC/DC buck converter's datasheet, dropout voltage vs. output current is given in a graph like the one seen below. To find the worst-case minimum input voltage, let's say at 3.3 V output and 1 A output current, for this converter I would look at the 105 °C line and see there will be a 1.5 V dropout. Therefore, if the minimum input voltage for proper operation is 5 V, then I have to provide a minimum of 6.5 V for proper operation at the highest temperature condition. Am I interpreting this correctly?

[Figure: dropout voltage vs. output current, with curves at several temperatures including 105 °C]

Best Answer

Actually, the ~1.5 V dropout voltage refers to the minimum difference between the actual input and output voltages, not between the nominal input voltage and the output voltage. Thus you'll need at least \$ 1.5V + 3.3V = 4.8V \$ of input voltage at 1 A output current.

Similarly, if you want to run your circuit at 105 °C and draw 3 A, you'll need at least \$ 2.2V + 3.3V = 5.5V \$.
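
As a quick sanity check, here is a minimal Python sketch of this rule, \$ V_{in(min)} = V_{out} + V_{dropout}(I_{out}, T) \$. The two curve points (1.5 V at 1 A and 2.2 V at 3 A, both on the 105 °C line) come from the figures above; the linear interpolation between them is only an assumption, since the actual values must be read from the datasheet curve.

```python
# Sketch: minimum input voltage = output voltage + dropout voltage at the load current.
# The 105 °C dropout points below are the two values quoted above; other currents are
# estimated by linear interpolation (an assumption, not datasheet data).

dropout_105c = [(1.0, 1.5), (3.0, 2.2)]  # (output current [A], dropout voltage [V])

def dropout_at(current_a, curve):
    """Estimate dropout voltage at a given load current by linear interpolation."""
    points = sorted(curve)
    if current_a <= points[0][0]:
        return points[0][1]
    if current_a >= points[-1][0]:
        return points[-1][1]
    for (i0, v0), (i1, v1) in zip(points, points[1:]):
        if i0 <= current_a <= i1:
            return v0 + (v1 - v0) * (current_a - i0) / (i1 - i0)

def min_input_voltage(v_out, current_a, curve):
    """Minimum input voltage required for regulation at this load current."""
    return v_out + dropout_at(current_a, curve)

print(min_input_voltage(3.3, 1.0, dropout_105c))  # 4.8 V at 1 A
print(min_input_voltage(3.3, 3.0, dropout_105c))  # 5.5 V at 3 A
```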
