Electronic – Resistor requirement for 3.3 V

3.3v, led, resistors

I have a quick question: can I safely power an LED with 3.3 V without a resistor? I have some LEDs that I typically power with 5 V and a 220 Ω resistor, but I can only supply 3.3 V at the moment.

Additionally, I have push-buttons that I typically pull down with 10 kΩ resistors when using 5 V. What resistor should I use in this case? Is there a general rule for choosing the resistor given the supply voltage?

Thank you in advance for your help!

Best Answer

Quick answer

If you were using 220 Ω from a 5 V supply with a 2.1 V red LED, the resistor was dropping 5 − 2.1 = 2.9 V. Dropping the supply to 3.3 V leaves only 3.3 − 2.1 = 1.2 V across the resistor.

To keep the same current, scale the resistor by the same ratio: R = 1.2 / 2.9 × 220 ≈ 91 Ω.
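As a quick check, here is a minimal sketch of that scaling, assuming the 2.1 V red-LED forward voltage and 220 Ω / 5 V starting point from above (the variable names are only for illustration):

```python
# Scale an existing LED resistor when the supply drops from 5 V to 3.3 V.
# Assumes a red LED with Vf ~= 2.1 V, as in the answer above.

V_OLD, V_NEW = 5.0, 3.3    # old and new supply voltages
VF = 2.1                   # assumed red-LED forward voltage
R_OLD = 220.0              # resistor that worked at 5 V

drop_old = V_OLD - VF      # 2.9 V across the resistor at 5 V
drop_new = V_NEW - VF      # 1.2 V across the resistor at 3.3 V

# Keep the same LED current: I = drop_old / R_OLD = drop_new / R_NEW
i_led = drop_old / R_OLD
r_new = drop_new / drop_old * R_OLD

print(f"LED current: {i_led * 1000:.1f} mA")   # ~13.2 mA
print(f"New resistor: {r_new:.0f} ohms")       # ~91 ohms (use a 91 or 100 ohm standard value)
```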

A very accurate method I use: estimate the LED's "dim" threshold voltage Vt, i.e. the voltage at about 10% of Imax, which is often about 10% below the rated forward voltage. Then use the difference between the supply voltage and that threshold Vt to determine the total series resistance.

I have found the LED's effective series resistance (ESR) to be roughly inverse to its power rating: a 1/16 W LED is about 16 Ω, and a 1 W LED is approximately 1 Ω or less. Thus the added series resistor changes with the power rating of the LED.

This may sound complicated, but with practice it is trivial. The total series resistance you need to drop from 3.3 V depends on the LED curve's ESR plus the external series resistance required to hit the desired current within reasonable tolerances. If you search my answers in the window at the top of this page, you will find I have written dozens of examples on this topic.
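The sketch below puts the three paragraphs above together: the 10% threshold rule and the power-rating ESR rule are taken from the answer, while the specific LED numbers (2.1 V, 20 mA, 1/16 W) are assumptions for illustration only:

```python
# Estimate the external LED resistor from a 3.3 V supply using the
# threshold + ESR heuristic described above.

V_SUPPLY = 3.3     # supply voltage
VF_RATED = 2.1     # rated forward voltage at Imax (red LED, assumed)
I_TARGET = 0.020   # desired LED current, 20 mA (assumed)
P_RATING = 1 / 16  # LED power rating in watts (small indicator LED, assumed)

# Threshold voltage Vt: roughly 10% below rated Vf (current falls to ~10% Imax there).
v_t = 0.9 * VF_RATED

# Rule of thumb from the answer: ESR is roughly inverse to the power rating
# (~16 ohms for a 1/16 W LED, ~1 ohm for a 1 W LED).
esr = 1.0 / P_RATING

# Total series resistance from supply down to the threshold, then subtract the
# LED's own ESR to get the external resistor.
r_total = (V_SUPPLY - v_t) / I_TARGET
r_external = max(r_total - esr, 0.0)

print(f"Vt ~= {v_t:.2f} V, ESR ~= {esr:.0f} ohms")
print(f"Total series R ~= {r_total:.0f} ohms, external resistor ~= {r_external:.0f} ohms")
```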

Also compute the I²R dissipation in the resistor for high-current LEDs, so you can size its power rating.
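For example, a quick dissipation check; the 350 mA current and 3.4 Ω resistor below are assumed values, not from the question:

```python
# Resistor dissipation check for a high-current LED: P = I^2 * R
i_led = 0.350      # 350 mA high-current LED (assumed)
r_series = 3.4     # example series resistance in ohms (assumed)

p_resistor = i_led ** 2 * r_series
print(f"Resistor dissipation: {p_resistor:.2f} W")  # ~0.42 W, so pick a resistor rated well above that
```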

You may learn how to do this or not; that is your choice.

E.g., in the search tab above, type or paste user:"Tony Stewart" LED ESR R (the quote marks are needed because of the space in my name). I see I have 63 hits for these keywords.

For users wanting to search their own answers, enter "user:me" followed by keywords.
