Noob trying to select resistor for 3 LEDs

led · parallel · resistors

So I'm making a parallel LED circuit using three Cree XLamp LEDs, which typically run at ~3.1 V @ 350 mA but can handle up to 1000 mA. I'm using a separate resistor for each LED.

So my logic may be flawed here, but this is how I'm thinking (sorry if I seem stupid):

My power source is 12 V and 1 A, which means if I use three equal resistors, each LED will draw 0.333 A (333 mA), close to the typical value.

Calculation of resistor: R = V/I, so R = (12 − 3.1) / (0.333 × 3) = 8.9 ohms (I believe I'm supposed to multiply the denominator by the number of lights I have, correct?). So I would use a 10 ohm resistor.

Wattage of resistor: P = VI, so P = 8.9 ohms × 1 amp = 8.9 watts. So I would use a 10 ohm, 10 watt resistor.

Is this correct?

Also, say I had only one LED. Would it draw the maximum amount of current from the power source, i.e. 1 amp?

Sorry, I am really new to this stuff. I don't know if I'm doing the correct calculations or not.

Best Answer

Your calculations for both the resistor value and wattage are off.

For the resistor, you show the formula R = V / I, but then calculate using the entire current for all three LEDs (1 A) instead of the current for just one LED.

So it should be

$$R = \frac{V}{I} = \frac{12 - 3.1}{0.333} = 26.7\ \Omega$$
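
As a quick sanity check, here is a minimal Python sketch of that per-LED calculation, using the values assumed above (12 V supply, ~3.1 V typical forward drop, 0.333 A per LED); the variable names are just illustrative:

```python
# Per-LED series resistor, assuming a 12 V supply, ~3.1 V forward drop,
# and 0.333 A through each LED (not the full 1 A supply rating).
supply_v = 12.0   # supply voltage, volts
led_vf = 3.1      # typical forward voltage of one LED, volts
led_i = 0.333     # current through one LED and its resistor, amps

r = (supply_v - led_vf) / led_i
print(f"R = {r:.1f} ohms")  # prints: R = 26.7 ohms
```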

For the wattage, you are trying to use the formula P = VI, but then you substitute R for the V and use the current for all three resistors (1 A) instead of just one.

So the correct calculation would be

$$P = VI = (12 - 3.1) \times 0.333 = 2.96\ \text{W}$$

or you could use:

$$P = I^{2} R = (0.333)^{2} \times 26.7 = 2.96\ \text{W}$$

So you could use a 27 ohm, 5 W resistor.
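
If it helps, the same sketch extended to the dissipation check, again with the assumed 12 V / 3.1 V / 0.333 A figures; both forms of the power formula should give the same number:

```python
# Power dissipated in one LED's series resistor, computed two ways.
supply_v = 12.0
led_vf = 3.1
led_i = 0.333
r = (supply_v - led_vf) / led_i       # ~26.7 ohms, as calculated above

p_vi = (supply_v - led_vf) * led_i    # P = V * I, using the drop across the resistor
p_i2r = led_i ** 2 * r                # P = I^2 * R, same result
print(f"P = {p_vi:.2f} W (check: {p_i2r:.2f} W)")  # prints: P = 2.96 W (check: 2.96 W)
```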
