Electronic – How to choose the correct resistor (ohms and wattage) for 6 LEDs connected in a series circuit

ohms-law, power-dissipation, resistors

Let's suppose I have six LEDs, each rated at 3.2 V and 20 mA, and all six of them are connected in one series circuit. I know there are better setups, but for this question I'm using this configuration.

I'm using a single power source: two 12 V A23 batteries in a two-battery holder (24 V total), with a 240 Ω resistor between the power source and the first LED.

I calculated the resistance for the resistor like this using Ohm's law:

$$
R = \frac{\Delta V}{I} = \frac{24\,\text{V} - (3.2\,\text{V} \times 6\ \text{LEDs})}{0.020\,\text{A}} = 240\,\Omega
$$
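The resistance calculation above can be sketched in a few lines of Python (variable names are my own, values are from the question):

```python
# Series-resistor calculation for the LED string in the question.
V_supply = 24.0      # two 12 V A23 batteries in series
V_led = 3.2          # forward voltage per LED
n_leds = 6
I = 0.020            # 20 mA target current

V_drop = V_supply - n_leds * V_led   # voltage the resistor must drop
R = V_drop / I                       # Ohm's law: R = V / I
print(f"R = {R:.0f} ohm")            # -> R = 240 ohm
```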

Power dissipated by the resistor:

$$
P = \left(24\,\text{V} - 3.2\,\text{V} \times 6\ \text{LEDs}\right) \times 0.020\,\text{A} = 0.096\ \text{W}
$$
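Continuing the same sketch, the power dissipated by the resistor follows from P = V × I (again, a quick check in Python, not anything beyond the arithmetic already shown):

```python
# Power dissipated by the 240 ohm series resistor at the nominal point.
V_drop = 24.0 - 6 * 3.2     # 4.8 V across the resistor
I = 0.020                   # 20 mA
P = V_drop * I              # P = V * I
print(f"P = {P*1000:.0f} mW")   # -> P = 96 mW
```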

So should I buy a 1/10 W, 240 Ω resistor for this circuit?

If my calculations above are wrong, could anyone kindly point that out, or better yet show the correct calculation and the proper resistor with the correct resistance and wattage?

Best Answer

Your calculations are correct. I would give the resistor some headroom, power-wise: if the LEDs' forward voltage is a bit less than 3.2 V, say 3.1 V (it can vary by much more than that!), the current will rise to 22.5 mA and the voltage drop across the resistor to 5.4 V. The power will then be about 120 mW, so a 100 mW resistor won't do. Moreover, 1/4 W is a more standard value, and probably cheaper.

So go for a 240 Ω/0.25 W resistor.
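The headroom argument in the answer can be checked numerically. The sketch below (my own, using the question's values) sweeps a few plausible forward voltages and shows how the resistor's dissipation grows as the LED drop falls:

```python
# Worst-case check: lower LED forward voltage means more voltage across
# the fixed 240 ohm resistor, hence more current and more dissipation.
R = 240.0
V_supply = 24.0
n = 6

for v_led in (3.2, 3.1, 3.0):        # plausible forward-voltage spread
    v_drop = V_supply - n * v_led    # voltage across the resistor
    i = v_drop / R                   # actual string current
    p = v_drop * i                   # resistor dissipation, P = V^2 / R
    print(f"Vf={v_led} V: I={i*1000:.1f} mA, P={p*1000:.1f} mW")
```

At 3.1 V per LED the dissipation already exceeds the 100 mW rating of a 1/10 W part, which is why the answer recommends 1/4 W.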