Electronic – Calculating parallel resistance of a rated bulb


I have a bulb rated 110 V, 60 W in series with another bulb rated 110 V, 110 W, and the pair is powered from a 220 V source. What resistance must be added in parallel with the first bulb so that each bulb gets its rated power?

The way I approached this was to first find the resistance of each bulb, then find the voltage drop across each bulb via the voltage-divider principle. After that, I'm stuck.


Best Answer

As Steven states, this only holds if the bulbs behave like ordinary (linear) resistors.

The solution is simple: the voltage across the 'divider' splits evenly when the power dissipated in the top half equals the power dissipated in the bottom half.

  • Power in the top half: 60 W.
  • Power in the bottom half: 110 W.

To have equal power in the top and bottom halves, you have to add an extra \$110 W - 60 W = 50 W\$ of dissipation in parallel with the existing top bulb.
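To see why this balances the circuit, compare the rated currents. A quick sketch (variable names are mine; the values come from the question):

```python
# Rated current of each bulb at its rated voltage: I = P / U
U = 110.0             # rated voltage of each bulb, volts
I_top = 60.0 / U      # 60 W bulb: about 0.545 A
I_bottom = 110.0 / U  # 110 W bulb: exactly 1 A

# In series, the same current must flow through both halves.
# The parallel resistor carries the difference around the top bulb.
I_resistor = I_bottom - I_top   # about 0.455 A
P_resistor = U * I_resistor     # about 50 W, the missing power
print(I_resistor, P_resistor)
```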

Ohm's law:

\$R = \dfrac{U}{I}\$

and

\$I = \dfrac{P}{U}\$

Substituting the second equation into the first gives the familiar: \$R = \dfrac{U^2}{P}\$

Now fill in the details:

\$R = \dfrac{U^2}{P} = \dfrac{(110 V)^2}{50W} = 242 \Omega\$