Electrical – Power consumed by light bulbs in series, parallel


(schematic omitted; created using CircuitLab)

So the problem I've been given is to determine whether the power consumed by a string of lights increases or decreases as more lights are added in parallel (and a similar problem concerning an increasing number of bulbs in series). The problem states that the source is a standard wall electrical outlet (I'll assume it's like the ones in the US, since that's where I am).

Now, I don't know a whole lot about electrical standards anywhere in the world, so I'm not sure what exactly comes out of a wall outlet in terms of voltage and current. My understanding is that the current alternates (which doesn't really make sense to me, because my light bulbs aren't flickering on and off like a sine wave). If the resistance remains constant, that means the voltage is alternating too, since \$V = I \times R\$, so the value of \$V\$ fluctuates proportionally with \$I\$.
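
In other words, if the wall voltage really is a sine wave and a bulb is just a resistance \$R\$, I'd expect Ohm's law to hold at every instant, something like:

\$\$v(t) = V_p \sin(2\pi f t), \qquad i(t) = \frac{v(t)}{R} = \frac{V_p}{R}\sin(2\pi f t)\$\$

so the current would rise and fall in step with the voltage.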

I was under the impression that, at least in the US, wall outlets supply something like 120V, and the current actually drawn varies depending on your load. I thought that since the current is alternating, then as discussed above, the voltage must be alternating too, and maybe the 120V refers to the amplitude of the alternating voltage. Please correct me if I'm wrong, as this seems crucial to understanding the problem.

So if you add light bulbs (resistance) in parallel, the overall resistance of the load decreases. If your maximum voltage remains constant, then more current would be drawn, no? Similarly for bulbs in series: the total resistance increases as bulbs are added, so the drawn current would be reduced. So for parallel bulbs, your power would increase, since \$P = I^2 \times R\$ and the \$I^2\$ term is increasing faster than the \$R\$ term is decreasing. Likewise for series, the power seems like it would decrease, since the \$I^2\$ term decreases faster than the \$R\$ term increases.
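
If I try some made-up numbers to sanity-check this (say each bulb is 144 Ω and the supply is a steady 120 V): one bulb draws \$I = 120/144 \approx 0.83\$ A, so \$P = I^2 R \approx 100\$ W. Two bulbs in parallel give \$R = 72\ \Omega\$, \$I \approx 1.67\$ A, \$P \approx 200\$ W; two in series give \$R = 288\ \Omega\$, \$I \approx 0.42\$ A, \$P = 50\$ W. So the numbers seem to back up the trend.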

How far off am I?

Best Answer

You're right in thinking that current and voltage vary with time in an alternating current circuit. However, when analysing circuits with alternating current and simple resistive loads, it is much easier to work with average power instead of instantaneous power. You can use RMS voltages and RMS currents to simplify your analysis.
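
To see why RMS values work, take a purely resistive load driven by \$v(t) = V_p \sin(\omega t)\$ and average the instantaneous power over one cycle (using the fact that \$\sin^2\$ averages to \$\tfrac{1}{2}\$):

\$\$P_{avg} = \frac{1}{T}\int_0^T \frac{v(t)^2}{R}\,dt = \frac{V_p^2}{2R} = \frac{(V_p/\sqrt{2})^2}{R} = \frac{V_{rms}^2}{R}\$\$

so the familiar DC power formulas carry over unchanged once you substitute RMS values.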

The 120V rating you get out of a wall outlet is an RMS value. You can multiply this number by \$\sqrt2\$ to obtain the amplitude of the sine wave.
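
Numerically:

\$\$V_p = \sqrt{2} \times 120\ \mathrm{V} \approx 170\ \mathrm{V}\$\$

so the instantaneous voltage actually swings between about +170 V and −170 V, completing 60 full cycles per second in the US.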

Your circuit analysis does seem right for the most part. Assuming your input voltage remains constant, adding resistances in parallel decreases the overall resistance, which implies increased power. Adding resistances in series increases the overall resistance, which implies decreased overall power.
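
Concretely, for \$N\$ identical bulbs of resistance \$R\$ each (idealising the bulbs as fixed resistors; a real filament's resistance rises with temperature):

\$\$R_{parallel} = \frac{R}{N} \;\Rightarrow\; P = \frac{N\,V_{rms}^2}{R}, \qquad R_{series} = N R \;\Rightarrow\; P = \frac{V_{rms}^2}{N R}\$\$

so total power scales linearly with \$N\$ in parallel and as \$1/N\$ in series.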

If you plug the fixed RMS voltage into the power formula, you can see directly how the total resistance \$R\$ of the circuit sets its power consumption:

\$P = \frac{V_{rms}^2}{R} = \frac{120^2}{R} = \frac{1.44\times10^4}{R}\ \mathrm{W}\$ (with \$R\$ in ohms)
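
As a quick numeric sketch of that relationship (Python, using the hypothetical 144 Ω bulb resistance and the ideal fixed-resistance assumption from above):

```python
# Sketch: average power drawn from a 120 V RMS outlet by N identical bulbs.
# Assumes ideal bulbs with a fixed 144-ohm resistance (hypothetical value;
# a real filament's resistance rises as it heats up).

V_RMS = 120.0   # outlet voltage, volts RMS
R_BULB = 144.0  # assumed resistance of one bulb, ohms

def average_power(n: int, wiring: str) -> float:
    """Average power in watts for n identical bulbs in 'series' or 'parallel'."""
    if wiring == "parallel":
        r_total = R_BULB / n   # n equal resistors in parallel
    elif wiring == "series":
        r_total = R_BULB * n   # n equal resistors in series
    else:
        raise ValueError("wiring must be 'series' or 'parallel'")
    return V_RMS ** 2 / r_total  # P = V_rms^2 / R

for n in (1, 2, 3, 4):
    print(f"{n} bulb(s): parallel {average_power(n, 'parallel'):6.1f} W, "
          f"series {average_power(n, 'series'):5.1f} W")
# 1 bulb(s): parallel  100.0 W, series 100.0 W
# 2 bulb(s): parallel  200.0 W, series  50.0 W
# 3 bulb(s): parallel  300.0 W, series  33.3 W
# 4 bulb(s): parallel  400.0 W, series  25.0 W
```

Each added parallel bulb adds another 100 W; each added series bulb divides the original 100 W further, matching your reasoning.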

Lastly, there are a few reasons why you might not see the light bulb flicker as the current changes direction. One is that a light bulb doesn't dim instantly when the power crosses zero (switch off a lamp and you'll see the filament fade out gradually), so it keeps producing light while the instantaneous power momentarily passes through zero. Furthermore, that happens twice per cycle, at 100/120 Hz, which is generally fast enough that the flicker is indiscernible to humans.
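
You can see the double-frequency ripple directly from the instantaneous power of a resistive load:

\$\$p(t) = \frac{v(t)^2}{R} = \frac{V_p^2}{R}\sin^2(\omega t) = \frac{V_p^2}{2R}\bigl(1 - \cos(2\omega t)\bigr)\$\$

The \$\cos(2\omega t)\$ term is the flicker: it oscillates at twice the line frequency, around the constant average \$V_p^2/2R = V_{rms}^2/R\$.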