Three lights are connected in parallel across a 120 V source. If one of the lights burns out, the remaining two:
a. will not light
b. will light with the same brightness
c. will light brighter
d. will light dimmer
If one light burns out, the overall load resistance R will increase (since the three lights were in parallel), which should decrease the current and hence dim the other two lights. But the answer is "same brightness". What's wrong with my logic, and why exactly can't we apply it here?
OR
is this completely related to the electromagnetic theory of light, where brightness depends on the applied voltage that accelerates electrons, which emit light when they fall from a higher energy level to a lower one?
Best Answer
Each bulb runs an independent current loop across a constant voltage source. If one bulb goes out, it does not affect the voltage, current, or brightness of any other loop.
Each loop current is limited by the effective resistance of the hot bulb.
But the total of all three loop currents is reduced by 1/3 when one bulb burns out.
Each remaining bulb stays lit as long as the constant voltage is applied.
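The loop argument above can be checked with a quick calculation. This is a minimal sketch, not part of the original answer; the 240 Ω hot resistance is an assumed value (what a 60 W bulb presents at 120 V, since R = V²/P).

```python
# Sketch: three identical bulbs in parallel across a constant 120 V source.
# Each parallel branch sees the full source voltage, so removing one branch
# (an open circuit) leaves the other branches' currents unchanged.
V = 120.0   # source voltage (volts)
R = 240.0   # assumed hot resistance of each bulb (ohms); a 60 W bulb at 120 V

def branch_current(v, r):
    # Ohm's law for one independent loop
    return v / r

# With all 3 bulbs lit:
i_each_3 = branch_current(V, R)   # 0.5 A per bulb
total_3 = 3 * i_each_3            # 1.5 A drawn from the source

# With one bulb burnt out, 2 remain:
i_each_2 = branch_current(V, R)   # still 0.5 A per bulb
total_2 = 2 * i_each_2            # 1.0 A: source current drops by 1/3

# Per-bulb power, and hence brightness, is unchanged:
p_each_3 = V * i_each_3           # 60 W
p_each_2 = V * i_each_2           # 60 W
print(p_each_3 == p_each_2)       # True: same brightness
```

The asker's logic would be right for bulbs in *series*, where one burnt-out filament breaks the single loop; in parallel each bulb's loop is independent.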
You can run some experiments to test the Arrhenius effect: use a light dimmer and a temperature probe and see if you can double the bulb's life by reducing the filament temperature by 10 °C. This doesn't mean that dropping the temperature by 1000 °C would make it last 2^100 times longer, because other defect mechanisms creep in.
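The "double the life per 10 °C drop" rule of thumb mentioned above can be sketched as a simple exponential model. This is an illustration only, assuming the 10 °C doubling interval from the text; it is not bulb-specific data, and as noted it breaks down over large temperature swings.

```python
# Sketch of the rule of thumb: life doubles for every 10 degC reduction
# in filament temperature (assumed doubling interval, per the text above).
def relative_life(delta_t_c, doubling_interval_c=10.0):
    """Life multiplier for a filament run delta_t_c degrees cooler."""
    return 2.0 ** (delta_t_c / doubling_interval_c)

print(relative_life(10))    # 2.0: a 10 degC drop doubles the life
print(relative_life(1000))  # 2**100: absurdly large, which is exactly why
                            # the rule fails there -- other failure modes
                            # dominate long before such gains are realized
```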
Now, how many engineers does it take to change a light bulb?