Electrical – Why the voltage in a circuit changes when the battery has a resistance but not when it doesn't

resistance · voltage · voltmeter

I was learning the basics of electronics with an interactive simulation of a DC circuit on phet.colorado.edu and I noticed this weird pattern.

[screenshots of the two simulated circuits: one with an ideal battery, one with a battery that has a 9 Ω internal resistance driving a 40 Ω load]

And also, in real life, with a real battery, which rule applies? Is the voltage between the two terminals of a battery always equal to the battery's rated voltage, or does it depend on the resistance in the circuit?

Best Answer

To a first approximation, you can think of a battery as an ideal fixed voltage source with a resistance in series. Your overall circuit therefore looks like this:

[schematic: ideal voltage source V1 in series with internal resistance R1, driving the load R2]

R1 and R2 form a voltage divider so that the voltage you see coming out of the battery is less than the battery's internal voltage.
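Written out, the divider relation is V_load = V1 × R2 / (R1 + R2), so the terminal voltage is less than V1 whenever any current flows through R1.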

For good batteries properly suited to the application, R1 is "low" compared to the effective load resistance (R2). The voltage drop is therefore usually small.

In your second example above, R1 is 9 Ω. That's significant when the load is 40 Ω. In fact, with those two values, the voltage divider gives you (40 Ω)/(9 Ω + 40 Ω) ≈ 82% of the battery's internal voltage.

If you just connect a voltmeter to the battery, then R2 is very large, like 10 MΩ, so you won't notice the tiny voltage drop across R1. In fact, this is how you measure the battery's internal voltage (V1 in the schematic above).
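As a sanity check, here is a minimal sketch of that divider arithmetic in Python. The 9 V source value is an assumption for illustration; the actual battery voltage in the simulation isn't shown here.

```python
def terminal_voltage(v1, r1, r2):
    """Terminal voltage of a battery modeled as an ideal source v1
    in series with internal resistance r1, driving a load r2."""
    return v1 * r2 / (r1 + r2)

V1 = 9.0   # assumed source voltage, volts (not given in the question)
R1 = 9.0   # internal resistance, ohms (from the second example)

# 40-ohm load: the divider knocks the reading down to 40/49 ~= 82% of V1.
print(terminal_voltage(V1, R1, 40.0))   # ~7.35 V

# 10-megaohm voltmeter: the drop across R1 is negligible, so we read ~V1.
print(terminal_voltage(V1, R1, 10e6))   # ~8.999992 V
```

The second case is why an unloaded voltmeter reading is a good stand-in for V1: with R2 at 10 MΩ, R1 drops only microvolts.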

Batteries are complicated, and in reality things are not this simple. The biggest gotcha is that V1 also varies with a number of conditions: temperature, the battery's state of discharge, its recent load history, etc. But the above is a good enough first approximation to explain what you saw.