Electronics – Why does infinitely decreasing resistance infinitely increase current?


So I have been mulling over Ohm's law for hours today. It makes sense to me that current gets bigger as voltage rises, since the voltage supplies more electrons and current is the number of electrons flowing. However, I'm confused by the idea that as resistance drops, current increases even when voltage stays constant. For example:

$$I = \frac{V}{R}$$
$$\frac{1\,\text{V}}{0.0001\,\Omega} = 10{,}000\,\text{A} = 10{,}000{,}000\,\text{mA}!$$

That's a big current from such a small voltage!
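The arithmetic in the question can be checked with a quick sketch (plain Python; the variable names are just for illustration):

```python
# Ohm's law: I = V / R. With volts and ohms, I comes out in amperes.
V = 1.0      # volts
R = 0.0001   # ohms
I = V / R    # amperes
print(f"{I:.0f} A = {I * 1000:.0f} mA")  # about 10,000 A, i.e. 10,000,000 mA
```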

My mental picture of a current is more electrons flowing. Electrons are supplied by the voltage, right? If we start off with a small voltage, say 1V, but infinitely decrease the resistance, we'll get a bigger and bigger current. How can this be? Isn't current the number of electrons flowing? With fewer electrons (because of less voltage), how can less resistance infinitely increase the current?

Best Answer

Your basic misconception is that voltage is "supplying the electrons". The electrons (or whatever the charge carriers are in whatever material you are using) are always there. Voltage is the push to get them moving at the macro level. It is this motion that we call current.

Current therefore is a function of two things: how hard you push on the charge carriers, and how much the material resists their movement. Double the voltage and you get double the current at the same resistance. Halve the resistance and you get double the current at the same applied voltage.
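Those two proportionalities can be sketched directly (a minimal illustration; the function name is mine, not standard):

```python
def current(volts, ohms):
    """Ohm's law: I = V / R."""
    return volts / ohms

base = current(1.0, 2.0)              # 0.5 A
assert current(2.0, 2.0) == 2 * base  # double the voltage: double the current
assert current(1.0, 1.0) == 2 * base  # halve the resistance: double the current
```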

In theory, the current would go to infinity as the resistance approaches zero. In practice, a real voltage source can only supply so much current before it can no longer maintain its rated voltage.
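One common way to model that practical limit is to give the source an internal resistance in series with the load; the 0.05 Ω value below is an arbitrary assumption, not from the answer:

```python
def load_current(v_source, r_internal, r_load):
    # Non-ideal source model: internal resistance sits in series with the
    # load, so the current is capped at v_source / r_internal as r_load -> 0.
    return v_source / (r_internal + r_load)

for r_load in (1.0, 0.1, 0.001, 0.0):
    print(f"R_load = {r_load} ohm -> I = {load_current(1.0, 0.05, r_load):.2f} A")
# the current approaches 1.0 / 0.05 = 20 A instead of going to infinity
```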