Why isn't maximum current forced upon a circuit by a power source?

amperage, power supply, voltage

I asked a similar question here about this, but I still don't quite understand.

Say I grab onto two wires from a standard 120 V, 20 A wall socket. I would fry: I would have 20 A at 120 V forced on my body.

But when speaking in terms of circuits, as in my linked question, if I have a power supply that outputs at most 100 mA at 5 V, for some reason my device draws only exactly what it needs. Why is this?

Most things flow from an area of high concentration to low concentration, taking the path of least resistance. If you connect two wires to a power supply, why doesn't the maximum number of electrons want to go through your circuit?

If current behaves this way, why do you have to regulate voltage?

Best Answer

The "120 V 20 A" rating of a wall outlet means it will supply 120 V and up to 20 A.

What is implicit in this spec, and most power supply specs, is that the power supply is considered a voltage source. That means it will try to keep its output voltage constant. It only has this single degree of freedom. The load then decides how much current to draw at that voltage.

For example, let's say your overall resistance (mostly due to relatively dry skin where the current enters and leaves your body) is 10 kΩ. You grab the two wires from the 120 V outlet and (120 V)/(10 kΩ) = 12 mA will flow thru you. That's way way less than 20 A, but still enough to kill you.
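As a minimal sketch, the voltage-source model above is just Ohm's law: the supply fixes the voltage, and the load's resistance sets the current. The function name and the 10 kΩ figure are illustrative, taken from the example above:

```python
def load_current(source_voltage_v: float, load_resistance_ohm: float) -> float:
    """Current drawn from an ideal voltage source by a resistive load (Ohm's law)."""
    return source_voltage_v / load_resistance_ohm

current_a = load_current(120.0, 10e3)
print(f"{current_a * 1000:.0f} mA")  # -> 12 mA, far below the 20 A rating
```

The 20 A on the outlet never enters the calculation; it is only a ceiling on what the wiring can safely deliver.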

There are power supplies that regulate the current. These are unusual, and will be clearly labeled as such. A constant current supply might be labeled 1 A 50 V. That means it will put out 1 A of current, but can only go up to 50 V. If it would take more than 50 V to get 1 A of current thru the load, the output will sit at 50 V and the current will be below 1 A.
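A sketch of that behavior, assuming a purely resistive load and using the hypothetical "1 A, 50 V" supply from the paragraph above (names are illustrative):

```python
def cc_supply_output(i_set_a: float, v_max_v: float,
                     load_resistance_ohm: float) -> tuple[float, float]:
    """Return (voltage, current) at the load for a current-regulated supply.

    The supply drives i_set_a through the load unless that would require
    more than v_max_v; then it sits at v_max_v and the current falls short.
    """
    v_needed = i_set_a * load_resistance_ohm
    if v_needed <= v_max_v:
        return v_needed, i_set_a                    # regulating current
    return v_max_v, v_max_v / load_resistance_ohm   # voltage-limited

print(cc_supply_output(1.0, 50.0, 30.0))   # (30.0, 1.0): 1 A only needs 30 V
print(cc_supply_output(1.0, 50.0, 100.0))  # (50.0, 0.5): clamped at 50 V
```

Note the symmetry with the voltage-source case: there, voltage is held and current follows the load; here, current is held and voltage follows, up to the limit.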