I feel like I am going a little crazy here, because the more I read about these things, the less I understand them.
So I understand that an ideal voltage source has zero output impedance (no extra resistance in series between itself and the load), ensuring that nothing else drops the voltage along the way, so the load gets exactly the listed voltage.
An ideal current source has infinite output impedance (infinite resistance in parallel between itself and the load), ensuring that the load gets every bit of the listed current (none of it is diverted elsewhere).
I understand that these "ideal" sources don't exist in practice; in reality there is always some resistance. What I don't understand is why we are then able to switch between them, or how we can talk about insensitivity to load changes when Ohm's Law says V = IR. If I have a voltage source, is that not fixed? And if I add a bigger resistor, doesn't that mean less current must flow in turn? Or for a current source, if I change the resistor, am I not then changing the voltage? And since I am able to change the other variable, why are we able to swap between them? (By swap/switch I mean replacing voltage sources with current sources and vice versa.)
None of it makes any sense to me, because I rarely see examples with numbers that illustrate what's going on and how I am supposed to think about these things. Can anyone please give a few examples showing these "ideal" forms, why they are unrealistic, why we are somehow able to switch between them, and what a "real" example might look like?
Best Answer
> I understand that these "ideal" sources don't exist in practice.

They do, to a point. Voltage regulators maintain a constant output voltage up to some maximum current.
> In reality there is always some resistance.

No. The voltage regulator will maintain the voltage across a range of current draw. Since \$ \frac {dV}{dI} = 0 \$ over that range of current, its output impedance is zero.
> What I don't understand is why we are then able to switch between them ...

Switch between what?
> If I have a voltage source, is that not fixed?

Yes. So if I connect a 1 kΩ load across a 5 V supply I draw 5 mA. If I connect a 100 Ω load across the same supply I draw 50 mA. The voltage remains at 5 V. It is ideal up to the designed current limit of the PSU.
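The numbers above can be checked with a short sketch. The 5 V and load values come from the example; the 0.05 Ω output impedance used to model a "real" supply is an assumed figure, just to show how small the voltage sag is:

```python
# Ideal 5 V supply: current scales with 1/R, voltage stays fixed (Ohm's law).
for r_load in (1000.0, 100.0):
    i = 5.0 / r_load
    print(f"R = {r_load:6.0f} ohm -> I = {i * 1000:.1f} mA, V = 5.0 V")

# A real supply modelled as 5 V behind an assumed 0.05 ohm output impedance:
# the load voltage sags only slightly as the current rises.
v_src, r_out = 5.0, 0.05
for r_load in (1000.0, 100.0):
    v_load = v_src * r_load / (r_out + r_load)  # voltage divider
    print(f"R = {r_load:6.0f} ohm -> V_load = {v_load:.4f} V")
```

Because the output impedance is tiny compared to any sensible load, the output looks "insensitive to load changes" even though the current varies fifty-fold.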
> And if I add a bigger resistor, doesn't that mean less current must flow in turn?

For a constant voltage supply, yes.
> Or for a current source, if I change the resistor, am I not then changing the voltage?

Yes, of course.
Figure 1. A random bench PSU image.
The power supply is operating as an ideal current source for loads between 0 Ω and 30 V / 20 mA = 1.5 kΩ. Since the current remains constant over that range, we get \$ R = \frac{dV}{dI} = \frac{dV}{0} = \infty \$.
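The constant-current region above can be sketched numerically (the 20 mA setting and 30 V compliance limit are the figures from the example; the sample load values are arbitrary):

```python
# A bench PSU in constant-current mode: 20 mA setting, 30 V compliance limit.
i_set, v_max = 0.020, 30.0
r_max = v_max / i_set  # largest load that can still receive the full 20 mA
print(f"CC region holds for loads up to {r_max:.0f} ohm")

# Within that region the voltage adjusts itself; the current never changes.
for r_load in (100.0, 1000.0, 1500.0):
    v = i_set * r_load  # Ohm's law: V follows the load, I is pinned
    print(f"R = {r_load:6.0f} ohm -> V = {v:4.1f} V, I = 20 mA")
```

Above 1.5 kΩ the supply would hit its 30 V ceiling and drop out of constant-current operation.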
> And since I am able to change the other variable, why are we able to swap between them?

Again, it's not clear what you are asking here.
> Can anyone please give a few examples showing these "ideal" forms, why they are unrealistic ...

They're not unrealistic over a certain range of operating conditions. I'm still not clear what you're switching between. I've given an example of each above.
For a fixed resistance you can supply a load with either constant voltage or constant current. If the resistance can change, then you need to choose one or the other, depending on what you want to happen.
In most applications you can't swap a CV supply for a CC one, or vice versa. Most circuitry is designed to run on a constant voltage: the national grid (even though it's AC) is designed on this basis, and cars, buses, airplanes, phones, computers and most "electronics" are designed to work at a particular voltage. The most common exception is LED lighting, where constant-current supplies are used because of the shape of the LED's I-V curve.
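On the question of "swapping" sources on paper: the standard textbook sense in which a voltage source and a current source are interchangeable is the Thévenin-Norton transformation. A real voltage source (\$V_s\$ in series with \$R_s\$) and the current source \$I_n = V_s / R_s\$ in parallel with the same \$R_s\$ are indistinguishable to any load. A minimal sketch, with assumed values (5 V behind 0.5 Ω):

```python
# Thevenin form: 5 V source in series with 0.5 ohm (assumed example values).
v_s, r_s = 5.0, 0.5
# Norton equivalent: I_n = V_s / R_s in parallel with the same resistance.
i_n, r_n = v_s / r_s, r_s

for r_load in (1.0, 10.0, 1000.0):
    # Thevenin model: simple voltage divider into the load.
    v_thev = v_s * r_load / (r_s + r_load)
    # Norton model: source current through R_n parallel R_load.
    v_nort = i_n * (r_n * r_load / (r_n + r_load))
    assert abs(v_thev - v_nort) < 1e-12  # identical for every load
    print(f"R = {r_load:6.1f} ohm -> V_load = {v_thev:.4f} V (both models)")
```

The equivalence only holds from the load's point of view; internally the two models dissipate different power, which is one reason the "swap" is a circuit-analysis tool rather than something you do with real hardware.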