Electronics – voltage source vs. current source

current-source, voltage

I am just not sure why, in development labs, current sources are not as popular as voltage sources.

I have worked in different places, and they have all used voltage sources.

I just want to find out why, in practice, voltage sources are so much more popular than current sources, and whether there are applications where current sources are actually used instead.
What are the main advantages and disadvantages of using one vs. the other?

Best Answer

There are no inherent advantages or disadvantages. Some devices require constant voltage; some devices require constant current.

The majority of DUTs** require constant voltage. That's why most of the bench-top power supplies you have encountered in labs are constant voltage. Keep in mind that a decent lab power supply has an adjustable maximum current threshold and two modes: constant voltage (CV) and constant current (CC). CC mode is activated automatically when the DUT tries to pull more than the maximum current. A DUT wanting to pull more current than expected is often a sign of a problem.
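The CV/CC crossover described above can be sketched as a tiny model. This is an illustrative simplification for an ideal supply driving a purely resistive load; the function name and values are hypothetical, not a real instrument API:

```python
def supply_output(v_set, i_max, r_load):
    """Model an ideal bench supply driving a resistive load.

    Returns (mode, voltage, current). In CV mode the supply holds v_set;
    if the load would draw more than i_max, the supply instead drops its
    output voltage and holds the current at i_max (CC mode).
    """
    i_cv = v_set / r_load          # current the load would draw at v_set
    if i_cv <= i_max:
        return ("CV", v_set, i_cv)
    return ("CC", i_max * r_load, i_max)

# A 10 ohm load at 5 V draws 0.5 A, within a 1 A limit, so CV:
print(supply_output(5.0, 1.0, 10.0))   # ('CV', 5.0, 0.5)
# A 2 ohm load would draw 2.5 A, so the supply folds back to CC:
print(supply_output(5.0, 1.0, 2.0))    # ('CC', 2.0, 1.0)
```

Real supplies also have output capacitance, slew limits, and transient behavior at the CV/CC transition, which this sketch ignores.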

A few DUTs require constant current: LEDs, for example. Often, a constant-voltage supply powers a separate constant-current circuit, which in turn powers the constant-current DUT.
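The simplest version of "a constant-voltage supply feeding a constant-current load" is an LED with a series resistor: the resistor drops the difference between the rail and the LED's forward voltage, which approximately fixes the current. A minimal sketch, with hypothetical example values (5 V rail, a red LED with Vf around 2 V, 20 mA target):

```python
def led_series_resistor(v_supply, v_forward, i_led):
    """Size the series resistor that sets an LED's current from a CV rail.

    The resistor drops (v_supply - v_forward), so by Ohm's law the LED
    current is roughly fixed at i_led even though the rail is constant
    voltage. This is only approximate: Vf varies with temperature and
    from part to part, which is why dedicated constant-current drivers
    are used when the current must be held tightly.
    """
    return (v_supply - v_forward) / i_led

# 20 mA through a red LED (Vf ~ 2 V) from a 5 V rail:
print(led_series_resistor(5.0, 2.0, 0.020))  # 150.0 (ohms)
```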

** DUT - device under test