Electrical – Voltage signal and Current Signal

Tags: current, electrical, electricity, voltage, voltage measurement

Please give an intuitive explanation of the difference between carrying a signal (or data) as a voltage and carrying it as a current.

I'm wondering: even though current and voltage inevitably co-exist, why do we name a signal after only one of them (current or voltage)? If there is a signal out there somewhere, it is both a current and a voltage at the same time, yet we name only one of its properties.

We see this in circuits such as heart-rate sensors and other small-signal sources, where the signal has to be in the form of a voltage in order to transfer it intact to an amplifier (which has a high input impedance). Even if it is a voltage signal, current still flows into the amplifier. Why is it said that voltage is transferred to the input of the amplifier and not current (when the voltage causes current to flow)?

Best Answer

Usually it has to do with what the author thinks is the best way to sense the signal that's on that pair of wires. If they say "voltage signal" then they (probably) feel that it should be applied to a high-impedance amplifier that responds to voltage; if they say "current signal" then they feel that it should be applied to a low-impedance amplifier that responds to current.
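To make that concrete, here's a rough sketch (not part of the original answer, with illustrative component values I've assumed) of why the source impedance decides which quantity survives the hand-off: a low-impedance source into a high-impedance input preserves the voltage, while a high-impedance source into a low-impedance input preserves the current.

```python
# Sketch: which quantity is preserved depends on source vs. input impedance.
# All resistor values below are illustrative assumptions.

def sensed_voltage(v_source, r_source, r_input):
    """Voltage that actually appears at the amplifier input (voltage divider)."""
    return v_source * r_input / (r_source + r_input)

def sensed_current(i_source, r_source, r_input):
    """Current that actually flows into the amplifier input (current divider,
    for a Norton source with parallel source resistance r_source)."""
    return i_source * r_source / (r_source + r_input)

# Voltage-type source (low source impedance) into a high-impedance input:
# nearly all of the source voltage appears at the input.
print(sensed_voltage(v_source=1.0, r_source=100, r_input=1e6))    # ~0.9999 V

# Current-type source (high source impedance) into a low-impedance input:
# nearly all of the source current flows into the input.
print(sensed_current(i_source=0.010, r_source=1e6, r_input=100))  # ~0.0100 A
```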

An example of a current signal is a 4-20 mA current loop. A device that uses this scheme expects to be fed from a supply whose voltage may vary over a wide range, and it imposes a current on the loop that is proportional to the quantity it's measuring.
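A typical way to read such a loop is to measure the voltage across a sense resistor and scale linearly from the 4-20 mA span. Here's a small sketch of that conversion; the 250 Ω resistor and the 0-100 measurement range are my own assumed values, not something stated above.

```python
# Sketch: convert a 4-20 mA loop reading into an engineering value.
# Sense resistor and measurement span are assumptions for illustration.

SENSE_RESISTOR_OHMS = 250.0        # common choice: 4-20 mA -> 1-5 V
SPAN_LOW, SPAN_HIGH = 0.0, 100.0   # hypothetical sensor range, e.g. 0-100 degC

def loop_current_from_voltage(v_sense):
    """Loop current implied by the voltage measured across the sense resistor."""
    return v_sense / SENSE_RESISTOR_OHMS

def engineering_value(i_loop):
    """Map 4 mA -> SPAN_LOW and 20 mA -> SPAN_HIGH, linearly in between."""
    return SPAN_LOW + (i_loop - 0.004) / (0.020 - 0.004) * (SPAN_HIGH - SPAN_LOW)

print(engineering_value(loop_current_from_voltage(3.0)))  # 3 V -> 12 mA -> 50.0
```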

An example of a voltage signal would be the output of a typical op-amp. The op-amp imposes a voltage which remains largely unchanged regardless of variations in the load current, at least up to the point where the amplifier can no longer force the output to the correct value.
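As a toy illustration of that last point (the current-limit figure is an assumption, not a property of any particular op-amp), a voltage-type output holds its voltage as the load changes until the required current exceeds what the output stage can deliver:

```python
# Sketch: an idealized voltage output with a current limit.
# V_SET and I_LIMIT are assumed values for illustration.

V_SET = 2.0        # desired output voltage, volts
I_LIMIT = 0.020    # assumed 20 mA output-current limit

def output_voltage(r_load):
    """Hold V_SET while the required load current stays within the limit;
    once current-limited, the output sags to I_LIMIT * r_load."""
    if V_SET / r_load <= I_LIMIT:
        return V_SET
    return I_LIMIT * r_load

for r in (10_000, 1_000, 200, 50):
    print(r, output_voltage(r))   # voltage stays at 2.0 V until r = 50 ohms
```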