Electrical – Voltage across resistor stays the same as the supply voltage

resistors

I'm a complete beginner trying to learn some electronics basics, and I'm experimenting with resistors. I connected a resistor in series with a 12 V power supply and measured the voltage with a multimeter, but the reading stays the same as the supply voltage. No voltage drop shows up with or without the resistor. What could be the reason?

Best Answer

This is the circuit you described in the question (this site has a built-in schematic editor; it's a lot better than trying to describe a circuit in words):

[Schematic: the 12 V supply, R1, and the voltmeter connected in a single series loop – created with CircuitLab]

The resistor is connected to the power supply's negative terminal, and the meter goes from the other end of the resistor to the power supply's positive terminal, so the voltmeter and the resistor are in series. This isn't the normal arrangement: a voltmeter is normally placed in parallel with the component whose voltage you want to measure.

As a first approximation the voltmeter has infinite resistance, so your total series resistance is therefore infinity + R1 = infinity.

Ohm's law states that V = I * R. For the circuit as a whole, V = 12 V and R = infinity, so I = 12 / infinity = 0.

For the resistor, V = I * R with I = 0 (the current must be the same at every point in a series circuit), which means that V = 0.

So the voltage drop across the resistor is 0.

If we have 12 V in total and 0 V across the resistor, the voltage across the meter (the number it displays) will be 12 - 0 = 12 V.
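Putting those steps together in one line (treating the meter as ideal, i.e. infinite resistance):

$$R_\text{total} = R_1 + R_\text{meter} \rightarrow \infty, \quad I = \frac{12\,\text{V}}{R_\text{total}} \rightarrow 0, \quad V_{R_1} = I \cdot R_1 \rightarrow 0, \quad V_\text{meter} = 12\,\text{V} - V_{R_1} = 12\,\text{V}$$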

In reality the meter has a finite input resistance, but it will be in the megohm range (often around 10 MΩ), which for most values of R1 is close enough to be treated as infinite. If you changed R1 to something comparable to the meter's resistance, say in the 5–10 MΩ range, then the voltage your meter measures would start to drop.
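To see how close to "infinite" a real meter looks, here is a minimal sketch of the loop treated as a voltage divider between R1 and the meter's input resistance. The 10 MΩ figure is an assumed typical DMM input resistance, not a value from the question:

```python
# Sketch of the real situation: the 12 V supply divides between R1 and
# the meter's finite input resistance, since the two are in series.
# The 10 Mohm meter resistance is a typical assumed value, not measured.

V_SUPPLY = 12.0   # supply voltage in volts
R_METER = 10e6    # assumed meter input resistance in ohms

def meter_reading(r1: float) -> float:
    """Voltage displayed by the meter: the fraction of the supply that
    drops across the meter's input resistance in the series loop."""
    return V_SUPPLY * R_METER / (r1 + R_METER)

for r1 in (1e3, 100e3, 1e6, 10e6):
    print(f"R1 = {r1:>12,.0f} ohm -> meter reads {meter_reading(r1):.2f} V")
```

With R1 = 1 kΩ the reading is indistinguishable from 12 V, but once R1 equals the meter's 10 MΩ the reading halves to 6 V, which is why the drop only becomes visible with very large resistors.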
