I'm a complete beginner trying to learn some electronics basics, starting with resistors. I connected a resistor in series with a 12 V power supply and measured the voltage with a multimeter, but the reading stays the same as the supply voltage. No voltage drop shows up with or without the resistor. What could be the reason?
Electrical – Voltage across resistor stays the same as the supply voltage
resistors
Related Solutions
Thermistors are resistors only in a limited sense: within a given temperature range they behave approximately linearly, but over a wider span their resistance changes strongly with temperature. The packaging of thermistors varies quite a lot, from glass beads to plastic packs. As far as I know there is no common color coding. I checked Omega's and Honeywell's sites, and there is no mention of a color coding there either. I would think the "18p" is a manufacturing code and not an electrical spec.
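To see why a thermistor is only "linear-ish" over a narrow range, here is a sketch using the common beta-parameter model for an NTC thermistor. The R25 = 10 kΩ and B = 3950 values are assumed typical figures for illustration, not taken from any part in the question:

```python
import math

def ntc_resistance(temp_c, r25=10_000.0, beta=3950.0):
    """Beta-parameter model for an NTC thermistor.

    R(T) = R25 * exp(B * (1/T - 1/T25)), temperatures in kelvin.
    r25 and beta are assumed typical values, not from a datasheet.
    """
    t = temp_c + 273.15
    t25 = 25.0 + 273.15
    return r25 * math.exp(beta * (1.0 / t - 1.0 / t25))

# Resistance falls roughly exponentially with temperature, so it is
# only approximately linear over a narrow span around one point.
for temp in (0, 25, 50):
    print(f"{temp:3d} C -> {ntc_resistance(temp):10.0f} ohm")
```

Over 0 to 50 °C this model spans roughly 3.6 kΩ to 34 kΩ, which is why a single linear approximation only holds over a limited range.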
Also, testing with 30 V?! How big are these things?
The humidity sensor in the question operates on an AC signal of up to 1 V RMS. The datasheet also specifically mentions an operating frequency range of 0.5 to 2 kHz.
If the sensor is operated from a DC supply, one electrode (the negative one, if I remember correctly) will deteriorate rapidly due to ions migrating towards one plate in preference to the other, rendering the part inoperative.
Now, regarding a suitable operating mechanism:
The impedance curve of the device spans anywhere from 1 to 10 MΩ at 20% relative humidity, down to between 1 and 5 kΩ at 90% RH. The impedance table in the datasheet specifies values from 1.1 kΩ to 7.2 MΩ, too broad a span for a voltage divider to work.
Calculating the current through the device for 1 V across it, at the worst-case limits of this impedance range:
At 90% RH, for 1 kΩ: I = 1 V / 1 kΩ = 1 mA
At 20% RH, for 10 MΩ: I = 1 V / 10 MΩ = 100 nA
Thus, a very low impedance (100 Ω or less) AC voltage source would be needed to drive this sensor suitably in voltage-driven mode. This shows that a voltage divider would be a very inefficient, and somewhat ineffective, way of driving the sensor.
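The worst-case currents above are plain Ohm's law; a quick numeric check, using the impedance endpoints quoted from the datasheet range:

```python
def current_at(voltage, impedance):
    """I = V / R, plain Ohm's law."""
    return voltage / impedance

v_sensor = 1.0  # 1 V RMS across the sensor

# Worst-case impedance endpoints from the datasheet range.
for rh, z in ((90, 1e3), (20, 10e6)):
    i = current_at(v_sensor, z)
    print(f"{rh}% RH, Z = {z:>10.0f} ohm -> I = {i * 1e6:10.3f} uA")
# -> 1000.000 uA at 90% RH, 0.100 uA at 20% RH
```

The four-decade spread in current is exactly why no single fixed divider resistor can keep the sensor voltage in a usable range across the whole RH span.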
Instead, a more viable approach would be to drive the sensor using a current source, with the DC blocked using a suitably large capacitor.
There are several current source circuits out there, using bipolar junction transistors (BJTs), FETs, or op-amps. Pick one that suits your purpose and budget, gate the current with an input from one pin of your microcontroller toggled at, say, 1 kHz, and read the voltage across the sensor using an ADC pin of the MCU.
Note that this will not give very precise results, as such electrode-based humidity sensors are characterized using a bipolar sine wave. Improvements could include an RC or LC filter to attenuate the higher harmonics of the 1 kHz square wave, leaving an approximation of a 1 kHz sine wave.
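To illustrate the filtering idea, here is a sketch of how a first-order RC low-pass attenuates the odd harmonics of a 1 kHz square wave. The 1.6 kΩ / 0.1 µF values are assumptions chosen only to put the cutoff near 1 kHz, not component values from the answer:

```python
import math

def rc_gain(freq_hz, r_ohm=1.6e3, c_farad=0.1e-6):
    """Magnitude of a first-order RC low-pass: |H| = 1 / sqrt(1 + (f/fc)^2)."""
    fc = 1.0 / (2.0 * math.pi * r_ohm * c_farad)  # cutoff ~995 Hz here
    return 1.0 / math.sqrt(1.0 + (freq_hz / fc) ** 2)

# A square wave contains only odd harmonics. The filter passes most of
# the 1 kHz fundamental while knocking down the 3rd and 5th harmonics,
# leaving something closer to a sine wave.
for harmonic in (1, 3, 5):
    f = harmonic * 1000.0
    print(f"{f:6.0f} Hz: gain = {rc_gain(f):.3f}")
```

A single RC stage only rolls off at 20 dB/decade, so for a cleaner sine an LC filter or a second RC stage would do better, at the cost of more attenuation of the fundamental.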
Actually designing such a stiff, near-sine-wave AC current source is left as an exercise for others less preoccupied than me.
Best Answer
This is the circuit you described in the question (this site has a built-in schematic editor; it's a lot better than trying to describe a circuit using words):
simulate this circuit – Schematic created using CircuitLab
A resistor on the power supply's negative terminal, and then a meter from the resistor to the power supply's positive terminal, so the voltmeter and the resistor are in series. This isn't the normal situation: normally a voltmeter is placed in parallel with the voltage to be measured.
As a first approximation the voltmeter has infinite resistance, so your total series resistance is therefore infinity + R1 = infinity.
Ohm's law states that V = I * R. For the circuit as a whole: V = 12 V and R = infinity, so I = 12 / infinity = 0.
For the resistor, V = I * R with I = 0 (current must be the same at all points in a series circuit), which means V = 0.
So the voltage drop across the resistor is 0.
If we have 12 V total with 0 V across the resistor, the voltage across the meter (the number it will display) will be 12 - 0 = 12 V.
In reality the meter has a finite resistance, typically in the MΩ range, but for most values of R1 that is close enough to be treated as infinite. If you changed R1 to be comparable to the meter's resistance, something in the 5-10 MΩ range, then the voltage your meter measures would start to drop noticeably.
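The reasoning above can be made concrete with a simple series-circuit model: treat the meter as a large resistance in series with R1 and compute what it displays. The 10 MΩ meter resistance is an assumed typical DMM input impedance, not a measured value:

```python
def meter_reading(supply_v, r1_ohm, meter_ohm=10e6):
    """Series loop: supply -> R1 -> meter. The meter displays the
    voltage across its own (finite) input resistance."""
    i = supply_v / (r1_ohm + meter_ohm)  # Ohm's law for the whole loop
    return i * meter_ohm                 # voltage across the meter

# For ordinary resistor values the meter still shows almost the full
# 12 V; only when R1 approaches the meter's own resistance does the
# reading start to drop.
for r1 in (1e3, 100e3, 10e6):
    print(f"R1 = {r1:>10.0f} ohm -> meter shows {meter_reading(12.0, r1):6.3f} V")
```

With R1 = 1 kΩ the reading is about 11.999 V, which is why the display looks identical "with or without the resistor"; with R1 = 10 MΩ (equal to the meter), it drops to exactly half the supply, 6 V.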