I have a 37 AWG nichrome wire (0.6 mm diameter) with 5 Ω of resistance (about 3 m of wire) and a 12 V DC supply with a 1 A maximum output.
I only measured 1–3 V across the nichrome wire. The wire does warm up, but the voltage across it is too low, and the multimeter didn't read any current through the wire for some reason. I also tried a shorter piece of nichrome wire, and the voltage dropped to 0 V, probably because the resistance was too low.
I want the wire to reach 100 °C. The datasheet (RD 100 / 0,6) says that needs 2.21 A; according to Ohm's law that's 24 W, and with only 3 V across the wire that would need 8 A (I = P/V).
How can I increase the voltage across the wire so that less current is required? Why is the voltage so much lower than the supply voltage? I obviously need a supply with a higher current rating, but with an actual 12 V across the wire I would only need 2 A, and supplies with lower current limits tend to be cheaper.
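For reference, the arithmetic in the question can be checked numerically (2.21 A and 5 Ω are the figures given above, with P = I²R and I = P/V):

```python
# Check the power/current arithmetic from the question.
R = 5.0          # ohm, measured wire resistance
I_target = 2.21  # A, datasheet current for 100 C

P = I_target**2 * R   # power dissipated at the target current
I_at_3V = P / 3.0     # current needed for that power at only 3 V

print(f"P = {P:.1f} W, I at 3 V = {I_at_3V:.1f} A")  # ~24.4 W, ~8.1 A
```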
Best Answer
Correct. And since the resistance is 1.73 Ω/m you need 3.8 V/m of wire.
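As a quick check of that per-metre figure (2.21 A and 1.73 Ω/m are the values quoted above):

```python
# Volts needed per metre of wire, from V = I * R.
I_target = 2.21   # A, datasheet current for 100 C
R_per_m = 1.73    # ohm per metre of 0.6 mm nichrome

V_per_m = I_target * R_per_m
print(f"{V_per_m:.2f} V/m")  # ~3.82 V/m
```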
That's Joule's law, P = VI.
No, R is fixed, so you can't just set the voltage to any value like that. At 3 V you have 1/4 of the voltage you'd have at 12 V, which gives 1/4 of the current and therefore 1/16 of the power.
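Since P = V²/R with R fixed, the quadratic drop falls straight out of the numbers (a quick sketch using the 5 Ω figure from the question):

```python
# Power into a fixed resistance scales with the square of the voltage.
R = 5.0  # ohm, total wire resistance

P_12V = 12.0**2 / R   # 28.8 W at the full supply voltage
P_3V = 3.0**2 / R     # 1.8 W at the 3 V actually measured

print(P_3V / P_12V)   # 0.0625, i.e. 1/16
```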
The way to do that is to draw out the wire so that it has 1/4 the cross-sectional area (1/2 the diameter). The resistance will then be 6.9 Ω/m.
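Resistance per metre goes as 1/A, i.e. as the inverse square of the diameter, so halving the diameter quadruples it (a sketch using the figures above):

```python
def scaled_r_per_m(r_per_m, d_old, d_new):
    """Per-metre resistance scales inversely with cross-sectional area,
    i.e. with the square of the diameter ratio."""
    return r_per_m * (d_old / d_new) ** 2

# 1.73 ohm/m at 0.6 mm diameter, redrawn to 0.3 mm:
print(scaled_r_per_m(1.73, 0.6, 0.3))  # ~6.92 ohm/m
```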
Because you have overloaded it. The 5 Ω wire tries to draw 12 V / 5 Ω = 2.4 A, which is more than the supply's 1 A limit, so the supply's output voltage collapses.
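A minimal model of what an overloaded, current-limited supply does, assuming it simply drops into constant-current mode (real supplies may fold back even harder, which would explain readings below 5 V):

```python
def supply_output(v_set, i_limit, r_load):
    """Idealised bench supply: constant voltage until the load would draw
    more than i_limit, after which it regulates the current instead."""
    if v_set / r_load <= i_limit:
        return v_set, v_set / r_load      # constant-voltage mode
    return i_limit * r_load, i_limit      # constant-current mode

# 12 V / 1 A supply into the 5 ohm wire: it demands 2.4 A, so the
# supply limits and only 1 A * 5 ohm = 5 V appears across the wire.
v, i = supply_output(12.0, 1.0, 5.0)
print(v, i)  # 5.0 1.0
```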
They're cheaper because they are less powerful, of course.