So today I was working on a DIY project that involves heating up a nichrome wire, and for some reason the wire is not heating up. I am using 38 gauge wire; I calculated that I would only need 0.33 A to heat it, using 10 inches of nichrome and a 12 V source, and I have read 0.4 A in my circuit. My circuit consists of that same nichrome wire, measured at 31 ohms, and a 330 ohm resistor in series; using a current divider I calculated that 0.36 A should be going through the nichrome, but instead the resistor heats up and the nichrome stays cold. Any idea where I messed up? I can provide a photo of my circuit if needed.
Electronic – nichrome wire not heating up
current, heat, power supply, resistors
Related Solutions
Two subjects are covered here. First, reality. Second, the questions posed.
First, reality: about the most you can get from a U.S. 120 volt outlet is 1500 watts (generally speaking). Your heater element will draw more current than that while cold (because of its lower resistance) until the nichrome heats up. During this transition you depend on the circuit breaker's characteristics not to trip. Then, counting on your forced-air cooling (fan), the nichrome element should stabilize (temperature-wise) at 9.6 ohms. That is as good as it gets for a constant applied voltage.
Second, your question: "How do I figure out the cold resistance?" You need the manufacturer to supply you with the expected operating temperature at which the resistance is 9.6 ohms (or rather, at what temperature the nichrome will dissipate 1500 watts). From there, you can use nichrome's temperature coefficient to calculate the resistance at room temperature.
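That back-calculation can be sketched as follows. The temperature coefficient and operating temperature below are illustrative assumptions, not values from the question; check the actual alloy's datasheet:

```python
# Sketch: estimating cold resistance from hot resistance using
# R_hot = R_cold * (1 + alpha * (T_hot - T_cold)).
# ALPHA and T_HOT are assumed illustrative values, not datasheet facts.
ALPHA = 0.0004   # assumed nichrome temperature coefficient, per deg C
T_HOT = 400.0    # assumed operating temperature, deg C
T_COLD = 20.0    # room temperature, deg C

r_hot = 120.0**2 / 1500.0   # 9.6 ohms at 1500 W on 120 V (R = V^2 / P)
r_cold = r_hot / (1 + ALPHA * (T_HOT - T_COLD))
print(f"hot: {r_hot:.1f} ohms, estimated cold: {r_cold:.2f} ohms")
```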
There are two related but not totally interdependent issues:
Maximum acceptable current able to be carried by a given wire gauge.
Maximum acceptable voltage drop between source and load.
Wire maximum current capability is set by regulations and is based on temperature rise which is a function of energy dissipation per length which is a function of resistance per length which is a function of wire diameter. Other factors which affect allowed regulatory values include sheathing type, application, environment (open air, metal conduit, ...).
Maximum acceptable voltage drop for the load is based on the sum total drop of the circuit components feeding it. For example, a 10 A load may be fed by a 10 A rated conductor until the length of the conductor is such that the maximum allowed voltage drop is reached. If you wish to use a longer conductor run you will need a higher-rated conductor, say 15 A or 20 A, not because of the conductor's current rating per se, but because the heavier conductor allows less total voltage drop.
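As a rough sketch of that trade-off, here is the maximum run length for a 10 A load under an assumed 3 % drop budget. The per-metre resistances are approximate AWG values, and the 3 % figure is an assumption for illustration, not a code citation:

```python
# Sketch: maximum one-way run length before a 10 A load exceeds an
# assumed 3 % voltage-drop budget on a 120 V circuit.
V_SUPPLY = 120.0          # volts
I_LOAD = 10.0             # amps
MAX_DROP_FRACTION = 0.03  # assumed budget; consult your local code

r_per_m = {"14 AWG": 0.00828, "12 AWG": 0.00521, "10 AWG": 0.00328}  # ohms/m
max_drop = V_SUPPLY * MAX_DROP_FRACTION
for gauge, r in r_per_m.items():
    # the current flows out and back, so loop resistance is 2x the run length
    max_len = max_drop / (I_LOAD * 2 * r)
    print(f"{gauge}: max one-way run ~ {max_len:.0f} m")
```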
In your case, as long as total voltage drop on the 10 gauge circuit when fully loaded is below the maximum allowed by regulations, then use of "a few inches" of lighter wire would be acceptable. The voltage drop per length in the lighter conductor will be higher but the additional absolute voltage drop will be minimal.
How light the short length of conductor can be is a matter of regulations and common sense. A few inches of 18 gauge may not burn out even if it is not rated for the current carried, as heat transfer to the device terminals and adjacent thicker conductors may allow lower temperatures than would occur with a longer run. HOWEVER if you have a fire for any reason and investigators determine that you have used an under-rated short link of wire anywhere this may affect insurance payout even if the non-compliant wiring was not the cause.
If the relay is equipped with fixed leads then they are almost certainly rated for the maximum current that the relay can carry. While it is possible to get out of spec equipment, all 'reputable' manufacturers will be well aware of requirements and will meet them. It may well be that the short lengths involved are acceptable for the reasons I mentioned above.
I don't know whether your brewery is a home or a commercial venture, and I do not know what your local regulations allow with respect to wiring of mains-powered equipment. It's outside the scope of the question, but you need to be sure that such issues do not affect your insurance coverage - or your chances of needing it.
ADDED
Using Wikipedia link supplied by alexan_e
Calculated values below are rounded but "=" is used.
For say 1 metre loop-length tail, R = 21 mOhm.
At say 25 A, the loss is I^2R = 625 x 0.021 = 13 Watts.
That's non-trivial.
Voltage drop = IR = 25 * .021 = 0.5V.
10 gauge = 3.3 mOhm/m, or about 2 W dissipation and 0.1 V voltage drop.
So the extra voltage drop in going from 10 gauge to 18 gauge is not very important, but the dissipation is significant. 13 Watts in a 1 metre loop = 13 Watts in a 500 mm run of 2 conductors.
That's 2.6 W in a 100 mm = 4 inch tail.
Say 3 Watts is not liable to melt insulation, but it will feel warm.
Worst is that the 18 gauge dissipation is > 25% of its conservative fusing value. That's high. Actual dissipation at fusing is about 16x higher, and that fusing value is a very conservative one from Wikipedia's list of alternatives, but I'd aim for a bigger wire diameter for a tail if possible.
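The tail figures can be recomputed directly from the quoted per-metre resistances:

```python
# Dissipation and drop per metre of conductor at 25 A, for the two
# gauges quoted above, plus the loss in a short 100 mm 2-conductor tail.
I = 25.0                      # amps
r_per_m = {"18 AWG": 0.021, "10 AWG": 0.0033}   # ohms per metre of conductor

for label, r in r_per_m.items():
    p_per_m = I**2 * r        # watts dissipated per metre of wire
    v_per_m = I * r           # volts dropped per metre of wire
    tail_wire = 0.2           # a 100 mm tail has 200 mm of conductor (out + back)
    print(f"{label}: {p_per_m:.1f} W/m, {v_per_m:.2f} V/m, "
          f"100 mm tail -> {p_per_m * tail_wire:.1f} W")
```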
Related Topic
- Electronic – Possible to run nichrome at very low voltage
- Electronic – required power supply for nichrome heating element
- Electrical – Current through heating element lower than resistance suggests
- Electrical – Simple wire loop for detecting AC Mains activity
- Electrical – Nichrome wire not heating up in circuit
- Electrical – Copper wire hand heater
- Electrical – calculating the size of a heating resistor
Best Answer
$$V=IR$$
If your Nichrome wire is \$31\Omega\$, and you apply \$12\mathrm{V}\$, then you will have a current of \$387\mathrm{mA}\$ flowing through the wire.
If instead you place a \$330\Omega\$ resistor in series, you now have a resistance of \$330+31=361\Omega\$. So doing the calculation again, you now have only \$33.2\mathrm{mA}\$ flowing through the circuit. The vast majority of the voltage is now dropped over the resistor.
$$P=IV=I^2R$$
You have \$33.2\mathrm{mA}\$ flowing through your circuit, so the Nichrome wire is dissipating \$P=I^2R=33.2^2 \times 31= 34.2\mathrm{mW}\$.
On the other hand, the resistor is dissipating \$P=I^2R=33.2^2 \times 330= 364\mathrm{mW}\$. Basically 10 times as much.
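The answer's arithmetic, as a quick sketch using the values from the question:

```python
# Ohm's law and I^2*R power for the nichrome alone, then with the
# 330-ohm resistor in series.
V = 12.0
R_WIRE, R_SERIES = 31.0, 330.0

i_alone = V / R_WIRE                  # ~0.387 A through the wire alone
i_series = V / (R_WIRE + R_SERIES)    # ~0.0332 A with the resistor in series

p_wire = i_series**2 * R_WIRE         # power in the nichrome
p_res = i_series**2 * R_SERIES        # power in the series resistor
print(f"alone: {i_alone*1000:.0f} mA; series: {i_series*1000:.1f} mA")
print(f"nichrome: {p_wire*1000:.1f} mW, resistor: {p_res*1000:.1f} mW")
```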
So the question really is, why do you have the resistor?
If you instead place the resistor in parallel with the Nichrome wire, it does nothing useful. The Nichrome branch of the circuit will still carry \$387\mathrm{mA}\$. You will also have \$36\mathrm{mA}\$ flowing through the parallel branch of the resistor, but this doesn't change the amount of heat generated in the Nichrome - for all intents and purposes the resistor is a separate circuit sharing the same supply.
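A quick check of the parallel case, assuming an ideal 12 V supply:

```python
# In parallel, each branch sees the full supply voltage independently;
# the resistor just adds its own branch current.
V = 12.0
R_WIRE, R_PAR = 31.0, 330.0

i_wire = V / R_WIRE    # ~387 mA, same as with no resistor at all
i_par = V / R_PAR      # ~36 mA through the resistor branch
i_total = i_wire + i_par
print(f"wire: {i_wire*1000:.0f} mA, resistor: {i_par*1000:.1f} mA, "
      f"total from supply: {i_total*1000:.0f} mA")
```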
It seems that the real issue is that your supply cannot deliver enough power. The supply voltage is dropping under load, meaning that you are only getting \$\approx 160\mathrm{mA}\$ through the wire, less than half of what you need. Noting the square in the power formula, this means the power dissipated by the Nichrome is less than 20% of what it would be if the voltage had not sagged. Try to find a supply that can deliver more power.
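The effect of the sag, using the answer's ≈160 mA estimate:

```python
# Power in the nichrome scales with current squared, so a supply that
# only manages ~160 mA delivers far less than half the intended heat.
R_WIRE = 31.0
i_target = 12.0 / R_WIRE       # ~387 mA if the supply held 12 V
i_actual = 0.160               # amps, the answer's estimate under load

p_target = i_target**2 * R_WIRE
p_actual = i_actual**2 * R_WIRE
print(f"target: {p_target:.2f} W, actual: {p_actual:.2f} W "
      f"({100 * p_actual / p_target:.0f}% of target)")
```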