Thermistor temperature dispersion

power-resistors

I am designing a thermistor circuit to detect whether a nozzle is plugged or flowing freely. My idea is to use a low-resistance thermistor and a resistor to form a voltage divider that heats up when powered, and to immerse the thermistor in the water stream. With the nozzle unplugged, the flowing water would keep the thermistor cooled to a certain temperature; as the nozzle became plugged, the flow past the thermistor would decrease and allow its temperature to rise.

I am looking into designing the circuit and buying the parts, but I have absolutely no heat-transfer experience and don't know how sensitive the circuit needs to be. To keep the resistance low enough that the circuit actually self-heats, I am planning to use a 33 ohm bias resistor and a 33 ohm thermistor.

So my question is: how do I calculate the voltage I would need to feed into the voltage divider to heat the thermistor to, say, 100 °C? And how do I calculate how much heat would be dissipated if the thermistor at 100 °C were immersed in a 3/8" diameter PVC pipe with a flow of 0.5 gpm?

Best Answer

It is going to be difficult to calculate the voltage/wattage needed for this. I suppose it could be done, but why not do it by the "seat of your pants"?

My unscientific opinion is that you would need on the order of 100+ watts to raise the temperature to 100 °C at even a minuscule water flow, and you probably won't find a thermistor rated anywhere close to that.
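A quick back-of-envelope check supports that estimate. At full flow the heater can at best warm the whole stream, so the power needed to hold even 1 °C of bulk temperature rise sets the scale (this is a rough sketch; the flow rate is taken from the question):

```python
# Power needed to raise the bulk temperature of the water stream by delta_t.
# P = m_dot * c * delta_t, where m_dot is the mass flow rate.

GPM_TO_KG_PER_S = 3.785 / 60   # 1 US gal = 3.785 L; water is ~1 kg/L
C_WATER = 4186.0               # specific heat of water, J/(kg*K)

flow_kg_s = 0.5 * GPM_TO_KG_PER_S   # 0.5 gpm, per the question
delta_t = 1.0                       # just 1 degree C of bulk rise

power_w = flow_kg_s * C_WATER * delta_t
print(f"{power_w:.0f} W per degree C of bulk temperature rise")
```

That comes out to roughly 130 W per degree, so holding the sensor tens of degrees above the water at full flow is clearly out of reach for a small thermistor.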

I propose adjusting your method as follows. Use two thermistors: one set up with very low self-heating as a reference (placed upstream), and one similar to what you propose with 33 ohms, which at 10 V would dissipate about 0.75 W, and that should be enough self-heating to do what you want. Be sure to work out all the temperature ranges (including the possibility of accidentally running in open air) so you don't exceed the watt rating of your chosen thermistor.
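The self-heating numbers above can be sketched like this (a rough model only; it treats the thermistor as a plain resistance at one operating point):

```python
# Power dissipated in the thermistor of a 33-ohm + 33-ohm divider at 10 V.

V_SUPPLY = 10.0   # supply voltage from the answer
R_FIXED = 33.0    # fixed bias resistor, ohms
R_THERM = 33.0    # thermistor resistance at the operating point, ohms

def thermistor_power(r_therm):
    """Power in the thermistor for a given thermistor resistance."""
    i = V_SUPPLY / (R_FIXED + r_therm)   # series current
    return i * i * r_therm               # I^2 * R in the thermistor

p_nominal = thermistor_power(R_THERM)
print(f"thermistor dissipation: {p_nominal:.2f} W")
```

A useful side note on the worst case: the power in the thermistor peaks when its resistance equals the fixed resistor (maximum power transfer), so for this divider the ~0.75 W at 33 ohms is also the most the thermistor itself can ever dissipate, even if an NTC drops in resistance while running in open air.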

You may be better off using a PTC thermistor for this application, since it can self-limit its current.