Electronic – Decreasing manufacturing variance in analogue designs

design, manufacturing, tolerance

I'm designing some modular sensors for my company's internal testing. Most are analogue, with a sensor (PT100, current shunt, pressure transducer, etc.), some interfacing (a REF200-based PT100 circuit, for example), and an op-amp (usually an OPA170) to send a 0-10 V signal down the wire to the NI-based measurement equipment.

I had our first batches of sensors made with 1% resistors, and there's more variance than I'd like on some sensors – for example, the PT100 units spread over ±3 °C across our ten-unit prototype run with the same PT100 sensor attached.

Other than switching to 0.1% resistors, is there anything else I can do to reduce variance between units?

Thanks!

Best Answer

Do a sensitivity analysis of each component in your circuit. A DIN Pt100 changes about 0.385% per °C (0.385 Ω/°C on a 100 Ω element), so a 3 °C error represents roughly 1% overall error. The analysis will also inform your selection of components for stability with temperature, moisture, time, and soldering stress. In this particular case, resistor tempco should certainly be considered for the critical parts: 1% tolerance does not necessarily mean a resistor is stable with temperature.
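As a rough sketch of what a sensitivity analysis looks like, the snippet below perturbs each component of a hypothetical non-inverting gain stage (values are illustrative, not the asker's actual circuit) by its tolerance, one at a time, and reports how much the output moves. Components whose 1% shift produces a large output shift are the ones worth upgrading first.

```python
# Hypothetical one-at-a-time sensitivity analysis of a non-inverting
# amplifier, Vout = Vin * (1 + Rf/Rg). Values are illustrative only.

NOMINAL = {"rf": 9000.0, "rg": 1000.0}  # gain of 10

def vout(vin, rf, rg):
    """Ideal non-inverting amplifier output."""
    return vin * (1.0 + rf / rg)

def sensitivity(component, tol=0.01, vin=1.0):
    """Fractional output change when one component shifts by +tol."""
    base = vout(vin, **NOMINAL)
    perturbed = dict(NOMINAL)
    perturbed[component] *= (1.0 + tol)
    return (vout(vin, **perturbed) - base) / base

for name in NOMINAL:
    s = sensitivity(name)
    print(f"{name}: {s * 100:+.2f}% output shift for a +1% part shift")
```

Running this shows the feedback resistor and the gain-set resistor each contribute just under 1% of output error per 1% of tolerance (with opposite signs), which is why a pair of 1% parts can already put you near the ±1% (±3 °C) spread observed.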

Errors can come from resistor tolerances, errors introduced when ideal resistance values are mapped to standard values (these can be reduced by using series or parallel combinations of standard values), amplifier offsets and bias currents, reference-voltage error, and so on.
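The series/parallel trick can be automated. Below is a small, hypothetical brute-force search over the E24 series (a real tool would also try parallel pairs and cover the E96 series) for the series pair that best hits an awkward target resistance:

```python
# Hypothetical helper: brute-force search of E24 standard values,
# spread over a few decades, for the series pair closest to a target.
# Illustrative only; extend to parallel pairs / E96 as needed.

E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def candidates(decades=(100.0, 1000.0, 10000.0)):
    """All E24 values scaled into the given decades."""
    return [v * d for v in E24 for d in decades]

def best_series_pair(target):
    """Return (r1, r2, fractional_error) for the closest series sum."""
    pool = candidates()
    return min(((r1, r2, abs(r1 + r2 - target) / target)
                for r1 in pool for r2 in pool),
               key=lambda t: t[2])

r1, r2, err = best_series_pair(4044.3)  # an awkward, non-standard value
print(f"{r1:.0f} + {r2:.0f} ohms, error {err * 100:.2f}%")
```

The residual mapping error after this step is typically far smaller than the parts' own tolerance, so it stops being the dominant term in the error budget.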

A couple of calibration trimpots (span and zero) can cancel out many different sources of error at once. Better suited to modern designs: if the readings pass through a microcontroller, you can instead apply digital scale and offset values for calibration, and even correct for linearity and ambient-temperature effects.
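The digital scale-and-offset approach amounts to a two-point fit. The sketch below (hypothetical names and numbers, assuming each unit is measured against two known references, e.g. 0 °C and 100 °C baths) derives a per-unit gain and offset and applies them to later readings:

```python
# Hypothetical two-point (zero/span) digital calibration.
# true = gain * raw + offset, fitted from two known reference points.

def fit_two_point(raw_lo, raw_hi, true_lo, true_hi):
    """Derive (gain, offset) from readings taken at two known inputs."""
    gain = (true_hi - true_lo) / (raw_hi - raw_lo)
    offset = true_lo - gain * raw_lo
    return gain, offset

def calibrate(raw, gain, offset):
    """Map a raw reading to the corrected engineering value."""
    return gain * raw + offset

# Example: a unit reads 0.12 V at the 0 degC reference and 10.30 V at 100 degC
gain, offset = fit_two_point(0.12, 10.30, 0.0, 100.0)
reading = calibrate(5.21, gain, offset)  # mid-span raw reading, about 50 degC
```

Storing the per-unit gain/offset pair (in the micro's flash or an EEPROM on the sensor module) removes unit-to-unit spread regardless of which component caused it, which is usually cheaper than tightening every part to 0.1%.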