The typical frequency drift of an astable 555 circuit over its lifetime


I am involved in qualifying a sine wave generator that uses a first-order LPF and a 555 timer in astable configuration to generate a sine-wave(-ish) output. I would have designed it differently, but the board is already made, so I have to work with the design as best I can. The designer placed the corner frequency of the first-order LPF at 100 kHz, and the oscillator runs at 230 kHz. A variable-gain stage (not an op-amp, but similar, with 7 MΩ input resistance) calibrated with a variable resistor sets the final gain, so the output amplitude from the filter doesn't matter except that it must remain constant. The output frequency isn't critical, but the peak output voltage is.
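For reference, the astable frequency comes from the standard 555 relation f = 1.44 / ((Ra + 2·Rb)·C), which is why the R and C tempcos discussed below feed straight into frequency drift. The component values here are hypothetical, chosen only to land near the 230 kHz this board runs at:

```python
def astable_freq(ra_ohm, rb_ohm, c_farad):
    """Standard 555 astable frequency: f = 1.44 / ((Ra + 2*Rb) * C)."""
    return 1.44 / ((ra_ohm + 2 * rb_ohm) * c_farad)

# Hypothetical values -- the actual board's R and C are unknown.
f = astable_freq(1e3, 2.6e3, 1e-9)
print(f"{f / 1e3:.0f} kHz")  # ~232 kHz with these example values
```

Since f ∝ 1/(R·C), a given ppm change in R or C shifts the frequency by roughly the same ppm.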

Because the oscillator runs in a region where the filter's gain rolls off at −20 dB/decade, any frequency drift changes the output amplitude. I was hoping someone could tell me how much 555 timers drift in frequency from use to use, to help me assess how much trouble this design is in. All components are surface mount. I know capacitors and resistors change with temperature; the device should always stay within a 15 °F range while in use. I am also unfamiliar with how much surface-mount capacitor values change over long periods of time.
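To quantify that sensitivity: the magnitude of a first-order low-pass is 1/√(1 + (f/fc)²), and with fc = 100 kHz and f = 230 kHz the oscillator sits well into the roll-off. A quick sketch of how a small frequency shift maps to an amplitude shift:

```python
import math

fc = 100e3  # filter corner frequency, Hz (given)
f0 = 230e3  # oscillator frequency, Hz (given)

def lpf_gain(f):
    """Magnitude response of a first-order low-pass: 1/sqrt(1 + (f/fc)^2)."""
    return 1.0 / math.sqrt(1.0 + (f / fc) ** 2)

g0 = lpf_gain(f0)
g1 = lpf_gain(f0 * 1.01)  # gain after a 1% upward frequency shift
print(f"gain at 230 kHz: {g0:.4f}")
print(f"amplitude change for a +1% frequency shift: {(g1 / g0 - 1) * 100:.2f}%")
```

So at this operating point roughly 84% of any relative frequency change shows up as a relative amplitude change.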

Best Answer

The 555 has a temperature sensitivity of roughly 50 ppm/°C. A delta of 15 °F is about 8 °C, which gives you about 400 ppm.

So your 230 kHz clock could be off by about 92 Hz over temperature due to the 555's tempco alone.

If you use capacitors with NP0 dielectric you will most likely get very good temperature and aging stability, but that limits you to capacitances below ~500 pF. Vanilla resistors, though, can have a tempco of −500 ppm/°C, which would shift your frequency by ~920 Hz.
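The arithmetic behind both numbers, since f scales essentially 1:1 in ppm with the IC's tempco and with the timing resistor's value:

```python
f0 = 230e3  # oscillator frequency, Hz
dT = 8      # the ~15 degF operating window expressed in degC

ic_tempco = 50e-6  # ~50 ppm/degC for the 555 itself
r_tempco = 500e-6  # worst-case "vanilla" resistor, 500 ppm/degC

df_ic = f0 * ic_tempco * dT  # drift from the 555 alone
df_r = f0 * r_tempco * dT    # drift from the timing resistor
print(f"555 tempco:     {df_ic:.0f} Hz")  # 92 Hz
print(f"resistor tempco: {df_r:.0f} Hz")  # 920 Hz
```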

Aging of the chip itself is likely to have a much weaker effect.

So if a 1 kHz frequency error is OK for you, you are good to go (with NP0 capacitors!).
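Since what you actually care about is peak voltage, it's worth pushing that ~1 kHz through the filter's roll-off to see the amplitude impact, using the same first-order response as before:

```python
import math

fc = 100e3  # filter corner frequency, Hz
f0 = 230e3  # nominal oscillator frequency, Hz
df = 1e3    # worst-case frequency drift estimated above, Hz

def lpf_gain(f):
    """First-order low-pass magnitude: 1/sqrt(1 + (f/fc)^2)."""
    return 1.0 / math.sqrt(1.0 + (f / fc) ** 2)

amp_err = (lpf_gain(f0 + df) / lpf_gain(f0) - 1) * 100
print(f"amplitude error for +1 kHz drift: {amp_err:.2f}%")  # about -0.36%
```

A ~0.4% amplitude shift may well be acceptable after the gain-stage calibration, but that is your number to judge.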

P.S. Check the tempco of your LPF too ;-)