Electronic – Reduce the impact of Gaussian noise on a DTMF decoder

Tags: fft, MATLAB, noise

I have MATLAB code to decode digits from a given DTMF audio file (WAV format). The input signal has the form x(n) + αv(n), where x(n) is the noise-free signal (i.e. the given DTMF audio file), v(n) is white Gaussian noise, and α is a constant controlling how much noise is added. The block diagram of my code is shown below:

[Block diagram of the decoder]
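For reference, the noisy input can be generated like this (the file name and α value below are placeholders):

```matlab
[x, fs] = audioread('dtmf.wav');   % noise-free DTMF recording (placeholder name)
alpha   = 0.1;                     % noise level (placeholder value)
v       = randn(size(x));          % white Gaussian noise
y       = x + alpha * v;           % noisy input fed to the decoder
```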

My decoder algorithm is working fairly well, but I have been asked to propose ideas to improve it further. How can I reduce the impact of the noise? I was thinking of feeding my input signal into another block (shown in red below) before sending it to the decoder.

[Block diagram with the proposed noise-reduction block shown in red]

But I don't know how to achieve that. Can anyone help with this, please?

(Please note that I don't have any MATLAB toolboxes installed on my machine, so any suggestion should be implementable with native MATLAB commands.)

If the decoding algorithm is needed to answer this question, I have posted a copy of it here.

Best Answer

Part of your problem is the FFT.

Since the bins of the FFT won't line up perfectly with the DTMF frequencies, some of the tones won't be detected properly, as their energy will be "smeared" across two or more bins. (You don't mention the sampling rate or the size of your FFT, so I can't tell for sure what bin sizes you will have.)
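You can check the alignment yourself. As a sketch (the sampling rate and FFT size here are assumed values, since you don't state yours):

```matlab
% How do the DTMF tones line up with the FFT bins?
fs   = 8000;  N = 256;                         % assumed values: 31.25 Hz per bin
dtmf = [697 770 852 941 1209 1336 1477 1633];  % the eight DTMF frequencies
k    = dtmf * N / fs                           % fractional bin indices
% k = 22.3 24.6 27.3 30.1 38.7 42.8 47.3 52.3 -- none lands on an integer
% bin, so every tone's energy is split across neighbouring bins.
```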

You can improve the frequency separation by using longer FFT blocks, but then the delay mounts up. Longer blocks will help with the noise, too, but if the blocks are too long, short DTMF tones won't be detected. You can get around that by using overlapping blocks.
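A sketch of 50% overlapping, windowed blocks in base MATLAB (the block size and window choice are illustrative; x is assumed to be a column vector, as returned by audioread):

```matlab
N   = 256;  hop = N / 2;                             % 50% overlap between blocks
w   = 0.54 - 0.46 * cos(2 * pi * (0:N-1)' / (N-1));  % Hamming window, built by hand
for m = 0:floor((length(x) - N) / hop)
    frame = x(m * hop + (1:N)) .* w;                 % windowed block
    X     = abs(fft(frame));                         % spectrum of this block
    % ... pass X (or the frame itself) to the tone detector ...
end
```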

There are a lot of other things you could do, too. Since DTMF decoding is an old and well-researched field, you could start by seeing how it has been done in the past. You should also devise a reliable method for measuring how well your decoder performs at different noise levels. Look up the term "signal-to-noise ratio."
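For instance, since you add the noise yourself, you can compute the SNR of each test case directly and report decoding accuracy as a function of it (a sketch using the x, v and α from your signal model):

```matlab
% Measured SNR of the test signal x + alpha*v, in dB
snr_dB = 10 * log10(sum(x .^ 2) / sum((alpha * v) .^ 2));
```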

Old microprocessor-based DTMF decoders used the Goertzel algorithm, which is a method for calculating the Fourier transform at a single specified frequency. Use a handful of them, one per DTMF frequency, and see what kind of performance you get.
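Here is a minimal sketch of the Goertzel recursion in base MATLAB (no toolboxes needed; the function name is my own):

```matlab
function p = goertzel_power(x, f, fs)
% Squared DFT magnitude of signal x at the bin nearest frequency f (Hz).
N     = length(x);
k     = round(f * N / fs);          % nearest DFT bin index
coeff = 2 * cos(2 * pi * k / N);
s1 = 0;  s2 = 0;
for n = 1:N
    s0 = x(n) + coeff * s1 - s2;    % Goertzel recursion
    s2 = s1;
    s1 = s0;
end
p = s1^2 + s2^2 - coeff * s1 * s2;  % squared magnitude at bin k
end
```

Running one of these per DTMF frequency, e.g. `p = arrayfun(@(f) goertzel_power(frame, f, fs), [697 770 852 941 1209 1336 1477 1633])`, and picking the strongest tone in each of the low and high groups gives you a detector with far fewer operations than a full FFT.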

The older DTMF decoders without microprocessors used a bank of analog band-pass filters, then detected the amplitude at the output of each filter.
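You can mimic that approach in software with simple two-pole resonators, which need only base MATLAB's filter(). A sketch of one band-pass stage (the pole radius r is a tuning parameter I've picked arbitrarily; closer to 1 means a narrower passband):

```matlab
f0 = 697;  r = 0.99;                  % centre frequency (Hz) and pole radius
w0 = 2 * pi * f0 / fs;
b  = 1 - r;                           % rough passband gain normalisation
a  = [1, -2 * r * cos(w0), r^2];      % poles at r * exp(+/- 1j * w0)
y  = filter(b, a, x);                 % narrow-band output around f0
level = sqrt(mean(y .^ 2));           % RMS amplitude at the filter output
```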

At any rate, your first step should be to hit the library (or Google) and see what has been done in the past.