Electronic – How to decode morse code with digital logic


I have been curious as to how one might implement a morse code decoder with basic digital logic (no microcontroller), mostly as an exercise. I could decode individual letters with a state machine, if I had asynchronous inputs such as dot, dash, and pause. My problem is essentially in generating those outputs from a raw source like a telegraph key.

Is there a good circuit that can detect when a signal goes high for a certain amount of time? I would need it to send one signal when the line goes high briefly and then drops low again, and a different signal when it stays high longer (three time units for a dash versus one for a dot). I also need a signal when the line stays low for about three units (a letter break) or about seven units (a word break). It needs to be fairly liberal with its timings, as it is intended to decode manual keying, not computer modulation.
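For reference, the timing rules being asked for can be sketched in software first. The thresholds below are hypothetical midpoints chosen loosely to tolerate hand keying; `UNIT` plays the role the potentiometer would in hardware:

```python
# Sketch: classify key-down/key-up durations into Morse elements.
# Thresholds sit between the nominal 1/3/7-unit element lengths,
# so sloppy hand timing still lands on the right side.
UNIT = 0.1  # seconds per dot; in hardware this is the pot setting

def classify_mark(duration):
    """A key-down interval: dot if short, dash if long."""
    return '.' if duration < 2 * UNIT else '-'

def classify_space(duration):
    """A key-up interval: element gap, letter break, or word break."""
    if duration < 2 * UNIT:
        return ''        # gap between elements within one letter
    elif duration < 5 * UNIT:
        return ' '       # letter break (nominally 3 units)
    else:
        return ' / '     # word break (nominally 7 units)
```

A hardware version would replace each comparison with a retriggerable monostable whose RC period sets the threshold.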

It would be super nice if it used minimal components and if I could use a potentiometer to adjust the time period.

Best Answer

There were some hobbyist experiments (in the early '80s, I believe) with decoding variable-speed digital data, with the intent of distributing code to accompany magazine articles by printing it as bar codes. The reader (person, not machine) could then scan it into their machine with a hand-held scanner. It was assumed impractical (until shown otherwise) due to one's inability to hand-scan at a uniform speed.

The solution ended up being for the decoder program to initially collect enough white-to-black and black-to-white transition times to discover the mean wide-bar and narrow-bar times, assigning '1's and '0's, respectively, to the collected data. It then continued decoding the incoming stream while simultaneously updating the wide-bar and narrow-bar mean times to account for changes in the wand speed over the bars.
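A minimal sketch of that adaptive scheme (the exponential-moving-average update here is my assumption, not necessarily what those programs used):

```python
# Sketch: classify each interval against the midpoint of two running
# means (narrow vs. wide), then nudge the matching mean toward the
# new sample so the decoder tracks changes in keying or wand speed.
def make_adaptive_classifier(narrow0, wide0, alpha=0.25):
    """Return a classifier seeded with initial narrow/wide estimates."""
    means = {'narrow': narrow0, 'wide': wide0}

    def classify(duration):
        threshold = (means['narrow'] + means['wide']) / 2
        kind = 'narrow' if duration < threshold else 'wide'
        # update only the mean we matched; alpha sets adaptation speed
        means[kind] += alpha * (duration - means[kind])
        return kind

    return classify
```

The initial `narrow0`/`wide0` seeds correspond to the "collect enough transitions first" step described above.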

The same technique was applied to decoding hand-sent Morse off the air, with a simple circuit going high and low with the receiver's audio output, fed to a similar algorithm.

Your hardware decoder would need to be similarly adaptive.

As an aside, an interesting issue came up when the digraphs 'T5' or '6E' appeared frequently in the decoded text. Operators naturally develop keying habits on common words, and would key (and understand), for example, the word 'the' as

Dah - dit dit dit dit - dit



which the decoding algorithm, doing its best with the slightly uneven element spacing and a dit/dah string that didn't match any single Morse character, decoded as one of the digraphs above.
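You can confirm the ambiguity with a few table lookups: the elements of 'the', run together with no inter-letter gaps, are the same dit/dah string as 'T5' and '6E'.

```python
# The run-together elements of 'the' are indistinguishable from
# two other letter groupings once the inter-letter gaps are lost.
MORSE = {'T': '-', 'H': '....', 'E': '.', '5': '.....', '6': '-....'}

run_together = MORSE['T'] + MORSE['H'] + MORSE['E']   # '-.....'
assert run_together == MORSE['T'] + MORSE['5']        # reads as 'T5'
assert run_together == MORSE['6'] + MORSE['E']        # reads as '6E'
```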

You might find some of the articles in early issues of Byte Magazine at the library.