Electronic – UART Synchronization After Break

dmx, dmx512, uart

On the assumption that not all UARTs are created equal, and that the documentation is generally not sufficient to answer this, I have to ask: how can I tell which UARTs (or asynchronous serial receivers in general) properly handle a Break condition and the synchronization afterwards?

A Break condition is an extended period of low (0 Space) input on the RXD line. This generally leads to a Framing Error as the receiver fails to detect a high (1 Marking) state when it expects a Stop Bit.

Some documentation suggests that you can tell the difference between a Break and data corruption due to noise by directly reading the RXD line and checking its state. Of course that is not strictly reliable: by the time you are alerted to the Framing Error, who knows how much time has passed, or whether you are still in the Break condition? But that aside…

If you read descriptions of the logic behind UART design, the receiver first searches for a Start Bit. It detects this as a high-to-low transition (1 -> 0) on the RXD line, at which point it aligns its clocking to sample the stream at mid-bit. That is how it manages to receive asynchronous data.
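As a point of reference, that start-bit hunt is easy to express as a bit-banged receiver. A minimal sketch in C, assuming an 8N1 frame and hypothetical rx_pin()/delay_bits() helpers for reading RXD and waiting out fractions of a bit period:

```c
/* Hypothetical bit-banged receiver illustrating the usual start-bit logic:
 * wait for a 1->0 edge, re-check at mid-bit to confirm, then sample each
 * data bit in the middle of its cell. rx_pin(), delay_bits() and the 8N1
 * frame format are assumptions for the sketch, not any particular part.   */
#include <stdint.h>
#include <stdbool.h>

extern bool rx_pin(void);            /* read RXD: true = mark (1), false = space (0) */
extern void delay_bits(float bits);  /* busy-wait for a fraction of a bit period     */

int soft_uart_rx_byte(void)          /* returns 0..255, or -1 on framing error */
{
    while (rx_pin()) { }             /* idle mark: wait for the 1->0 transition */
    delay_bits(0.5f);                /* move to the middle of the start bit     */
    if (rx_pin()) return -1;         /* glitch, not a real start bit            */

    uint8_t data = 0;
    for (int i = 0; i < 8; i++) {    /* sample 8 data bits, LSB first, mid-cell */
        delay_bits(1.0f);
        if (rx_pin()) data |= (uint8_t)(1u << i);
    }

    delay_bits(1.0f);                /* where the stop bit should be            */
    if (!rx_pin()) return -1;        /* still low: framing error (or a Break)   */
    return data;
}
```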

One would expect that once a Break condition starts and a Framing Error has been raised, the UART would drop back into the mode where it searches for the high-to-low transition signalling a new Start Bit. In that case you would receive exactly one Framing Error (data 0x00) and then a good byte for whatever follows the Break.

Unfortunately, that is not how some (hopefully not all) UARTs behave. My experience at the moment is that after the Framing Error the receiver just keeps reading data. It must take the next bit as a Start Bit simply because it is low (0 Space), without regard to any transition. You then receive however many Framing Errors correspond to the length of the Break. I realize that this might also be useful.

Even more troublesome, depending on the length of the Break and the timing, the high signal level (1 Marking) that follows the Break can be taken as a valid Stop Bit. The end of the Break condition then generates an extra "good" data byte containing (unpredictably) one of 0xFF, 0xFE, 0xFC, 0xF8, 0xF0, 0xE0, 0xC0, 0x80 or 0x00. Even this cannot be guaranteed, as you might actually just get the first byte of the following data.

And depending on the length of the Marking period after the Break, the UART might still be searching through the bits of the first byte of data for that elusive Stop Bit. You then lose the first byte of good data, or potentially more, until the UART finally decides to synchronize properly.
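That whole messy sequence is easiest to observe if you capture the error flags alongside each byte instead of letting them get lost in the driver. A sketch of an RX interrupt that does just that, using placeholder register and flag names (UART_STATUS, UART_DATA, FE_BIT) rather than any specific part:

```c
/* Sketch only: UART_STATUS, UART_DATA and FE_BIT are placeholders for
 * whatever your part actually provides. The point is to queue the framing
 * error flag alongside each byte, so a Break shows up as a run of (FE, 0x00)
 * events possibly followed by one of the odd 0xF8/0xC0-style bytes.         */
#include <stdint.h>
#include <stdbool.h>

typedef struct {
    uint8_t data;
    bool    framing_error;
} rx_event_t;

#define RX_QUEUE_LEN 64
static rx_event_t        rx_queue[RX_QUEUE_LEN];
static volatile unsigned rx_head, rx_tail;

extern volatile uint8_t UART_STATUS;   /* assumed status register       */
extern volatile uint8_t UART_DATA;     /* assumed receive data register */
#define FE_BIT 0x04                    /* assumed framing-error flag    */

void uart_rx_isr(void)
{
    rx_event_t ev;
    ev.framing_error = (UART_STATUS & FE_BIT) != 0;
    ev.data          = UART_DATA;      /* reading data usually clears the flag */

    unsigned next = (rx_head + 1) % RX_QUEUE_LEN;
    if (next != rx_tail) {             /* drop the event if the queue is full  */
        rx_queue[rx_head] = ev;
        rx_head = next;
    }
}
```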

Why does this matter?

Try to read DMX512 data. This is an age-old protocol that is very much in use today. Every frame of data starts with a Break. The format is 250 kbaud, 8 data bits, 2 stop bits, no parity. The Break prefix is a minimum of 92 microseconds and its overall length is not specified; it is almost certainly not an exact multiple of a data slot, and it does not even have to be the same from frame to frame.

Worse, the Mark-After-Break need only be a minimum of 12 microseconds.

A Start Code follows, which generally should be 0x00: not some unique code, but a value indistinguishable from a Break byte or zero channel data. Following that there can be up to 512 bytes of channel data (0-255). There is no requirement that all 512 channels be transmitted, or that the count be the same from frame to frame. There is no SOH, no header and no error checking.
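For reference, this is roughly what a conventional DMX512 receive state machine looks like when the UART does cooperate, i.e. when a Break reliably shows up as a framing error with data 0x00 and the next clean byte is the Start Code. On the UARTs described above that is exactly what you cannot count on, which is the whole problem. Sketch only; dmx_rx_byte() is assumed to be called once per received byte from the driver:

```c
/* Minimal DMX512 receive state machine, assuming the friendly case where a
 * Break arrives as a framing error and the next clean byte is the Start Code.
 * On parts that emit the extra garbage bytes described in the question, this
 * alone is not enough and a timer-based Break detect is still needed.        */
#include <stdint.h>
#include <stdbool.h>
#include <stddef.h>

#define DMX_SLOTS 512

typedef enum { DMX_WAIT_BREAK, DMX_WAIT_START, DMX_DATA } dmx_state_t;

static dmx_state_t dmx_state = DMX_WAIT_BREAK;
static uint8_t     dmx_buf[DMX_SLOTS];
static size_t      dmx_count;

void dmx_rx_byte(uint8_t data, bool framing_error)
{
    if (framing_error) {
        /* Treat any framing error as (part of) a Break; the repeated framing
         * errors from a long Break simply keep us parked in this state.      */
        dmx_state = DMX_WAIT_START;
        dmx_count = 0;
        return;
    }

    switch (dmx_state) {
    case DMX_WAIT_START:
        /* First clean byte after the Break should be the Start Code (0x00);
         * anything else and we ignore the rest of this frame.                */
        dmx_state = (data == 0x00) ? DMX_DATA : DMX_WAIT_BREAK;
        break;

    case DMX_DATA:
        if (dmx_count < DMX_SLOTS)
            dmx_buf[dmx_count++] = data;   /* channel value for slot dmx_count */
        else
            dmx_state = DMX_WAIT_BREAK;    /* over-length frame, resynchronize */
        break;

    default:                               /* DMX_WAIT_BREAK: discard bytes    */
        break;
    }
}
```

Because there is no length field, the arrival of the next Break is also the natural point to hand a completed frame over to the application.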

In other words, a UART cannot receive this data.

I know that you can handle this with a dedicated device like a PIC, where you can read the RXD line directly, measure pulse widths, and enable/disable serial reception to synchronize manually. It would just be a lot easier with a UART that handled the re-synchronization properly. A higher-level system cannot read this data.
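As a rough idea of what that dedicated-device approach looks like, here is a sketch that times the low pulse on RXD with edge interrupts and a free-running microsecond counter, flagging a Break when the low period is too long to be ordinary data (at 250 kbaud a whole character is only 44 µs). micros(), rxd_level() and the edge-interrupt hookup are assumptions for illustration, not a specific PIC peripheral:

```c
/* Sketch of the measure-the-low-pulse approach: timestamp the falling and
 * rising edges on RXD and flag a Break when the low period is longer than any
 * legal character (44 us at 250 kbaud) but below the 92 us minimum Break.
 * micros() and the rxd_edge_isr() wiring are assumptions for illustration.   */
#include <stdint.h>
#include <stdbool.h>

#define BREAK_MIN_US 88u            /* well above 44 us of data, below a real Break */

extern uint32_t micros(void);       /* assumed free-running microsecond counter */
extern bool     rxd_level(void);    /* assumed RXD read: true = mark (1)        */

static volatile uint32_t fall_time_us;
static volatile bool     have_fall;
static volatile bool     break_seen;

/* Assumed to be called on every edge of RXD (both polarities). */
void rxd_edge_isr(void)
{
    uint32_t now = micros();
    if (!rxd_level()) {                                      /* falling edge   */
        fall_time_us = now;
        have_fall    = true;
    } else if (have_fall && (now - fall_time_us) >= BREAK_MIN_US) {
        break_seen = true;          /* low long enough: that was a Break, and
                                       the Mark-After-Break has just started   */
    }
}
```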

Does anyone know of a receiver that handles the synchronization better?

It just surprises me that after 50+ years, with the highly sophisticated processors we have today, no one has bothered to refine this silicon block that they keep tacking onto the chip. Not to mention the stupid legacy digital-watch RTC block they all use.

Best Answer

Most UARTs handle this just fine, but even if you have some shonky crap that doesn't, just tie the data line from the RS485 receiver to a counter/timer trigger input and set the timeout to 90 µs or so: break detected. Then spin in a tight loop until you see the 0->1 transition and re-enable the UART during the MAB.
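Something like the following captures the shape of that, using a polled microsecond counter in place of a hardware counter/timer trigger; micros(), rxd_level(), uart_rx_enable() and uart_rx_disable() are placeholder names:

```c
/* Rough shape of the suggestion above, polled rather than using a hardware
 * counter/timer trigger: wait for RXD to sit low for ~90 us (Break detected),
 * catch the 0->1 transition into the MAB, then re-enable the UART receiver.
 * All the extern helpers are placeholder names for this sketch.              */
#include <stdint.h>
#include <stdbool.h>

extern uint32_t micros(void);
extern bool     rxd_level(void);          /* true = mark (1), false = space (0) */
extern void     uart_rx_enable(void);
extern void     uart_rx_disable(void);

void dmx_wait_for_break_and_resync(void)
{
    uart_rx_disable();                     /* ignore the garbage during the Break */

    for (;;) {
        while (rxd_level()) { }            /* wait for RXD to go low              */

        uint32_t t0 = micros();
        while (!rxd_level()) { }           /* time how long it stays low          */

        if ((micros() - t0) >= 90u) {      /* ~90 us low: that was a Break and we
                                              are now in the Mark-After-Break     */
            uart_rx_enable();              /* receiver starts clean on the MAB    */
            return;
        }
        /* shorter low period: just ordinary data bits, keep looking */
    }
}
```

Since the MAB can be as short as the 12 µs mentioned in the question, the re-enable has to happen promptly after the rising edge.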

Seriously, DMX is EASY, for all that writing a transmitter that works with all the really shonky receivers out there can be a bit of a swine. (Hint: you cannot run flat out, and be very conservative with the minimum frame length; the good stuff does fine at full rate and minimum timings, the third-tier junk, not so much.)