Electronic – Why does the FTDI RS485 signal look wrong?


I have an in-house board with a microcontroller (Microchip dsPIC33E) and an RS485 driver (Analog Devices ADM2587E) that I'm trying to connect to a desktop PC using an FTDI (USB-RS485-WE-1800-BT) USB to RS485 converter cable. The in-house board receives data packets from the FTDI correctly and responds. However, the PC frequently sees bit errors in the data received by the FTDI, usually within the first five bytes, but sometimes later and sometimes sooner.

The picture below is a typical trace captured from an oscilloscope, showing:

  • The DE input to the ADM2587E in yellow (1)
  • The Data- (B) signal of the FTDI in cyan (2)
  • The Data+ (A) signal of the FTDI in purple (3)
  • The GND signal of both the FTDI and ADM2587E in green (4)

The scope has been set so that the '0 V' levels of signals 2, 3 and 4 are all aligned and marked by the position of signal 4 – the green line.

Trace of RS485 signals (Modbus Request Server ID function and response)

The first burst of data (on the left) is a Modbus Request Server ID packet, sent at 19200 bps with 8 data bits, 1 stop bit and even parity. This is recognised by the in-house board, as can be seen from it asserting the Driver Enable of the ADM2587E and responding with the second burst of data (on the right). The data the FTDI returns over USB includes a bit error, in this instance in the 6th received byte. Note: I cannot be certain which byte contains the bit error, as the USB driver does not retain much timing information, but this is not particularly relevant to my question anyway. Also, the delays either side of the DE transitions have been deliberately inserted in the in-house board software; I originally used the UART RTS simplex mode to reduce them to the width of a single data bit, but the errors still occurred.
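For anyone wanting to reproduce the PC side of this exchange, here is a minimal Python/pyserial sketch using the same 19200 bps, 8E1 settings. The port name and the slave address 0x01 are assumptions, and function code 0x11 is the standard Modbus Report Server ID request (what I've called 'Request Server ID' above):

    import serial

    def crc16_modbus(frame: bytes) -> bytes:
        """Modbus RTU CRC-16 (polynomial 0xA001), returned low byte first."""
        crc = 0xFFFF
        for byte in frame:
            crc ^= byte
            for _ in range(8):
                crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
        return bytes([crc & 0xFF, crc >> 8])

    SLAVE_ADDR = 0x01                      # assumed address of the in-house board
    request = bytes([SLAVE_ADDR, 0x11])    # 0x11 = Report Server ID
    request += crc16_modbus(request)

    # Line settings from the question: 19200 bps, 8 data bits, even parity, 1 stop bit.
    with serial.Serial("/dev/ttyUSB0", 19200,           # port name is an assumption
                       bytesize=serial.EIGHTBITS,
                       parity=serial.PARITY_EVEN,
                       stopbits=serial.STOPBITS_ONE,
                       timeout=0.5) as port:
        port.write(request)
        reply = port.read(256)             # collect whatever the board sends back
        print(reply.hex())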

I suspect that the problem is due to the voltage levels of the received signals being misinterpreted by the FTDI. The in-house board includes a 360 Ohm fail-safe bias resistor between A+ and VOUT and another between A- and GND2 on the ADM2587E, plus a 120 Ohm terminating resistor between the A+ and B- lines, as per the Analog Devices application note for this RS485 driver IC. (I've also tried 1 kOhm and 10 kOhm bias resistors, but with very little difference.) The trace above was taken with a Keterex USB isolator between the PC and the FTDI, although it made no difference.
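As a rough sanity check of what this bias network does when the bus is idle, here is the arithmetic as a short sketch, assuming the ADM2587E isolated supply (VISOOUT) is about 3.3 V:

    # Idle differential voltage set up by the fail-safe bias network.
    V_SUPPLY = 3.3          # V, assumed ADM2587E isolated supply (VISOOUT)
    R_BIAS   = 360.0        # ohms, each fail-safe resistor (pull-up and pull-down)

    def idle_differential(r_termination: float) -> float:
        """Voltage across the termination when no driver is enabled."""
        return V_SUPPLY * r_termination / (R_BIAS + r_termination + R_BIAS)

    print(idle_differential(120.0))   # one 120 Ohm terminator: ~0.47 V
    print(idle_differential(60.0))    # 120 Ohm at each end (120 || 120): ~0.25 V

Either way the idle differential stays above the RS485 receiver threshold of roughly 200 mV, so the bias values themselves looked reasonable to me.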

The output from the in-house board is what I would expect of a half-duplex, two-wire RS485 signal, with the B- and A+ signals balanced around the mean of their idle levels. The output from the FTDI looks odd to me, with the B- and A+ signals both elevated relative to that mean. If the FTDI expects the received data to conform to this 'elevated' signal, that would explain the bit errors, which leads me to my question.

Why does the output of the FTDI appear to be elevated, relative to both the ground and idle mean voltage levels?

Edit

The answer marked as correct below indicates that there is nothing wrong with the FTDI levels, which is the closest thing to a correct answer: it doesn't matter that they are elevated, since they are still within spec.

As an aside, on much closer examination of the scope traces the problem I'm seeing actually appears to be due to the output from the ADM2587E, and nothing to do with the FTDI at all.

Best Answer

You're running in half-duplex mode, joining Y to A and Z to B, right? Why do you mention A+, A- and B-? I'll assume you just mean A and B, and that the termination resistor is across them.

Each receiver at the end of the RS485 wires requires a 120 ohm termination resistor, so in half-duplex mode you need one at both ends, since both ends are receivers. Do you have one on the FTDI end too?

RS485 transmits 1s and 0s by separating the pair by a voltage, so why do you want 'fail-safe' resistors that tie up the bus? In the idle state the voltages of the two lines should be the same, which is what the 120 ohm resistor also helps to do; you're working against this by pulling the two lines apart and wasting power. You have 360 + 360 + 120 ohms from VOUT to GND permanently.
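To put a number on the wasted power (assuming roughly 3.3 V across that string):

    # Standing load of the 360 + 120 + 360 ohm fail-safe/termination string.
    V_SUPPLY = 3.3                      # V, assumed voltage across the string
    R_STRING = 360.0 + 120.0 + 360.0    # ohms

    current_mA = 1e3 * V_SUPPLY / R_STRING
    power_mW   = 1e3 * V_SUPPLY ** 2 / R_STRING
    print(f"{current_mA:.1f} mA, {power_mW:.1f} mW whenever the bus is powered")
    # ~3.9 mA and ~13 mW, and a little more once a second terminator goes on the FTDI end.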

What should the polarity of the stop bit be? Check the polarity of the UART's TX pin and make sure its idle level during the delay period(s) matches the stop bit. If it's driving the opposite level, the FTDI receiver may never see a valid stop bit and could be flagging errors (see the update below).
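To illustrate the mechanism I'm suggesting, here is a toy one-sample-per-bit decoder (not how the FTDI actually samples, just a sketch): if the line idles at the opposite level to the stop bit after a frame, the receiver sees a phantom start bit and reports a framing error.

    def frame(byte):
        """Start bit (0), 8 data bits LSB first, stop bit (1)."""
        return [0] + [(byte >> k) & 1 for k in range(8)] + [1]

    def decode_8n1(bits):
        """Tiny software UART decoder, one sample per bit."""
        out, i = [], 0
        while i < len(bits):
            if bits[i] == 1:              # line high: idle or stop, keep scanning
                i += 1
                continue
            data = bits[i + 1:i + 9]      # bits[i] == 0: start bit found
            stop = bits[i + 9] if i + 9 < len(bits) else 1
            value = sum(b << k for k, b in enumerate(data))
            out.append((value, "ok" if stop == 1 else "FRAMING ERROR"))
            i += 10
        return out

    tx = frame(0x41)
    print(decode_8n1([1, 1] + tx + [1, 1]))    # idles high afterwards: [(65, 'ok')]
    print(decode_8n1([1, 1] + tx + [0] * 10))  # idles low afterwards:
    # [(65, 'ok'), (0, 'FRAMING ERROR')] -- the low idle looks like a new start bit
    # that is never followed by a stop bit.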

The voltage levels themselves don't matter so much, as long as there is a >200 mV difference between A and B. The FTDI chip should have no problem determining what you're sending with a 3 V difference, just as your receiver did with a ~2 V difference.

Update: Since the first few bytes are actually received correctly before the bit error, this looks more like a timing issue. Try running at a lower baud rate and see if you receive any more data before the error.
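If it helps, here is a sketch of that test from the PC side (Python/pyserial), counting how many polls come back with a valid CRC at each rate. The port name and slave address are assumptions, and it assumes your board is reconfigured to match each baud rate:

    import serial

    def crc16(frame: bytes) -> bytes:
        """Modbus RTU CRC-16, low byte first."""
        crc = 0xFFFF
        for b in frame:
            crc ^= b
            for _ in range(8):
                crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
        return bytes([crc & 0xFF, crc >> 8])

    request = bytes([0x01, 0x11])            # assumed slave address, Report Server ID
    request += crc16(request)

    for baud in (19200, 9600, 4800, 2400):   # the board must be set to the same rate
        with serial.Serial("/dev/ttyUSB0", baud,     # port name is an assumption
                           parity=serial.PARITY_EVEN, timeout=0.2) as port:
            good = 0
            for _ in range(20):
                port.reset_input_buffer()
                port.write(request)
                reply = port.read(64)
                # A bit error anywhere in the reply will almost always break the CRC.
                if len(reply) >= 4 and crc16(reply[:-2]) == reply[-2:]:
                    good += 1
            print(f"{baud:6d} bps: {good}/20 replies passed CRC")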