You're running in half-duplex mode, joining Y to A and Z to B, right? Why do you mention A- and B+ and A- and A+? I'll assume you just mean A and B, and that the termination resistor is across them.
The receiver at each end of the 485 wires requires a 120 Ω termination resistor, so in half-duplex mode you need one at either end, since both ends act as receivers. Do you have one on the FTDI end too?
RS-485 signals 1s and 0s by the voltage difference between the pair, so why do you want 'failsafe' resistors that tie up the bus? In the idle state the voltages on the two lines should be the same, which the 120 Ω resistor also helps achieve; you're working against this by pulling the lines apart and wasting power. You have 360 Ω + 360 Ω + 120 Ω from VOUT to GND permanently.
What should the polarity of the stop bit be? Check the TX pin polarity of the UART and make sure its idle state is 0 during the delay period(s). If it's transmitting a 1, it could be erroring out because the FTDI receiver never sees the stop bit. (See the update below.)
The voltage levels themselves don't matter so much, as long as there is a >200 mV difference between A and B. The FTDI chip should have no problem determining what you're sending with a 3 V difference, just as your receiver did with a ~2 V difference.
Update: since you are actually able to receive the first few bytes correctly before the bit error, this looks more like a timing issue. Try running at a lower baud rate and see if you receive any more data before the error.
First, if you want proper termination on a CAT5 cable, you need to replace both termination resistors with 100 to 120 ohms. That is the nominal impedance of the cable, and you need one termination at each end to prevent reflections. Note that your driver chip has a spec for performance into a 54 ohm load; the fact that 54 is just about 110/2 is not a coincidence. However, a 5 foot cable is very short even for 1 MHz transmission, so I wouldn't worry about it as long as you have a terminator at each end of the physical cable. Your existing 220 ohm resistors at each end of the line should be adequate given your line length and data rate, but I'd go with 120 ohms just to be safe. The internal logic on your slave units may be faster than your analyzer.
Second, assuming your narrowest data pulse is 1 µs (which seems reasonable for 1 Mb/s data), looking closely at your upper screenshot indicates that your analyzer is only sampling at about 3 Ms/s. The data sheet for the analyzer says the maximum sample rate is 12 Ms/s, and 12 Ms/s seems a reasonable number for the trace on the lower screen. I suspect your upper trace was taken with too long a window for the size of the data buffer, so the analyzer had to drop the sample rate. You need to be careful of this behavior: if you had acquired data over a window 4 times longer, the analyzer might well have dropped your sample rate to 750 kHz, and you'd have complete garbage.
Now look at the lower trace in the vicinity of 72 to 74 and 78 to 80 µs. Here you have alternating 1s and 0s, and note that, although the trace looks like sine waves, the mid-amplitude width is as close to constant as makes no never mind. This means that, assuming the other data line is the same but inverted, the recovered data cells will be of equal width, and since 0 and 1 bits have the same cell width the data should be just fine.
In other words, there is no reason to worry about slew rates. The overshoot (what you call ringing) is very small, so that's not your problem either.
EDIT - I suspect that the slow transitions you're seeing are an artifact of your analyzer. I very strongly suspect that its input frequency response is limited to 6 MHz in order to match the maximum sample rate of 12 Ms/s. Nyquist limit, right?
Best Answer
When the DMA has handled the last byte, you need to wait for the UART to be no longer busy (typically, this is indicated by the UCBUSY bit). If your chip does not have an interrupt for that, you need to poll the bit, or use a timer that waits for 10 bit times.