Multi-level signal transmission

communication, transmission line

I'm interested in increasing the transmission rate over Cat5 for a personal project. (Cat5/5e/6 will be chosen because of its properties and obtainability, but it will not be used in the standard way.)

In any case, what I'm interested in is transmitting multiple bits per clock ("multi-level" signaling). Digital communication almost always uses a single bit per clock.

For example, using a single bit (0 or V) over Cat5 at 100 MHz allows only a 100 Mbit/s data rate. Using 4 bits per clock allows an effective rate of 400 Mbit/s per twisted pair (or, over Cat6a at 500 MHz, 2 Gbit/s per pair). For my project this is significant.
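
As a sanity check on that arithmetic, a minimal sketch (the symbol rates are assumptions for illustration, not cable measurements):

```python
# Throughput = symbol (clock) rate x bits carried per symbol.

def throughput_bps(symbol_rate_hz: float, bits_per_symbol: int) -> float:
    """Raw data rate for a single pair, ignoring coding overhead."""
    return symbol_rate_hz * bits_per_symbol

print(throughput_bps(100e6, 1) / 1e6, "Mbit/s")  # binary at 100 MHz -> 100.0
print(throughput_bps(100e6, 4) / 1e6, "Mbit/s")  # 16 levels at 100 MHz -> 400.0
print(throughput_bps(500e6, 4) / 1e9, "Gbit/s")  # 16 levels at 500 MHz -> 2.0
```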

Obviously, adding more bits per clock increases complexity and decreases the noise immunity, but I feel it could be done with a few logic gates and will work for my particular application.

Since the signals will be differential, it seems it could be fairly easy on the TX side: rather than driving one line as the mirror image of the other, drive the two lines independently. Noise immunity should still be retained due to the differential nature, as sketched below.
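
A toy model of that claim (all voltages are invented for illustration): the receiver only sees the difference between the lines, so identical pickup on both lines cancels whether or not the lines are mirror images.

```python
import random

# Toy model: the two lines carry independent levels (not mirror images);
# the receiver recovers their difference.
line_a = 2.2   # level driven onto line A
line_b = 0.9   # level driven onto line B, chosen independently of A

for _ in range(3):
    noise = random.uniform(-0.5, 0.5)          # common-mode pickup
    rx_a, rx_b = line_a + noise, line_b + noise
    print(rx_a - rx_b)   # ~1.3 every time: the pickup cancels
```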

Essentially, on each end of the line we will have a 4-bit DAC (for TX) and a 4-bit ADC (for RX) to convert to and from the multi-level representation.
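
A minimal sketch of that mapping (the 3.3 V full scale and uniform level spacing are my assumptions):

```python
FULL_SCALE = 3.3            # assumed full-scale voltage
LEVELS = 16                 # 4 bits -> 16 levels
STEP = FULL_SCALE / (LEVELS - 1)

def dac(nibble: int) -> float:
    """TX side: map a 4-bit value onto one of 16 evenly spaced levels."""
    assert 0 <= nibble < LEVELS
    return nibble * STEP

def adc(volts: float) -> int:
    """RX side: quantize the received voltage back to the nearest level."""
    return min(LEVELS - 1, max(0, round(volts / STEP)))

assert all(adc(dac(n)) == n for n in range(LEVELS))  # round-trips cleanly
```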

My question is: besides the added cost and complexity (which doesn't seem to be all that great?), are there any other reasons why this is a bad idea?

I conclude that there seems to be no loss of noise immunity:

  1. Asymmetric differential signaling is used. External noise will be canceled out in exactly the same manner as when symmetric differential signaling is used (assuming the noise does not depend on the voltage on the wire).

  2. The noise margins can be kept the same by simply amplifying the signal before transmission and attenuating it after. E.g., normally 0-3.3 V is used; keeping the same 3.3 V distance between each of the 16 levels would require roughly 0-49.5 V (15 steps of 3.3 V, worked out in the sketch below). Of course, the cost is a much larger transmission power, since peak power grows with the square of the swing, but a single-bit scheme clocked 4 times faster would pay its own price in bandwidth.
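
Working that scaling out explicitly (idealized: uniformly spaced levels, peak power taken as proportional to voltage squared):

```python
SPACING = 3.3   # keep the binary case's 3.3 V between adjacent levels

def required_swing(levels: int) -> float:
    """Peak voltage needed for `levels` levels at SPACING volts apart."""
    return (levels - 1) * SPACING

for levels in (2, 16):
    v = required_swing(levels)
    print(f"{levels:2d} levels: {v:.1f} V swing, "
          f"~{(v / SPACING) ** 2:.0f}x the binary peak power")
#  2 levels:  3.3 V swing, ~1x the binary peak power
# 16 levels: 49.5 V swing, ~225x the binary peak power
```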

Therefore, as best I can discern, the real cost of achieving around an order of magnitude more throughput is a few dollars in parts. (A monolithic TX and RX IC would reduce the complexity to near zero.)

Is there any real downside besides what I have listed? Basically, what are the practical drawbacks?

Best Answer

There are some good reasons not to do what you imagine.

First, the signal level at your receiver will vary quite a bit depending on the properties and length of your cable (and if you are out of luck, also on the phase of the moon). You will have to calibrate your receiver for the particular cable.
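
A toy illustration of the calibration problem (the attenuation factor is invented): cable loss scales every level by the same unknown factor, so decision thresholds fixed at the transmitter's spacing will misread symbols.

```python
STEP = 3.3 / 15   # transmitted level spacing in volts (assumed, as above)

def rx_volts(nibble: int, cable_gain: float) -> float:
    """Level seen at the receiver after (frequency-flat) cable loss."""
    return nibble * STEP * cable_gain

# The same transmitted symbol arriving over two different cables:
print(rx_volts(9, 1.00))   # short cable: 1.98 V, decoded as symbol 9
print(rx_volts(9, 0.55))   # long cable: ~1.09 V, which a receiver
                           # calibrated for the short cable decodes as 5
```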

Reflections of large signals (0->1111 transition) might swamp small signals (0->0001 transition).

A 4-bit (= 16-level) flash ADC at 100 MHz is not trivial; a flash converter needs 2^4 - 1 = 15 fast, well-matched comparators.

Generally, doubling the baud rate is often much easier than doubling the number of levels that have to be resolved.
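
One way to quantify that: at a fixed total swing, going from 2 to 2^n levels shrinks the level spacing by a factor of 2^n - 1, i.e. roughly 6 dB of noise margin per added bit, whereas doubling the baud rate mainly costs channel bandwidth (idealized, ignoring equalization):

```python
import math

def margin_penalty_db(levels: int) -> float:
    """Shrinkage of level spacing vs. binary, at a fixed total swing."""
    return 20 * math.log10(levels - 1)

for bits in (1, 2, 3, 4):
    levels = 2 ** bits
    print(f"{bits} bit(s)/symbol: {margin_penalty_db(levels):5.1f} dB "
          "less spacing than binary")
# 1 -> 0.0 dB, 2 -> 9.5 dB, 3 -> 16.9 dB, 4 -> 23.5 dB
```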