Electronic – Transmission line, cable impedance

impedance, transmission

I am currently doing a design where I basically have two different types of (digital) signals. The first signal group is fully differential (P/N pair). The other group consists of single-ended signals referenced to ground.

I would like to transfer both of these signal groups over twisted-pair cabling to another board. The cable manufacturer specifies that the twisted-pair cable has a (differential) impedance of 100 Ohms, which is perfectly fine for the differential signals.

For the single-ended signals, let's assume I have an output driver impedance of 50 Ohms and an input impedance of 50 Ohms as well, so what I would want is a cable impedance of 50 Ohms.

How is this issue addressed in the real world? Is this an issue at all? Is the differential impedance of 100 Ohms "the same" as a single-ended impedance of around 50 Ohms?

What I understand is that a typical coaxial cable has an impedance of around 50 Ohms, but what about single-ended signals on a twisted-pair cable?

Best Answer

You can get the effect of a 50\$\Omega\$ line by putting two 100\$\Omega\$ lines in parallel, if you have a sufficient number of pairs to spare.
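As a quick sanity check (assuming two identical pairs paralleled at both ends), the parallel combination works out to \$\frac{100\,\Omega \times 100\,\Omega}{100\,\Omega + 100\,\Omega} = 50\,\Omega\$.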

The reason differential lines tend to be 100\$\Omega\$ is that they expect the user to drive them with an anti-phase pair of 50\$\Omega\$ drivers and receive them with 50\$\Omega\$ receivers. Being anti-phase, the driver impedances are effectively in series, which matches the line, and the same goes for the receivers.
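Seen differentially, the two 50\$\Omega\$ source impedances add up: \$50\,\Omega + 50\,\Omega = 100\,\Omega\$, which matches the 100\$\Omega\$ pair. Loosely speaking, each wire of a symmetrically driven pair then looks like roughly a 50\$\Omega\$ single-ended line, since \$Z_{diff} \approx 2\,Z_{odd}\$ (treating the odd-mode impedance as the per-wire impedance is an approximation, not a datasheet figure).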

Of course this is all moot if your signal edges are slow enough that the impedance of the line doesn't matter. If the signal rise time is several times longer than the one-way propagation delay of the cable (its electrical length expressed as time), then you can ignore proper matching with little problem.
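Here is a minimal sketch of that rule of thumb in Python; the velocity factor and the "several times" safety factor are assumptions you should adjust for your cable, not values from the answer above.

```python
# Rough "do I need to terminate?" check for a cable run.
# Assumptions (illustrative only): propagation at ~0.66 c, typical for
# twisted pair, and "several times" taken as a factor of 6.

C = 3.0e8                # speed of light in m/s
VELOCITY_FACTOR = 0.66   # check the cable datasheet for the real value
SAFETY_FACTOR = 6        # rise time should exceed this many one-way delays

def needs_termination(cable_length_m: float, rise_time_s: float) -> bool:
    """Return True if the edge is fast relative to the line's electrical length."""
    one_way_delay_s = cable_length_m / (C * VELOCITY_FACTOR)
    return rise_time_s < SAFETY_FACTOR * one_way_delay_s

# 2 m of cable is roughly a 10 ns one-way delay at 0.66 c:
print(needs_termination(2.0, 2e-9))    # 2 ns edge   -> True, match the line
print(needs_termination(2.0, 100e-9))  # 100 ns edge -> False, matching is moot
```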