How often are bits sent over Cat 5 Ethernet cables?

ethernet

Sorry if this is a bit of a dumb question, but I read somewhere that Cat 5 is 4 twisted pairs, so that means at most it's transmitting 4 bits at a time. What determines how often those 4 bits are measured in order to build the packet structure?

I know how the Ethernet frame is laid out, but I'm really confused about what defines how often each bit is measured. I assume there is a standard for each type of wiring, but I haven't been able to find it.

If the 4 bits are 1011 for 2 seconds, you get vastly different data depending on how many times they're measured in those 2 seconds. So that must be defined somewhere for each type of wiring, I assume.
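To show what I mean, here's a quick toy sketch (the numbers are made up, just to illustrate the ambiguity):

```python
# Toy numbers: the same held wire state, sampled at different rates,
# decodes to completely different amounts of data.

def sample(wire_state, duration_s, rate_hz):
    # Pretend the 4 pair states are read 'rate_hz' times per second.
    return [wire_state] * int(duration_s * rate_hz)

held = 0b1011                          # 4 pair states, held constant
print(len(sample(held, 2, 10)) * 4)    # 10 Hz  -> 80 bits in 2 s
print(len(sample(held, 2, 1000)) * 4)  # 1 kHz  -> 8000 bits in 2 s
```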

Best Answer

If the 4 bits are 1011 for 2 seconds, you get vastly different data depending on how many times they're measured in those 2 seconds

This is indeed a [potential] problem in communication in general (ignoring the fact that you've got quite a few details about Ethernet wrong). There are numerous schemes to avoid this issue, such as encoding a clock signal in the data, e.g. Manchester code, which is used in some (generally older) versions of Ethernet (10 Mbit). Fast Ethernet (100BASE-TX) uses 4B5B encoding together with a three-voltage scheme called MLT-3, which serves the dual purpose of combining the data with a clock and reducing the signal's spectrum.
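To make the clock-embedding idea concrete, here's a minimal sketch of Manchester coding (my own toy code, using what I believe is the IEEE 802.3 convention for transition direction; a real PHY does this in analog hardware):

```python
# Each bit becomes two half-bit levels with a guaranteed mid-bit
# transition, so the receiver can recover the clock from the data itself.

def manchester_encode(bits):
    out = []
    for b in bits:
        # 802.3 convention: 1 -> low-then-high, 0 -> high-then-low
        out += [0, 1] if b else [1, 0]
    return out

def manchester_decode(halves):
    # Every half-bit pair must contain a transition; its direction
    # carries the data. A missing transition indicates a line error.
    bits = []
    for lo, hi in zip(halves[::2], halves[1::2]):
        assert lo != hi, "no mid-bit transition: clock/line error"
        bits.append(1 if (lo, hi) == (0, 1) else 0)
    return bits

data = [1, 0, 1, 1]
line = manchester_encode(data)           # [0,1, 1,0, 0,1, 0,1]
assert manchester_decode(line) == data   # round-trips cleanly
```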

Gigabit Ethernet over twisted pairs (more precisely 1000BASE-T) uses five different voltage levels, so it's a "quinary" code at the wire level. You'll want to look up 4D-PAM5 encoding for the details... which are pretty hairy. Essentially, every 8 bits of input data (a 256-symbol space) are converted to a point in a 625-symbol space (5^4: all four pairs carry one of five levels simultaneously). Only 512 of the latter are actually used, though, and they are selected in a complicated manner (only specific sequences are allowed) designed to minimize the likelihood that over-the-wire errors will confuse the receiver.
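Just to make the symbol-space arithmetic concrete, here's a back-of-the-envelope sketch; the to_pam5 mapping is purely illustrative (a plain base-5 expansion), not the standard's actual codeword selection:

```python
# Counting argument for 4D-PAM5: four wire pairs, five levels each.

levels = 5     # PAM5 voltage levels per pair: -2, -1, 0, +1, +2
pairs = 4      # all four pairs carry a level simultaneously

symbol_space = levels ** pairs   # 5^4 = 625 possible 4D symbols
data_space = 2 ** 8              # 256 values for one input byte
print(symbol_space, data_space)  # 625 256 -> plenty of spare symbols

# Illustrative only: expanding a byte into a base-5 4-tuple shows the
# dimensions fit, but the standard picks codewords far more carefully.
def to_pam5(byte):
    digits = []
    for _ in range(pairs):
        byte, d = divmod(byte, levels)
        digits.append(d - 2)     # map digits 0..4 onto levels -2..+2
    return tuple(digits)

print(to_pam5(0xA7))             # (0, 1, -1, -1)
```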

Here's a nice summary image for the popular Ethernet types over copper, taken from a book on the subject:

[image: summary of the line encodings used by the popular copper Ethernet types]

The other stuff for GigE in that figure is a pseudo-random scrambler (used to spread the spectrum and reduce the DC component) and a forward error correction block, more precisely a trellis encoder (a convolutional encoder); a Viterbi decoder at the receiving end corrects errors.
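To give a feel for the scrambler part, here's a minimal additive (side-stream) scrambler sketch. The polynomial x^33 + x^13 + 1 is, as far as I recall, the one 1000BASE-T's master uses, but treat this code as illustrative rather than as the PHY's actual implementation:

```python
# XOR the data with a long pseudo-random bit sequence from an LFSR so
# the line signal has no strong spectral lines or DC bias.

class LFSRScrambler:
    def __init__(self, seed=1):
        self.state = seed            # 33-bit state, must be nonzero

    def next_bit(self):
        # Feedback taps for x^33 + x^13 + 1 (illustrative choice)
        fb = ((self.state >> 32) ^ (self.state >> 12)) & 1
        self.state = ((self.state << 1) | fb) & ((1 << 33) - 1)
        return fb

    def scramble(self, bits):
        # XOR is its own inverse: the same LFSR stream descrambles.
        return [b ^ self.next_bit() for b in bits]

tx, rx = LFSRScrambler(seed=99), LFSRScrambler(seed=99)
data = [1, 1, 1, 1, 0, 0, 0, 0]      # long runs: bad spectrum as-is
line = tx.scramble(data)             # pseudo-randomized for the wire
assert rx.scramble(line) == data     # receiver recovers the data
```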

Another interesting bit (in relation to your highlighted question) about Gigabit Ethernet is that whenever two such devices are connected, one is chosen to be the master and the other the slave. The slave's clock is synchronized to the master's via a continuous stream of symbols. A side effect of this design is that Gigabit Ethernet sends stuff on the wire all the time, even when it has no data to send... in which case it just sends IDLE symbols, purely to keep the slave's clock synchronized. Fast Ethernet actually did/does the same thing in this respect (IDLE symbols), and it's easier to show on a graph:

[image: Fast Ethernet line activity, showing IDLE symbols filling the gaps between frames]
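As a toy model of that always-transmitting behavior (the symbol names here are made up, not from the standard):

```python
# The PHY emits a symbol every clock tick no matter what, falling
# back to IDLE when the MAC has nothing queued.

from collections import deque

IDLE = "IDLE"

def phy_transmit(frame_queue, ticks):
    """Emit one symbol per tick: frame data if available, else IDLE."""
    line = []
    for _ in range(ticks):
        if frame_queue:
            line.append(frame_queue.popleft())
        else:
            line.append(IDLE)   # keeps the link partner's clock locked
    return line

q = deque(["D1", "D2", "D3"])   # three data symbols pending
print(phy_transmit(q, 6))       # ['D1', 'D2', 'D3', 'IDLE', 'IDLE', 'IDLE']
```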