Signal Processing – Why Self-Synchronization is Necessary


I am new to electrical engineering. Just a question on lack of synchronization problem. Below is a picture from my textbook:

[textbook figure]

I am very confused. Time is the same for everyone in the world, so if both sender and receiver agree that, for example, a duration of 0.001 s represents one bit, then we won't have any synchronization problem any more, will we?

Best Answer

if both sender and receiver agree that, for example, a duration of 0.001 s represents one bit, then we won't have any synchronization problem any more, will we?

This would work in theory; however, it requires both the sender and the receiver to have infinitely accurate clocks that never drift relative to each other.

Real-world clocks always have some inaccuracy and drift. Quartz oscillators are pretty good, especially considering how cheap they are, but they are not perfect. There is no perfectly accurate clock with zero drift.

Say your sender and receiver both use 1 MHz ±50 ppm clocks. In the worst case, one clock runs at 1,000,050 Hz and the other at 999,950 Hz, so the two can disagree by up to 100 ppm. That error accumulates: after only a few thousand bits the receiver is sampling in the middle of the wrong bit.
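To put a number on that, here is a small Python sketch estimating how long it takes a 100 ppm mismatch to shift the sampling point by half a bit period. The 1 Mbit/s rate (one bit per clock cycle) is an assumption for illustration:

```python
# How quickly does a 100 ppm clock mismatch break bit alignment?
bit_rate = 1_000_000                       # bits per second (assumed: one bit per 1 MHz clock cycle)
ppm_per_clock = 50                         # tolerance of each oscillator
relative_error = 2 * ppm_per_clock * 1e-6  # worst case: one fast, one slow -> 100 ppm

bit_period = 1 / bit_rate                  # 1 microsecond per bit
# Time until the accumulated timing error reaches half a bit period,
# i.e. the receiver starts sampling the neighbouring bit.
time_to_slip = (bit_period / 2) / relative_error

print(f"Worst-case mismatch: {relative_error * 1e6:.0f} ppm")
print(f"Half-bit slip after {time_to_slip * 1e3:.0f} ms "
      f"({int(time_to_slip * bit_rate)} bits)")
```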

The only practical way to have two synchronized clocks is to actually synchronize them by slaving one clock to the other.

Also, time is not "the same for everyone in the world" as you assume. For example, general relativity predicts that gravity influences how fast a clock ticks, so a clock's frequency also depends on how far it is from the Earth (i.e., its altitude)...

If the sender and receiver are communicating via radio and they are moving relative to each other, a Doppler shift occurs and the transmission delay changes. For example, if a cellphone transmits at 2 GHz from inside a car moving at 100 km/h away from the cell base station, the frequency seen by the receiver is Doppler-shifted down by about 185 Hz. The transmission path length also changes over time, which changes the propagation delay. The receiver must account for all of this (among many other factors).
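A quick sanity check of that figure, using the numbers from the example above:

```python
# Doppler shift of a 2 GHz carrier transmitted from a car receding at 100 km/h.
c = 299_792_458.0   # speed of light, m/s
f_carrier = 2e9     # carrier frequency, Hz
v = 100 / 3.6       # 100 km/h in m/s

doppler_shift = f_carrier * v / c   # non-relativistic approximation for a receding source
print(f"Doppler shift: {doppler_shift:.0f} Hz")   # ~185 Hz lower at the receiver
```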

Even if you had two perfect clocks, propagation delay would still have to be accounted for, say when the user replaces a 1 meter HDMI cable with a 2 meter HDMI cable. That extra meter adds about 4.8 ns of delay (assuming the signal travels at 70% of the speed of light in the cable), which corresponds to roughly 16 bit periods (per lane) at 3.4 Gbps.
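The same back-of-the-envelope estimate in code, with the 70% velocity factor assumed above:

```python
# Extra delay from one additional meter of cable, expressed in bit periods per lane.
c = 299_792_458.0       # speed of light in vacuum, m/s
velocity_factor = 0.7   # assumed signal speed in the cable, relative to c
extra_length = 1.0      # metres added by swapping a 1 m cable for a 2 m cable
bit_rate = 3.4e9        # bits per second per lane

extra_delay = extra_length / (velocity_factor * c)   # seconds
bits_in_flight = extra_delay * bit_rate

print(f"Extra delay: {extra_delay * 1e9:.1f} ns")              # ~4.8 ns
print(f"Extra bit periods per lane: {bits_in_flight:.1f}")     # ~16 bits
```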

That's why a clock is usually transmitted along with the data (either on its own wires or embedded in the signal itself), so that the receiver can synchronize its local clock to the sender's.
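As a concrete illustration of "embedded in the signal", here is a minimal sketch of Manchester encoding, one common self-clocking line code (the function names are just for illustration, not any particular library):

```python
# Manchester encoding: every bit is sent as a transition (IEEE 802.3 convention:
# 1 -> low-to-high, 0 -> high-to-low), so the receiver sees at least one edge per
# bit period and can keep re-aligning its sampling clock. Two samples per bit.

def manchester_encode(bits):
    """Return two line-level samples per input bit."""
    samples = []
    for b in bits:
        samples += [0, 1] if b else [1, 0]   # 1 -> rising edge, 0 -> falling edge
    return samples

def manchester_decode(samples):
    """Recover bits from the direction of the mid-bit transition."""
    return [1 if samples[i] < samples[i + 1] else 0
            for i in range(0, len(samples), 2)]

data = [1, 0, 1, 1, 0, 0, 1]
line = manchester_encode(data)
assert manchester_decode(line) == data
```

Because every bit is guaranteed a mid-bit transition, the receiver can lock onto those edges instead of trusting its free-running local clock.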
