Signal Theory – Do Signals Need to Be Maintained Until They Reach Their Destination?

propagation signal signal-theory

If I send a signal through a wire with a delay of 10ms, do I need to maintain the signal at the source until it reaches the destination, or can I change it after, say, 2ms and send the next signal? The first signal would then reach the destination at 10ms and the second at 12ms.
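As a mental model (my own sketch, not part of the question), an ideal transmission line behaves as a pure time shift: each edge launched by the source arrives one line delay later, independently of what the source does afterwards, so several edges can be "in flight" at once.

```python
# Model (assumption): an ideal, lossless line is a pure delay, so the
# source does not need to hold its level for the full propagation time.

LINE_DELAY_MS = 10  # one-way propagation delay of the wire

# (launch_time_ms, level) events driven at the source end
source_events = [(0, 1), (2, 0)]  # drive at t=0, change it at t=2ms

# Each edge arrives exactly LINE_DELAY_MS later, independently
arrivals = [(t + LINE_DELAY_MS, level) for t, level in source_events]
print(arrivals)  # [(10, 1), (12, 0)] -- 1st at 10ms, 2nd at 12ms
```

Between t = 10ms and t = 12ms the far end still sees the first level even though the source changed at 2ms: that 10ms of signal history is stored in the line itself.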

Best Answer

Years ago I designed a video system that sent a composite video signal down a coax cable from the video controller to the display CRT. This system used monochrome video with a 20MHz dot clock to display text and graphics on the CRT.

Note that at 20MHz each pixel of video was 50ns wide. At the time the typical font used on the CRT was 5 pixels wide, with a 6th pixel of spacing to the next character, so the total time for one character on the screen was 6 * 50ns = 300ns.
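The timing arithmetic above can be written out directly (a trivial check, using only the numbers stated in the answer):

```python
# Pixel and character timing at a 20MHz dot clock
dot_clock_hz = 20e6
pixel_ns = 1e9 / dot_clock_hz   # 50 ns per pixel
char_ns = (5 + 1) * pixel_ns    # 5-pixel glyph + 1 spacing pixel = 300 ns
print(pixel_ns, char_ns)        # 50.0 300.0
```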

During system testing we wanted to check the integrity of the video signal over a length of coax cable at least as long as any typical installation of the system. So one day we traveled to a large surplus electronics outlet and purchased a large wooden spool holding about 450 feet of coax cable. (It was rather entertaining to the other building occupants when we rolled the spool down the halls to our suite in the office building.)

When we tested the cable system we connected two channels of our DSO (digital storage oscilloscope), one channel displaying the video driver feeding into the coax cable and the other displaying the signal coming out of the coax cable at the receiving CRT. The scope was triggered off the trailing edge of the vertical blanking signal so that we could capture the video signal for the left edge of the very first scan line. The video was set up to display two "H" characters as "HH" on the first row of characters.

The video signal for the top row of pixels of the displayed characters could be depicted as follows with X representing active video and _ representing off video:

X___X X___X

When displayed on the scope waveforms looked like the following:

(Oscilloscope waveform capture; picture recreated in Visio.)

Channel 1 (upper trace) was the cable input and Channel 2 was the cable output. The signal transit time down the cable was approximately 600ns, or roughly 1.33ns per foot.
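As a back-of-envelope check of those measured numbers (the answer gives only 600ns over about 450 feet; the speed-of-light comparison and velocity-factor estimate are my additions):

```python
# Delay per foot from the measured transit time
transit_ns = 600.0
length_ft = 450.0
ns_per_ft = transit_ns / length_ft              # ~1.33 ns per foot

# Light in vacuum covers one foot in ~1.017 ns (0.3048 m / c)
vacuum_ns_per_ft = 0.3048 / 299_792_458 * 1e9
velocity_factor = vacuum_ns_per_ft / ns_per_ft  # ~0.76

print(round(ns_per_ft, 2), round(velocity_factor, 2))  # 1.33 0.76
```

A velocity factor around 0.76 is in the normal range for foam-dielectric coax, which suggests the measurement was sensible.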

So the short answer to the question is: no, it is not necessary to maintain the input signal to a transmission line until the current edge arrives at the far end.

It is interesting to note that analogue oscilloscopes often have a coil of coax inside that is used to delay the signal from the input section to the display section. The intent is that the delay gives the trigger circuitry time to start the sweep at the left edge of the CRT, so the very first part of the signal, including the triggering edge itself, can be displayed at the left side of the screen.