Electronic – Transmission lines – justification of “wavelength / 10” (or similar) rule


I have very little background in electronics, but suddenly I need some transmission line theory for a project I am working on. I have heard of various rules of thumb for how long a wire can be before you have to start worrying about its properties as a transmission line. I know these are pretty arbitrary, but the most common I have heard is that you have to start worrying about transmission line properties once the length of your wires is greater than about wavelength / 10, so I'll just go with that for this post.

My question is: what quantity or quantities are kept small by using wires that are short relative to the wavelength, such that you can ignore transmission line theory? I.e., when our wire length is less than wavelength / 10, presumably there's some set of physical quantities that makes the transmission line look very wire-like in our models. What are these?

As an example of the type of thing I'm looking for, here's what my first guess was: At short lengths, the phase difference for a wave would be small enough that the wave's value at the two ends of the wire would be roughly the same. But this doesn't seem to hold: sin(2π/10) − sin(0) ≈ 0.59, so if we use the wavelength / 10 rule of thumb, the instantaneous value of the wave could still differ between the two ends of the wire by more than half of the peak amplitude.
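
(For reference, a tiny Python sketch reproducing the arithmetic above, so the 0.59 figure can be checked directly; it uses nothing beyond the numbers already quoted:)

```python
import math

# Phase shift across a line that is one tenth of a wavelength long.
phase_shift = 2 * math.pi / 10                      # radians
# Difference in the instantaneous value of the wave at the two ends
# (the 0.59 figure quoted above).
difference = math.sin(phase_shift) - math.sin(0.0)

print(f"phase shift: {phase_shift:.3f} rad ({math.degrees(phase_shift):.1f} degrees)")
print(f"instantaneous difference: {difference:.2f} of the peak amplitude")
```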

As of now, I'm guessing that the actual answer is that the impedance of a wire is very near 0 (or whatever tiny resistance the wire has) at short wire lengths – i.e., you can ignore its capacitive and inductive properties. Is this correct? If so, are there formulas relating Z_transmission_line to wire length which show that Z ~ R at lengths of about wavelength / 10? What I'm looking for is something where I can literally plug in length = wavelength / 10 and get a number close to what I would get if I plugged in length = 0 – something that acts as a heuristic for how good a certain length is.

Thanks for any help!

Best Answer

When the propagation delay along the wire exceeds 10% of the wave's period (i.e., the wire is longer than about \$\lambda/10\$) and impedances are not matched throughout, reflections occur which alter the effective input impedance, so a simple Ohm's-Law (lumped-circuit) analysis no longer holds. The exact percentage depends on the system's tolerance for return loss, so this is a ballpark figure.
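
As a rough numerical illustration, here is a minimal Python sketch using the standard lossless-line input-impedance formula \$Z_{in} = Z_0 \frac{Z_L + jZ_0\tan\beta l}{Z_0 + jZ_L\tan\beta l}\$; the 50 Ω line and 75 Ω load are example values, not taken from the question. It shows how far the impedance seen by the source drifts from the load impedance as the line grows toward \$\lambda/10\$ and beyond, which is exactly the "plug in length = \$\lambda/10\$ versus length = 0" comparison asked for above:

```python
import math

def input_impedance(z0, z_load, length_over_lambda):
    """Input impedance of a lossless line terminated in z_load:
    Zin = Z0*(ZL + j*Z0*tan(beta*l)) / (Z0 + j*ZL*tan(beta*l))."""
    beta_l = 2 * math.pi * length_over_lambda   # beta*l = 2*pi*(l/lambda)
    t = math.tan(beta_l)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

# Example (assumed) values: a 50-ohm line feeding a mismatched 75-ohm load.
Z0, ZL = 50.0, 75.0
for frac in (0.0, 0.01, 0.05, 0.10, 0.25):
    zin = input_impedance(Z0, ZL, frac)
    print(f"l = {frac:5.3f} * lambda: Zin = {zin.real:6.1f} {zin.imag:+6.1f}j ohm")
```

At l ≈ 0 the source sees essentially the 75 Ω load; by \$\lambda/10\$ the real part has dropped to roughly 52 Ω and a reactance of roughly −21 Ω has appeared; and at \$\lambda/4\$ the load has been transformed to \$Z_0^2/Z_L\$, which is the inversion described below.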

Remember that if the line length is 25% of \$\lambda\$ (a propagation delay of a quarter period), a mismatched load impedance is inverted as seen from the source (a short becomes an open and an open becomes a short). So Ohm's Law applied to impedance ratios, or in other words simple H(s) transfer functions, will give the wrong answer, and the full transmission-line equations must be used.
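
The same (assumed) lossless-line formula makes the quarter-wave inversion explicit; a near-short and a near-open load stand in for ideal ones to keep the arithmetic finite:

```python
import math

def input_impedance(z0, z_load, length_over_lambda):
    # Same lossless-line formula as in the sketch above.
    beta_l = 2 * math.pi * length_over_lambda
    t = math.tan(beta_l)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

Z0 = 50.0
# A quarter-wave (lambda/4) section inverts the load as seen from the source:
short_at_load = input_impedance(Z0, 1e-6, 0.25)   # near-short at the far end
open_at_load  = input_impedance(Z0, 1e+9, 0.25)   # near-open at the far end
print(f"short at the load looks like |Zin| ~ {abs(short_at_load):.3g} ohm (an open)")
print(f"open at the load looks like  |Zin| ~ {abs(open_at_load):.3g} ohm (a short)")
```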

Consider that all conductors (wires, tracks, leaded parts) have inductance that depends on their length-to-width ratio. So even a perfect conductor rises in impedance with frequency, and once its length approaches 10% of \$\lambda\$ at the applied frequency, its impedance starts to change rapidly because of reflections.
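
To put rough numbers on that, here is a sketch using the classic straight-round-wire approximation \$L \approx \frac{\mu_0 l}{2\pi}\left(\ln\frac{2l}{r} - 0.75\right)\$; the 10 cm length and 1 mm diameter are example values, not taken from the question:

```python
import math

MU0 = 4 * math.pi * 1e-7   # permeability of free space, H/m

def straight_wire_inductance(length_m, radius_m):
    """Approximate self-inductance of a straight round wire in free space
    (classic low-frequency formula, used here as an assumption)."""
    return (MU0 * length_m / (2 * math.pi)) * (math.log(2 * length_m / radius_m) - 0.75)

# Example values (assumed): 10 cm of 1 mm-diameter wire.
L = straight_wire_inductance(0.10, 0.0005)
print(f"L ~ {L * 1e9:.0f} nH")                      # roughly 100 nH, about 1 nH per mm

for f in (1e3, 1e6, 100e6, 1e9):
    x_l = 2 * math.pi * f * L                       # inductive reactance rises with f
    print(f"f = {f:9.0e} Hz: |X_L| ~ {x_l:10.3f} ohm")
```

That works out to roughly 1 nH per millimetre of wire, with the reactance climbing from milliohms at audio frequencies to hundreds of ohms by 1 GHz.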
