Imagine, for a minute, that electricity travels quite slowly.
When you turn your light switch on, what happens? Current starts to flow, working its way down the wire, and so does the voltage. The current that flows is determined by two things:

- The voltage, and
- An "impression" of what the load resistance might be.
Will the current be too small or too large? It is the cable (and its properties) that dictates the amount of current flowing.
Voltage and current are traveling to the "unknown" load and, because V and I are flowing, power is flowing (P = VI). When the current and voltage reach the bulb, if the bulb's resistance doesn't match the V/I relationship, not all the power is consumed.
This means the excess (or deficit) of power has to be reflected back up the wire to the switch. It's got nowhere else to go.
In the real world of data comms or radio, this causes "reflections", and these can add to or subtract from the forward power traveling down the wire. In the case of data, the signal can become misshapen, leading to possible data corruption. In the case of an RF carrier, there will be points along the wire where it appears unmeasurable.
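As a sketch of how big those reflections get, the standard reflection coefficient \$\Gamma = \dfrac{Z_L - Z_0}{Z_L + Z_0}\$ can be computed directly. The 75-ohm-load-on-50-ohm-line numbers below are illustrative, chosen to match the question quoted further down:

```python
# Reflection coefficient for a mismatched load.
# Gamma = 0 means a perfect match (no power reflected back to the switch).
def reflection_coefficient(z_load, z0):
    """Gamma = (ZL - Z0) / (ZL + Z0)."""
    return (z_load - z0) / (z_load + z0)

gamma = reflection_coefficient(75.0, 50.0)   # Gamma = 0.2
reflected_fraction = gamma ** 2              # ~4% of forward power reflected
print(gamma, reflected_fraction)
```

So a 75-ohm load on a 50-ohm line reflects about 4 % of the forward power back toward the source.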
The cable dictates how much current initially flows based (mainly) on its inductance, capacitance and resistance. The formula is this:
Characteristic impedance = \$\sqrt{\dfrac{R+j\omega L}{G+j\omega C}}\$
R is resistance per metre, L is inductance per metre, C is parallel capacitance per metre and G is parallel conductance per metre. At high frequencies (>1 MHz) the impedance largely reduces to:
\$\sqrt{\dfrac{L}{C}}\$ and if you look at some coax specs you'll see that 50 ohms is the result of this calculation.
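A quick numerical check of both formulas. The per-metre values below are assumptions in the right ballpark for RG-58-style coax (L = 250 nH/m, C = 100 pF/m; R and G are illustrative, not from a datasheet):

```python
import cmath

def z0(R, L, G, C, f):
    """Characteristic impedance sqrt((R + jwL) / (G + jwC)) at frequency f."""
    w = 2 * cmath.pi * f
    return cmath.sqrt((R + 1j * w * L) / (G + 1j * w * C))

# Assumed per-metre line constants (ballpark for RG-58-style coax):
L, C = 250e-9, 100e-12     # 250 nH/m, 100 pF/m
R, G = 0.05, 1e-9          # small series resistance and shunt conductance

print(abs(z0(R, L, G, C, 10e6)))   # at 10 MHz: close to 50 ohms
print((L / C) ** 0.5)              # high-frequency limit sqrt(L/C): ~50 ohms
```

At 10 MHz the jwL and jwC terms dominate R and G, so the full formula already sits on the sqrt(L/C) limit of 50 ohms.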
> why can't I use a 75 ohm in place of a 50 ohm on the output of my transmitter?
Hopefully, by now you should be able to answer this.
If you used RG-58 at about 1 dB loss per meter, you could theoretically run a cable of about 80 to 100 meters, depending on the output power of your transmitter. In reality, I would rather keep it to 50-60 meters and place a 5 dB attenuator at each end to make sure I wasn't saturating the receiver. You can take out the hard attenuation when you're done, but I like to gradually work up to full power to make sure there aren't any kinks.
Also, if you want to hook up multiple devices, just have your clients linked through a power divider (2-way, 4-way, etc.).
Basically (excuse my first-time math syntax):

\$distance = \dfrac{PowerOut - ReceiverMinimum}{dBLossPerMeter}\$
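The formula above as a one-liner, with illustrative numbers assumed (not from any datasheet): +20 dBm out, a receiver floor of -80 dBm, and 1 dB/m of cable loss:

```python
# Link-budget sketch: longest cable run before the signal falls
# below the receiver's minimum usable level. All dB figures assumed.
def max_cable_length(power_out_dbm, receiver_min_dbm, loss_db_per_meter):
    """distance = (PowerOut - ReceiverMinimum) / dBLossPerMeter"""
    return (power_out_dbm - receiver_min_dbm) / loss_db_per_meter

print(max_cable_length(20, -80, 1.0))   # 100.0 meters
```

Note this ignores the fixed attenuators mentioned above; each 5 dB pad simply subtracts 5 dB from the budget before dividing.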
A "better" way is to send 12 V (or more) over the coax and have local regulation to 5 V at each RaPi outlet. You can use cheap buck regulators available on eBay (a few GBP or dollars) to take the 12 V DC and efficiently convert it to 5 V locally.
With 12 V being sent down the wire and local switching buck regulators at each end, the overall current down the coax is less than half of what it would be had you put 5 V on the line, which immediately drops less voltage in the cable and makes the whole system more viable.
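To put numbers on the "less than half" claim: a buck converter draws roughly constant power, so raising the line voltage lowers the line current proportionally. The 5 W load, 90 % converter efficiency and 1-ohm round-trip cable resistance below are all assumed figures for illustration:

```python
# Line current and cable voltage drop: direct 5 V feed vs 12 V feed
# with a local buck regulator (assumed 90% efficient).
def line_current(load_power_w, line_volts, efficiency=0.9):
    """Coax current for a constant-power load behind a buck converter."""
    return load_power_w / (line_volts * efficiency)

load = 5.0                       # assumed: one RaPi drawing ~5 W
i_5v = load / 5.0                # direct 5 V feed: 1.0 A on the coax
i_12v = line_current(load, 12)   # 12 V feed: ~0.46 A, less than half

r_line = 1.0                     # assumed 1-ohm round-trip coax resistance
print(i_5v * r_line, i_12v * r_line)   # cable drop: 1.0 V vs ~0.46 V
```

A 1 V drop on a 5 V feed is 20 % of the supply gone before it reaches the RaPi; the same cable on a 12 V feed loses well under half a volt, which the local regulator shrugs off.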
You might even consider using DC-to-DC (isolating types) converters at each RaPi connection to avoid "earth" issues - they would also give a measure of protection against local (not direct) lightning strikes.