Ethernet Cabling: UTP, STP, FTP, etc. – When Is Shielding Useful/Needed?

cables ethernet gigabit shielding stp

When installing cabling for Ethernet purposes, the question often arises: does one want/need simple UTP or any of the shielded variants, i.e. F/UTP, S/UTP, SF/UTP, U/FTP, F/FTP, S/FTP or SF/FTP, often labelled FTP or STP?

Most sites with information on the topic are from cable manufacturers or distributors, and all the shielded variants are of course deemed superior (and they are more expensive), but with very little justification. Beyond the additional cost of the cable (and sockets), shielded variants also require grounding, which complicates installation.

Cat 6A, which supports up to 10G Ethernet, is available as plain UTP. Wikipedia states that:

Cable shielding may be required in order to avoid data corruption in high electromagnetic interference (EMI) environments

Are there real, objective guidelines/rules on when one needs more than UTP for Ethernet? What are "high EMI environments"? Is this something you are likely to find in regular offices (I've seen vague references to lighting equipment), or is this just for some extreme cases one is unlikely to encounter outside of industrial premises or research labs?

In my case, I'm interested in "run of the mill" 100 Mbit/s or Gigabit Ethernet, nothing fancy like 2.5G, 10G, 25G, 40G (I understand the latter two require various forms of shielding).

Some places also indicate (without justification) that shielded cables are required for, or better with, PoE applications, though I fail to see the relationship.

Best Answer

The problem with EMC/EMI is that it is hard to state rock-solid limits for power/voltage/magnetic flux, because the strength of the coupled-in signal depends heavily on the specific setup. However, there is interesting research on non-invasive attacks on Ethernet, e.g., https://download.hrz.tu-darmstadt.de/pub/FB20/Dekanat/Publikationen/SEEMOO/wisec2016-trust-the-wire.pdf, Figures 7 and 8. While this is "the wrong direction" (i.e., detecting the signals emitted by the cable rather than coupled into it), it gives you a good idea of how much the shielding attenuates an external source. You will find plenty of other relevant publications if you use this one as a starting point.

Regarding potential EMI sources, there are standards that all equipment should meet. For example, the generic IEC 61000 series defines limits for residential and industrial/commercial environments: the maximum permitted emission and the minimum required immunity. Other applications, such as automotive, aerospace, military, and medical, sometimes have their own standards with specific limits. If you know your surroundings and the installed equipment, you can look up the limits they (should) comply with and deduce the maximum expected interference strength. Then you can either simulate the interference with adequate equipment or design your system with a "simple" SNR calculation. In my experience, there will be a range where everything might work fine most of the time, but you will sometimes notice decreased performance or erratic behavior due to sporadic interference corrupting packets. The same applies to overly long cables.
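The "simple" SNR calculation mentioned above can be sketched as plain dB arithmetic. Every number in this snippet is an illustrative assumption (not taken from IEC 61000 or any cabling standard); the point is only the method: compare the signal level at the receiver against the expected coupled-in interference, with shielding modeled as extra attenuation of the interference.

```python
# Back-of-the-envelope SNR margin for a cable run.
# ALL level figures below are hypothetical assumptions for illustration,
# not values from IEC 61000, IEEE 802.3, or any other standard.

def snr_db(signal_dbm: float, noise_dbm: float) -> float:
    """SNR in dB is simply the difference of the two levels in dBm."""
    return signal_dbm - noise_dbm

signal_at_receiver = -10.0    # dBm: signal after cable attenuation (assumed)
coupled_interference = -50.0  # dBm: interference coupled into a UTP pair (assumed)
shield_attenuation = 30.0     # dB: extra attenuation a shield might add (assumed)
required_snr = 25.0           # dB: assumed minimum for reliable operation

# Unshielded: interference couples in at full strength.
utp_snr = snr_db(signal_at_receiver, coupled_interference)
# Shielded: the shield attenuates the interference before it reaches the pair.
stp_snr = snr_db(signal_at_receiver, coupled_interference - shield_attenuation)

print(f"UTP SNR: {utp_snr:.0f} dB, margin {utp_snr - required_snr:+.0f} dB")
print(f"STP SNR: {stp_snr:.0f} dB, margin {stp_snr - required_snr:+.0f} dB")
```

With these made-up numbers both variants have positive margin, which matches the point below: if your S is comfortably above N, the shield buys you headroom you may not need.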

In the end, performance is a result of the achieved SNR: if your S is high enough, you don't need to worry about N, and vice versa. Given that you only want to run 100M/1000M and are likely in a standard office space, UTP should be fine.*

*I work in an office where we noticed issues with some of our oscilloscope readings. Later we realized that a neighboring company frequently tested smart-grid equipment in their lab - most likely, they did not adhere to residential emission limits.