PoE losses: Cat5e vs Cat6

Tags: cable, power-over-ethernet

Just a simple physics question: per this link, the Cat5e twist rate is lower than that of Cat6, and Cat6 cables are (on average) thinner. A higher twist rate means a longer copper wire per length of cable. Both these facts would make one assume that the resistance per unit distance of a Cat6 cable is higher than that of a Cat5e. Presumably this is not the case, as modern PoE standards allow for higher power while keeping the voltage the same, which would mean more power dissipated in the cable if it were true. How is this mitigated? Is the copper of a higher grade?

Best Answer

The cable category defines the high-frequency parameters of the cable (mostly attenuation and crosstalk). For PoE, what matters is the cable's series (DC loop) resistance, which isn't defined by the category.

By convention, good plenum cable has grown somewhat thicker from Cat-3 through Cat-5 to Cat-6(A); patch cables vary greatly, down to 30 AWG.

Essentially, the thicker the conductors (lower AWG), the better the PoE performance. The initial IEEE 802.3af-2003 defined a maximum loop resistance between pairs (thanks jonathanjo) of 20 Ω, 802.3at-2009 lowered that to 12.5 Ω, and the new 802.3bt-2018 to 6.25 Ω. Accordingly, the maximum current increased from 350 mA (af) to 600 mA (at) to 1860 mA (bt, 4-pair), enabling the power increase.
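To get a feel for how much the gauge matters, here is a minimal Python sketch (not part of the answer: the copper resistivity, the standard AWG diameter formula, the 100 m length, and the gauge-to-category pairing in the comment are assumptions I've added). It prints the DC resistance of a single solid conductor; how that maps onto the channel's loop resistance depends on how many conductors share the current.

```python
# Sketch: DC resistance of a single solid copper conductor by AWG over 100 m.
# Assumptions (mine): copper resistivity ~1.724e-8 ohm*m at 20 degC and the
# standard AWG diameter formula d = 0.127 mm * 92**((36 - n) / 39).
import math

RHO_CU = 1.724e-8   # ohm*m
LENGTH_M = 100.0    # a full-length channel

def awg_diameter_m(awg: int) -> float:
    return 0.127e-3 * 92 ** ((36 - awg) / 39)

def conductor_resistance(awg: int, length_m: float = LENGTH_M) -> float:
    area = math.pi / 4 * awg_diameter_m(awg) ** 2
    return RHO_CU * length_m / area

for awg in (23, 24, 26, 30):  # typically Cat-6, Cat-5e, thin patch, very thin patch
    print(f"AWG {awg}: {conductor_resistance(awg):5.1f} ohm over {LENGTH_M:.0f} m")
```

This prints roughly 6.7, 8.4, 13.4 and 33.9 Ω: resistance about doubles every three gauge steps, so a 30 AWG patch cord has around four times the resistance per metre of 24 AWG horizontal cable, which is why thin patch cables are singled out above.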

The loop resistance limits the maximum power because of the resulting voltage drop: at 600 mA and 6 Ω cable resistance, the drop is U = R·I = 3.6 V, so from the perhaps 48 V you start with at the PSE you've got 44.4 V left at the PD. Multiplied by 0.6 A, that's 26.6 W delivered, with 3.6 V × 0.6 A ≈ 2.2 W lost in the cable. (Note that the 25.5 W maximum 802.3at power is a worst-case figure.)
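Here is the same arithmetic as a small Python sketch (the 48 V / 600 mA / 6 Ω figures are the example above; the 50 V minimum Type 2 PSE voltage in the last call is my recollection of 802.3at, not something stated here):

```python
# Redo the U = R*I budget: voltage drop, power delivered to the PD,
# and power lost in the cable, for a given supply voltage, current and loop resistance.
def poe_budget(v_pse: float, current_a: float, loop_ohm: float):
    drop = loop_ohm * current_a             # U = R * I
    delivered = (v_pse - drop) * current_a  # what reaches the powered device
    lost = drop * current_a                 # dissipated in the cable
    return drop, delivered, lost

# The worked example: 48 V at the PSE, 600 mA, 6 ohm of cable.
print(poe_budget(48.0, 0.600, 6.0))    # (3.6, 26.64, 2.16)

# Worst case at the 802.3at limits: 12.5 ohm loop and (if memory serves) a 50 V
# minimum Type 2 PSE voltage; this lands exactly on the 25.5 W PD maximum.
print(poe_budget(50.0, 0.600, 12.5))   # (7.5, 25.5, 4.5)
```

The second call shows where the worst-case 25.5 W figure comes from: at the full 12.5 Ω loop-resistance limit, 4.5 W is dissipated in the cabling and only 25.5 W remains for the powered device.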
