Expected power loss on a PoE (Power over Ethernet) device

poe

I am just starting with PoE (Power over Ethernet), and I would like to use it to power a remote device (Buffalo WHR-HP-G54, manual here) that requires 5 volts and 2 amps DC (so, 10 watts, I assume).

If my PoE injector can send, say, 15 watts down the Ethernet cable, could the power loss along the cable (attenuation, voltage drop, or whatever the correct term is; I don't know much about electronics) be large enough that the power arriving at the far end can't cover the needed 10 watts?

Of course, I think this must depend on the Ethernet cable length and type.

This question uses concrete numbers as examples to help me understand the basics of power loss in PoE.

If possible, I would like to have some sort of generic rule to calculate such power loss (maybe depending on cable length, type, etc.).

Examples of calculation rules:

  • Every 10 meters of unshielded (UTP) cable loses 1% of the maximum power at the origin (14.85 W after 10 meters, 14.70 W after 20 meters, 14.55 W after 30 meters, etc.).
  • Every 100 meters of shielded (FTP) cable loses 5% of the power (14.25 W after 100 meters, 13.5375 W after 200 meters, 12.860625 W after 300 meters, etc.).

Real-world example of a PoE injector (for anyone who wants to attempt exact calculations):

  • Reseller: TP-Link
  • Model: TL-POE150S
  • Max power: 15.4 watts
  • Voltage: 48 VDC

Best Answer

The answer depends on the gauge of the conductors. Most network cable is 24 AWG, but some Category 6 cable is 23 AWG.

The calculation you're looking for is "voltage drop." The cable dissipates I²R watts, where I is the current drawn by the device and R is the DC loop resistance of the cable, so the loss grows with the square of the current and linearly with cable length; it is not a fixed percentage of the injector's rating.
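As a rough illustration (not from the original post), here is a minimal Python sketch of that calculation. It assumes 24 AWG copper at about 0.0842 Ω per meter per conductor at 20 °C, 802.3af-style powering over two pairs (both conductors of a pair in parallel, so the DC loop resistance works out to roughly 0.0842 Ω per meter of cable), and a device that behaves as a constant-power load; the function name and all numbers are illustrative.

```python
import math

def poe_cable_loss(length_m, load_w, v_source=48.0, ohm_per_m=0.0842):
    """Estimate cable loss for two-pair PoE (802.3af style).

    Assumptions (illustrative, not from the original post):
    - 24 AWG copper, ~0.0842 ohm/m per conductor at 20 C
    - both conductors of a pair carry current in parallel, and separate
      pairs form the out and return paths, so the DC loop resistance
      is ~0.0842 ohm per meter of cable
    - the powered device draws constant power at its input terminals
    """
    r_loop = ohm_per_m * length_m  # total DC loop resistance of the run
    # Constant-power load: I * (v_source - I * r_loop) = load_w,
    # a quadratic in I; take the smaller (physical) root.
    disc = v_source**2 - 4 * r_loop * load_w
    if disc < 0:
        raise ValueError("cable too long / load too heavy for this voltage")
    current = (v_source - math.sqrt(disc)) / (2 * r_loop)
    v_drop = current * r_loop      # volts lost along the cable
    p_loss = current**2 * r_loop   # watts dissipated in the cable
    return current, v_drop, p_loss

for length in (10, 50, 100):
    i, vd, pl = poe_cable_loss(length, load_w=10.0)
    print(f"{length:>3} m: I = {i:.3f} A, drop = {vd:.2f} V, loss = {pl:.2f} W")
```

Under these assumptions, a 10 W load fed at 48 V loses only about 0.4 W over 100 m of 24 AWG cable, well within the injector's 15.4 W budget. This is exactly why PoE injects at 48 V instead of 5 V: at 48 V the current (and hence the I²R loss) is small. The splitter at the device end, which converts the 48 V feed down to the 5 V the router needs, typically matters more for the power budget than the cable itself.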
