Why don't we use low-voltage power sources for high-wattage applications?

ohms-law, power, power-supply, resistors, voltage

Super nooby question involving Ohm's law, but this has been on my mind this morning.

Say I have a 60W device, and I want to power it. Usually this calls for a 120V source or something. However, why not use a 5V source and draw 12A with really low resistance? Is it for safety purposes mainly? Or is there an issue with getting the resistance low enough to achieve the 12 amps?

I tried googling this but not much came up. Probably really obvious, but just wondering...

EDIT for duplicate mark: The suggested duplicate is similar and adds interesting information about series vs. parallel cells, but it isn't exactly what I was asking about. The answers provided on this post were much more useful to me.

EDIT 2: I added my original edit back now that the duplication mark has gone through.

Best Answer

You are right that power is the product of voltage and current. That would suggest any voltage × current combination is fine, as long as it comes out to the desired power.
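To put numbers on that, here is a quick sketch of the same 60 W at a few different voltages (the specific voltages are just illustrative picks of mine):

```python
# The same 60 W at different voltages: P = V * I, so I = P / V.
# The voltages below are illustrative picks, nothing special about them.

P = 60.0  # watts to deliver

for V in (120.0, 48.0, 5.0, 1.0):
    I = P / V        # current needed at this voltage
    R = V / I        # what the load "looks like" as a resistance
    print(f"{V:6.1f} V -> {I:5.2f} A  (load ~ {R:8.4f} ohm)")
```

Every row is the same 60 W; the only difference is how much current the wiring, connectors, and switches have to carry.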

However, back in the real world we have various realities that get in the way. The biggest problem is that at low voltage the current needs to be high, and that high current is expensive, large, and/or inefficient to deal with. There is also a limit on voltage above which things get inconvenient, meaning expensive or large. There is therefore a moderate range in the middle that works best, given the inconvenient physics we are dealt.

Using your 60 W device as example, start by considering 120 V and 500 mA. Neither is pushing any limits that result in unusual difficulties or expense. Insulating to 200 V (always leave some margin, particularly for insulation rating) pretty much happens unless you try not to. 500 mA doesn't require unusually thick or expensive wire.

5 V and 12 A is certainly doable, but already you can't just use normal "hookup" wire. Wire that can handle 12 A is going to cost considerably more than wire that only has to carry 500 mA. That means more copper, which costs real money, makes the wire less flexible, and makes it thicker.
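To get a feel for how fast the copper adds up, here is a rough sizing sketch. The 1 m cable run, the 2% voltage-drop budget, and the copper resistivity value are assumptions of mine, purely for illustration:

```python
# Rough copper sizing for a 60 W load: how much cross-section does the
# cable need so the voltage drop stays within a fixed 2% budget?
# Assumed: 1 m run (2 m of conductor out and back), copper at ~1.7e-8 ohm*m.

RHO_CU = 1.7e-8   # ohm*m, approximate resistivity of copper
LENGTH = 2.0      # m of conductor (1 m out + 1 m back)
P      = 60.0     # W delivered to the load
DROP   = 0.02     # allow 2% of the supply voltage to drop in the cable

for V in (120.0, 5.0):
    I     = P / V                    # current the cable carries
    R_max = DROP * V / I             # cable resistance budget
    area  = RHO_CU * LENGTH / R_max  # required cross-section, A = rho*L/R
    print(f"{V:6.1f} V: {I:5.1f} A, cable <= {R_max*1000:8.3f} mohm, "
          f"copper >= {area*1e6:6.3f} mm^2")
```

For the same percentage drop, the 5 V cable needs about (120/5)² ≈ 580 times the copper cross-section of the 120 V cable: the current goes up 24×, while the absolute drop you can afford goes down 24×.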

At the other end, you haven't gained much by dropping from 120 V to 5 V. One advantage is the safety rating: generally at 48 V and below, things get simpler regulatory-wise. But by the time you are down to 30 V or so, there isn't much further saving; transistors and the like don't get meaningfully cheaper just because they only need to handle 10 V.

Taking this further, 1 V at 60 A would be quite inconvenient. By starting at such a low voltage, smaller voltage drops in the cable become more significant inefficiencies, right when it becomes more difficult to avoid them. Consider a cable with only 100 mΩ total out and back resistance. Even with the full 1 V across it, it would only draw 10 A, and that leaves no voltage for the device.

Let's say you want at least 900 mV at the device, and therefore need about 67 A to still deliver the full 60 W at that lower voltage. The cable would then need an out-and-back total resistance of (100 mV)/(67 A) ≈ 1.5 mΩ. Even for a total of 1 m of cable, that would require quite a thick conductor. And it would still dissipate 6.7 W.
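Working those same numbers through step by step (60 W load, 1 V source, 100 mV drop allowed, as above):

```python
# The 1 V / 60 W case from the paragraphs above, step by step.

P     = 60.0    # W wanted at the device
V_SRC = 1.0     # V at the source

# First, the 100 mohm cable: even with the device shorted, Ohm's law caps the current.
R_CABLE = 0.100                # ohm, out and back
I_max   = V_SRC / R_CABLE      # 10 A, and that leaves 0 V for the device
print(f"100 mohm cable: at most {I_max:.0f} A, with nothing left for the device")

# Now require at least 900 mV at the device.
V_DEV  = 0.9
I_need = P / V_DEV                 # ~67 A to still get 60 W at 0.9 V
R_max  = (V_SRC - V_DEV) / I_need  # cable resistance budget
P_loss = (V_SRC - V_DEV) * I_need  # power burned in the cable
print(f"Need {I_need:.1f} A; cable must be <= {R_max*1e3:.2f} mohm "
      f"and still dissipates {P_loss:.1f} W")
```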

This difficulty in dealing with high current is the reason utility-scale power transmission lines run at high voltage. Those cables can be hundreds of miles long, so series resistance adds up. Utilities push the voltage as high as they can so that the hundreds of miles of cable are cheaper, and so that they waste less power. The high voltage has its own costs, mostly the larger clearance that must be kept between the cable and any other conductor, but those costs are still lower than putting more copper or steel in the cable.
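The scaling behind that is the same I²R loss as before: for a fixed line resistance and delivered power, loss falls off as 1/V². A rough sketch with made-up round numbers (10 MW load, 5 Ω of line resistance, neither taken from any real system):

```python
# Line loss for a fixed delivered power and line resistance:
# I = P / V, so P_loss = I^2 * R falls off as 1 / V^2.
# The 10 MW load and 5 ohm line are made-up round numbers.

P_LOAD = 10e6   # W delivered
R_LINE = 5.0    # ohm of series resistance over a long line

for V in (12e3, 69e3, 345e3):
    I    = P_LOAD / V
    loss = I**2 * R_LINE
    print(f"{V/1e3:6.0f} kV: {I:7.1f} A, line loss {loss/1e3:8.1f} kW "
          f"({100 * loss / P_LOAD:.2f}% of the load power)")
```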

Another problem with AC is the skin effect: you get diminishing returns in resistance as you make the conductor diameter larger. This is why, for really long distances, it becomes cheaper to transmit DC and then pay the expense of converting it back to AC at the receiving end.
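For scale, the standard skin-depth formula δ = √(ρ / (π f μ)) puts the skin depth of copper at mains frequency around 8 to 9 mm, so conductor material much deeper than that carries little of the AC current. A quick sketch (the material constants are approximate):

```python
import math

# Skin depth in copper, delta = sqrt(rho / (pi * f * mu)).
# Material much deeper than a few skin depths carries little AC current,
# so making an AC conductor ever fatter gives diminishing returns.

RHO_CU = 1.7e-8              # ohm*m, approximate copper resistivity
MU     = 4 * math.pi * 1e-7  # H/m, copper is essentially non-magnetic

for f in (50.0, 60.0):       # mains frequencies
    delta = math.sqrt(RHO_CU / (math.pi * f * MU))
    print(f"{f:4.0f} Hz: skin depth ~ {delta * 1000:.1f} mm")
```

At DC the whole cross-section conducts, which is part of why HVDC pays off on very long runs despite the cost of conversion at the ends.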