Electronic – How much 12V DC current can I safely draw from an AC 230V 13A socket and why

ac, dc, maximum-ratings, safety, socket

In my cabin I have a solar cell and a 12V battery that I use to power my electronics. However, I'm dissatisfied with the standard cigarette-lighter (CL) sockets used in cars (apparently I'm not the only one), so I decided to reuse a foreign AC socket instead, specifically the United Kingdom's “13 amp socket” (BS 1363).

I chose the UK socket because it's well designed, and UK plugs are very rare in my country. I will take steps to ensure that no one thoughtlessly plugs a UK electrical device into these sockets!

The socket is designed for 230 V and up to 13 A of alternating current, so it can supply up to ~3 kW of power. However, at the 12 V I'll be using, 13 A barely reaches 150 W. Since some popular water pumps need ~700 W, this looks like a serious limitation.
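Here's the arithmetic I'm working from, as a quick sketch (Python only for illustration; the 700 W pump is just the example figure above):

```python
# Back-of-the-envelope check of the figures above (illustrative only).
SOCKET_CURRENT_A = 13.0    # BS 1363 socket rating
MAINS_VOLTAGE_V = 230.0    # UK mains
BATTERY_VOLTAGE_V = 12.0   # my battery
PUMP_POWER_W = 700.0       # example water pump

print(MAINS_VOLTAGE_V * SOCKET_CURRENT_A)    # 2990.0 W available from the AC socket
print(BATTERY_VOLTAGE_V * SOCKET_CURRENT_A)  # 156.0 W at 12 V and the same 13 A
print(PUMP_POWER_W / BATTERY_VOLTAGE_V)      # ~58.3 A needed to run the pump at 12 V
```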

In a different question here on Electrical Engineering SE, someone asked whether they can use a socket rated 10 A @ 250 V to draw 13 A @ 230 V, and the most popular answer says that “the only important parameter is the current rating, […] 10 amps through the socket will heat it up just as much at 1 volt as it will at 5,000 volts.” Unfortunately, it doesn't explain why. Does this apply to my case, and why or why not?

Another thing that concerns me is that I'll be using DC instead of AC. Several things work differently with direct versus alternating current. Is it safe to use the socket this way?

EDIT: There is a question about the current rating of fuses where the answers explain why fuses blow at a certain current rather than at a certain power. However, I failed to follow the reasoning there, so I can't tell whether it extends to my case.

Best Answer

Power is current times voltage (P = IE). You don't mention if you're converting from one voltage to another. Are you using a step-down converter? Are you merely charging the battery with the solar system and want to know how to achieve equivalent power from a charged battery bank?

230 V AC at 13 A is 2990 W. Delivering 2990 W at 12 V DC would take ~249 A. This means your battery (or battery bank), along with all the connectors and cabling, would have to be capable of safely delivering about 250 A to get roughly the same power as your UK mains example.

Because you said "13A at 12V can barely reach 150W" it seems like you're already aware of the relationship between voltage, current, and power. It also sounds like you're looking for a better socket to use for your system, and chose the UK mains style because it won't likely be confused with the real thing.

So here's what you may be missing, related to my question from 8 years ago about fuses. Voltage is what "motivates" electrons to go through a particular thing, whether it's a fuse, a wire, or a connector. The current is "how many" of them. The larger the current, the greater the "friction" (resistive heating), and thus the more heat: the power dissipated in the part is I²R, set by the current and the part's own resistance, not by the supply voltage. A connector or wire rated for 13A is not going to handle more power unless the voltage is also higher. In the case of a connector, the voltage rating mostly reflects the insulation and the distance between conductors (to avoid arcing), while the current rating reflects the robustness of the conductors themselves.
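To illustrate, here is a small sketch with an assumed contact resistance (I don't have the real figure for a BS 1363 contact, so treat the number as purely illustrative):

```python
# Sketch of the "heat depends on current, not supply voltage" point.
# The 5 mOhm contact resistance is an assumed, illustrative figure,
# not a BS 1363 specification.
CONTACT_RESISTANCE_OHM = 0.005
CURRENT_A = 13.0

heating_w = CURRENT_A ** 2 * CONTACT_RESISTANCE_OHM   # P = I^2 * R
print(heating_w)  # ~0.85 W dissipated in the contact, the same at 12 V DC or 230 V AC
```

The supply voltage never appears in that calculation, which is why the rating that matters for heating is the current rating.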

Put another way, 13A can have wildly different power values based on voltage, but it's always going to be the same quantity of electric current flowing. If the voltage is 12V, you're right, it isn't a lot of power, but it still requires thick wire and connectors to safely handle that current without having an unsafe temperature rise.

You will likely want to use something capable of much higher current. Automotive applications that use 12V systems often have fuses and wire rated for 50 or more amps. But be absolutely sure to look at the specifications for your battery (or battery bank): there's no point in installing 60A-capable wiring and connectors if your source can only supply 40A. Also be sure to install fuses rated below the weakest link in the system.
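As a sketch of that sizing check (every rating below is a made-up example, not a recommendation; substitute the figures from your own datasheets):

```python
# Sketch of the sizing check described above.  All ratings are invented examples.
LOAD_POWER_W = 700.0        # the pump from the question
SYSTEM_VOLTAGE_V = 12.0
BATTERY_MAX_A = 80.0        # from the battery (bank) datasheet
WIRE_RATING_A = 60.0        # ampacity of the chosen cable
CONNECTOR_RATING_A = 50.0   # whatever connector you settle on

required_a = LOAD_POWER_W / SYSTEM_VOLTAGE_V                            # ~58 A for this load
weakest_link_a = min(BATTERY_MAX_A, WIRE_RATING_A, CONNECTOR_RATING_A)  # weakest component

print(f"load draws ~{required_a:.0f} A; choose a fuse below {weakest_link_a:.0f} A")
if required_a > weakest_link_a:
    print("this load exceeds the weakest component rating; upgrade that component first")
```

With these example numbers the 50 A connector is the weakest link, so the ~58 A pump load would not be acceptable until that connector (and its fuse) is upgraded.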
