Best way to test a USB wall charger for output current

charging, current measurement, power supply, usb

USB chargers are tricky. Or at least they are when you don't follow the standards.

I have a device that uses 5 V to charge a battery. The device has a USB socket, with the idea of connecting it to any USB power supply ("USB power supply" may not be the best term; I mean a 5 V DC wall wart with a USB port, and whether it adheres to the USB specs regarding power delivery I don't care, as long as it outputs 5 V), but it doesn't do any handshaking to ask for power. I was wondering what will happen if you use a charger from brand X or Z: will it output 100 mA? Will it output 500 mA? Will it provide as much as it is rated for (e.g. 2 A on an iPad charger)?

My question is: given that different manufacturers seem to use different ways to get more than 500 mA from their chargers (at least to my knowledge), how can I find the maximum current a power supply can provide without needing to negotiate the power delivery? I was thinking of using an active load, but I don't know if that's the best way to test it. Are there any considerations I need to keep in mind for the test?

I would need to buy the active load, as I don't have one at the moment to test this myself, so I wanted to ask before purchasing it.

EDIT: The question seems to be too open to interpretation, but I don't know how to word it while keeping it short, sorry. I guess my question is: when using a charger that provides more than the usual 500 mA, and that uses a proprietary system to allow drawing more current (e.g. a tablet charger from a reputable brand), what should my load be in order to extract the maximum current from it, without having to use ICs to do any kind of negotiation? Can I just connect a variable resistor and check when the voltage starts to drop? Can I get 2 A from a tablet charger that is obviously capable of it, without providing whatever resistors it expects on its data lines?
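For what it's worth, the arithmetic behind the variable-resistor idea is just Ohm's law; here is a minimal sketch (the 5 V nominal and the current steps are example values only):

```python
# Sizing a resistive load to pull a target current from a 5 V charger.
# Ohm's law: R = V / I; power the resistor must dissipate: P = V * I.

V_NOMINAL = 5.0  # volts, nominal USB bus voltage

for target_amps in (0.5, 1.0, 1.5, 2.0):
    resistance = V_NOMINAL / target_amps  # ohms
    power = V_NOMINAL * target_amps       # watts the resistor must handle
    print(f"{target_amps:.1f} A -> {resistance:.2f} ohm, rated >= {power:.1f} W")
```

Note that at 2 A the load resistor has to dissipate 10 W, which is why an electronic load (or a hefty power resistor) is the practical choice; the charger is "maxed out" once its output sags below the usual USB lower limit of about 4.75 V.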

I appreciate the answers given so far, as they are useful, but they are not exactly what my (terribly worded) question was looking for. I don't want to know how to measure the current; I know I can use one of the many flavours of ammeter available. And I don't want to know how to do the negotiation to get more power. I want to know how to set up my test so that if I get X amps, that's the maximum I can get from the charger without trickery.

Best Answer

You ask, "how can I measure the maximum current a power supply provides?" This is an ill-posed question. A power supply can provide the maximum current it is designed to, which is usually written on the power supply case, so you don't need to measure anything in the first place: that maximum is guaranteed, without overheating or dropping below the designated voltage level. However, you can indeed check whether a particular PSU meets its advertised specifications by applying known loads and measuring the output voltage.
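As a rough illustration of such a test, here is a sketch that steps through known loads and flags where the output leaves the usual USB tolerance band of 5 V ± 5 % (4.75–5.25 V). The `read_output_voltage()` function and the load-setting step are placeholders for whatever meter and load you actually use:

```python
# Step a charger through increasing known loads and record where the
# output voltage falls out of the USB tolerance band (5 V +/- 5 %).

V_MIN, V_MAX = 4.75, 5.25  # volts, usual USB 2.0 tolerance band

def read_output_voltage() -> float:
    """Placeholder: return the charger's output voltage in volts.
    Replace with a real reading from your multimeter or ADC."""
    raise NotImplementedError

def sweep(load_currents_amps):
    results = []
    for amps in load_currents_amps:
        # Set your electronic load to `amps` (or swap in the matching
        # resistor), wait for the output to settle, then measure.
        volts = read_output_voltage()
        in_spec = V_MIN <= volts <= V_MAX
        results.append((amps, volts, in_spec))
        if not in_spec:
            break  # the charger has hit its limit (or shut down)
    return results
```

The last in-spec step of the sweep is the practical answer to "how much can this charger deliver" for a device that does no negotiation.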

Another misconception is expressed in your comment as "or whatever the charger thinks is best". Chargers do not "think"; they just supply whatever they can within the limit of their advertised capability, which they advertise via various "signatures" on the D+/D- lines (many variants of explanations can be found here). It is the job of the device (or your charging circuit) to recognize the signature and to draw no more current than the provider can supply. If your device can't recognize the charger's signature, it should limit its consumption to the bare minimum of 500 mA.
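For reference, a few commonly reported signatures, collected here as plain data (the D+/D- short comes from the BC1.2 spec; the Apple divider voltages are values widely reported in teardowns and charger-detection app notes, so treat them as assumptions to verify against your own charger):

```python
# Commonly reported D+/D- "signatures" used by dedicated chargers.
# Voltages are approximate, set by resistor dividers inside the charger.
CHARGER_SIGNATURES = {
    "USB BC1.2 DCP": "D+ shorted to D- (through <= 200 ohm)",
    "Apple 1 A":     "D+ = 2.0 V, D- = 2.7 V",
    "Apple 2.1 A":   "D+ = 2.7 V, D- = 2.0 V",
    "Apple 2.4 A":   "D+ = 2.7 V, D- = 2.7 V",
}
```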
