Powering a device by USB

amperage, ohms-law, resistance, usb, voltage

I would like to power a device from a computer's 5 V USB port, since I will also be communicating with the computer over the USB data lines.

I know that USB ports provide 5 V, but from what I've read here:

USB 1.0, 2.0, and 3.0 all provide a variable amount of current. The device is supposed to negotiate with the computer to ask for more current, and the default is 100 mA.

I would like my circuit to run on 10 mA. I understand the relationship between current and voltage (i.e., Ohm's law).

Using:
$$ R = \frac{5V}{0.01A} = 500\Omega. $$

Perhaps I'm having difficulty understanding the concept, but does this mean my circuit will draw 10 mA regardless of how much current the USB port is able to supply?

Best Answer

The power supply doesn't continuously push out its maximum current; instead, it holds a particular voltage (in your case 5 V), and the load presented by the device determines how much current is actually drawn. If nothing is connected to the output, the current is zero.
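Put in symbols, that is just Ohm's law rearranged for current, with $V$ the port voltage and $R_{\text{load}}$ the equivalent resistance of whatever you connect:

$$ I = \frac{V}{R_{\text{load}}} = \frac{5V}{R_{\text{load}}} $$

So the current is set by the load, not by the port's maximum rating, as long as the load stays below that rating.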

However, if the load attempts to draw too much current (i.e. more than 100 mA from a USB port rated for only 100 mA), then depending on the power supply, the voltage may sag below 5 V, or the supply may stop working altogether due to an over-current shutdown mechanism.

As you stated, the current is determined by Ohm's law, i.e. the voltage divided by the equivalent resistance of the load. As you already calculated, for a 10 mA current the equivalent resistance of your load is 500 Ω. So if you replaced your device with a 500 Ω resistor, it would draw 10 mA. Obviously your device is much more complicated than a simple resistor, but that's what it looks like to the power supply.
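As a quick sanity check in the other direction, plugging that 500 Ω figure back into Ohm's law gives exactly the current you wanted:

$$ I = \frac{V}{R} = \frac{5V}{500\Omega} = 0.01A = 10mA. $$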

In many cases, the load will not be fixed. A trivial case is two LEDs, each drawing 20 mA: one is on steadily and the other is blinking on and off, so the load varies between 20 mA and 40 mA. The supply automatically adjusts to this varying current demand.
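To connect that back to the equivalent-resistance picture (treating each LED branch as if it were a simple resistive load, which is only a rough illustration of how an LED actually behaves), the supply alternately sees:

$$ R = \frac{5V}{0.02A} = 250\Omega \;\text{(one LED on)}, \qquad R = \frac{5V}{0.04A} = 125\Omega \;\text{(both on)} $$

and the current it delivers follows accordingly.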