Electronic – How does electrical power relate to Ohm’s law

Tags: current, ohms-law, power, voltage

I have some difficulty grasping these concepts. Let's say, for example, a power source rated 10 W operating at 5 V is connected to a load of 0.5 ohms. According to Ohm's law (I = V/R), a current of 10 A should flow through the circuit. However, since the rated power is 10 W, a current of only 2 A would be expected from the power-voltage relationship (I = P/V).
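The two conflicting calculations can be written out as a quick sketch (values taken from the example above; variable names are my own):

```python
V = 5.0       # supply voltage (V)
P_max = 10.0  # supply power rating (W)
R = 0.5       # load resistance (ohm)

i_ohm = V / R        # Ohm's law: I = V/R  -> 10.0 A
i_power = P_max / V  # power relation: I = P/V -> 2.0 A

print(i_ohm, i_power)  # 10.0 vs 2.0 -- the apparent contradiction
```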

My question is: What is the expected current in this particular case and why?

Best Answer

10W is the maximum power that the supply can provide. The actual power (and current) will depend on the load connected to the supply.

In your example, the smallest resistor that can safely be connected to the supply is 2.5 ohms, which results in a current of 2 A and a power of 10 W. If a smaller resistor is used, the load will attempt to draw more than 2 A, at a power greater than 10 W. What happens next depends on the power supply: typically the output voltage falls below the rated 5 V, and the supply may overheat or trip its protection fuse.
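The limit described above follows from combining P = V²/R with the supply's rating: the minimum safe load is R_min = V²/P_max. A minimal sketch (names are my own, values from the example):

```python
V = 5.0       # rated supply voltage (V)
P_max = 10.0  # supply power rating (W)

# Smallest load the supply can drive while holding its rated voltage:
# P = V^2 / R  =>  R_min = V^2 / P_max
R_min = V**2 / P_max  # 2.5 ohm

def load_current(R):
    """Current through a resistive load R ohms, assuming the supply
    holds its rated 5 V. Only valid for R >= R_min; below that the
    supply cannot deliver the demanded power and its voltage sags."""
    if R < R_min:
        raise ValueError("load would demand more than the 10 W rating")
    return V / R

print(R_min)              # 2.5 ohm
print(load_current(2.5))  # 2.0 A, i.e. the full 10 W
```

A 0.5 ohm load falls below R_min, which is why the 10 A predicted by Ohm's law alone never actually flows at 5 V.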