Confusion about current and voltage

Tags: current, voltage

I have been learning electronics as a hobby, and I'm still confused about current and voltage. I've read many sites and watched many videos, but I still have questions.

Here is what I learned so far.

Voltage is how hard the electrons are being pushed. Current is how many of them are flowing.
So if I increase the voltage, that means I increase the pressure, and the current will increase too. There are different kinds of power supplies, and they can supply different currents at the same voltage. They reduce current using resistance and increase it using transistors. Right?
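
For example, if I have understood Ohm's law correctly, the current through a simple resistive load just follows from the voltage, something like this (all numbers made up by me):

```python
# My understanding so far: for a simple resistive load, Ohm's law gives I = V / R.
# All numbers here are made up just to illustrate.
voltage = 12.0       # volts "pushing" the electrons
resistance = 6.0     # ohms of the load

print(voltage / resistance)        # 2.0 A
print((2 * voltage) / resistance)  # 4.0 A -> double the "pressure", double the current
```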

Now the confusing part. What I think is that how much current is drawn from the power supply depends on the device that needs the electricity. And if the device pulls more current than the power supply can deliver, the power supply will heat up and blow.

Now let's say I have an air conditioner; I think it takes around 10 A, while a normal fan takes 1 A. That seems to prove that how much current flows depends on the device.
So if I connect 100 air conditioners, which would take 1000 A, that would blow up the power supply. That means the power supply was unable to push that much current.
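
In rough numbers (all invented, including the supply's assumed rating), my reasoning is:

```python
# Invented numbers to illustrate my reasoning about overloading a supply.
current_per_ac = 10.0        # amps drawn by one air conditioner
num_acs = 100
supply_max_current = 100.0   # amps the supply can deliver (assumed rating)

total_demand = current_per_ac * num_acs   # 1000 A
if total_demand > supply_max_current:
    print(f"Overloaded: {total_demand} A demanded, only {supply_max_current} A available")
```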

Now let's take a phone that comes with a charger that supplies 200 mA, which is fine for the phone. But if I use a phone charger that can output 1500 mA, why would my phone charge faster? It is supposed to use only 200 mA (or however much it needs), not 1500 mA. And wouldn't that blow up the phone's charging circuit or battery?

My actual question is: can we force an electronic device to use more current if more current is being pushed from the power supply? If yes, then how come a fan takes only 1 A while an air conditioner takes 10 A on the same power supply? Why doesn't the fan blow up?

Please, if possible, try to explain in simple words. Thank you 🙂

Best Answer

Think of power supplies in terms of a constant voltage, rather than the current they can provide. A constant-voltage supply will try to maintain the same voltage independent of the load you put on it (until, as you said, it blows up). So a fan is designed to "pull", or "let pass", 1 A at a given voltage, while an air conditioner is designed to pull 10 A at the same voltage. That's why they draw different currents.

And while you can "force" more current through a device with more voltage, some devices are smart enough to compensate for that using their regulators (switching or linear); by having their own constant-voltage supplies on the inside, they maintain about the same current consumption up to a given input voltage.

Normally supplies fail not because they fail to push current, but because they fail to provide the current that is being "pulled". If you have a constant-current supply instead, then when you try to "push" more current through a given resistive load, the voltage will rise.
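
To make the "pulling" idea concrete, here is a minimal sketch, assuming an idealized constant-voltage supply and purely resistive loads; the 230 V and the resistances are invented so the numbers match the fan and air conditioner above:

```python
# Idealized constant-voltage supply feeding resistive loads: Ohm's law, I = V / R.
# The voltage and resistances below are invented so that the fan draws 1 A and
# the air conditioner 10 A; real appliances are not purely resistive.
supply_voltage = 230.0           # volts, held constant by the supply

loads = {
    "fan": 230.0,                # ohms -> 230 V / 230 ohm = 1 A
    "air conditioner": 23.0,     # ohms -> 230 V / 23 ohm  = 10 A
}

for name, resistance in loads.items():
    current = supply_voltage / resistance   # the load "pulls" this much
    print(f"{name}: {current:.1f} A")
```

The supply never decides these currents; it just holds the voltage, and each load's resistance (or, more generally, its impedance) does the rest.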

About the phone: battery-charging ICs often have a limit on the current at which they will charge (as do the batteries themselves). On phones that limit is often close to 1 A, hence you can charge faster from a 1500 mA supply. The 200 mA rating is probably based on the USB standard's maximum current, and it's obviously easier for the manufacturer to bundle that charger with the phone because it's cheaper than a 1500 mA supply.
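
In other words, the actual charging current is set by whichever limit is lower: what the charger can deliver or what the phone's charging IC will accept. A rough sketch, where the 1000 mA IC limit is just a typical illustrative figure, not from any datasheet:

```python
# The phone's charging IC never draws more than its own limit, and the charger
# never delivers more than its rating, so the actual current is the smaller one.
# The 1000 mA default is an illustrative typical value, not a real spec.
def charge_current_ma(charger_max_ma, ic_limit_ma=1000):
    """Return the current (mA) actually drawn while charging."""
    return min(charger_max_ma, ic_limit_ma)

print(charge_current_ma(200))    # 200  -> slow charging with the bundled charger
print(charge_current_ma(1500))   # 1000 -> capped by the phone, not the full 1500
```

That is also why the bigger charger doesn't blow up the phone: the phone only ever takes as much as its charging IC allows.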

p.s.: to better understand different current draws at the same voltage, see Ohm's law: http://en.wikipedia.org/wiki/Ohm%27s_law. Also remember that not all loads are resistive (most aren't).