I want to test my power supply at its 12 V, 10 A specification. I tried connecting a 100-watt, 0.7-ohm load resistor across it, but the voltage dropped to 3 or 4 volts. Why does this happen, and how can I test this power supply?
Electronic – How to test the power supply across the load
current measurement, power supply, voltage
Best Answer
Start by using the right load. The supply is rated for 10 A maximum at 12 V. By Ohm's law, that means the smallest valid load resistance is (12 V)/(10 A) = 1.2 Ω.
By connecting a 700 mΩ resistor to the supply, you violated its current spec. Again by Ohm's law, (12 V)/(700 mΩ) = 17 A. Dropping its output voltage when you attempt to draw more than the rated current is a totally reasonable thing for the supply to do.
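The two Ohm's-law calculations above can be sketched as a few lines of Python (values taken from the question; variable names are illustrative):

```python
# Ohm's-law check of the load used in the question.
v_rated = 12.0   # supply's rated output voltage, volts
i_max = 10.0     # supply's rated maximum current, amps
r_load = 0.7     # the 700 milliohm resistor actually connected, ohms

r_min = v_rated / i_max          # smallest valid load resistance: R = V / I
i_attempted = v_rated / r_load   # current the 0.7 ohm load tries to draw: I = V / R

print(f"minimum load resistance: {r_min:.1f} ohms")       # 1.2 ohms
print(f"current into 0.7 ohm load: {i_attempted:.1f} A")  # 17.1 A, well over the 10 A spec
```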
You really need to look up Ohm's law and understand what it means.
Also consider the power the load resistor must dissipate. If you manage to load the supply to its maximum rating, then the power into the resistor will be (12 V)(10 A) = 120 W. Even if your resistor were the right resistance, it wouldn't have enough power-handling capability.
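The power calculation is just as simple to check (a sketch, using the rated values from above):

```python
# Power the load must dissipate when the supply runs at its full rating.
v_rated = 12.0  # volts
i_max = 10.0    # amps
p_load = v_rated * i_max  # P = V * I
print(p_load)   # 120.0 W -- more than one 100 W resistor can safely handle
```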
You could get a second 700 mΩ, 100 W resistor and put the two in series. That effectively makes a 1.4 Ω, 200 W resistor, which is within what the supply can drive and the combination can handle. By Ohm's law again (yes, this comes up a lot and is really useful), (12 V)/(1.4 Ω) = 8.6 A. That's the current through the combined resistor. It doesn't test the supply to its limit, but it's a good start.
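A quick sketch of the series-resistor numbers, assuming two identical 0.7 Ω, 100 W parts:

```python
# Two 0.7 ohm, 100 W resistors in series as a test load for a 12 V supply.
v_rated = 12.0
r_series = 0.7 + 0.7        # series resistances add: 1.4 ohms total
i = v_rated / r_series      # Ohm's law: about 8.6 A, under the 10 A limit
p_total = v_rated * i       # about 103 W total
p_each = p_total / 2        # splits evenly between identical resistors
print(f"{i:.1f} A, {p_each:.0f} W per resistor")  # each well under its 100 W rating
```

Since each resistor dissipates only about half of the total, both stay comfortably inside their 100 W ratings.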