# DC boost converter and Ohm’s Law

dc/dc converter

A circuit has a resistance of 1 Ohm, and a DC power supply rated at 9 W is delivering power to it. Using a DC boost converter that takes in an input of 9 W, can it output different voltages, even if the resistance is 1 Ohm?

If the resistance is 1 Ohm and the power is 9 W, then from Ohm's law the voltage is 3 V and the current is 3 A. But can I use a DC boost converter to increase the voltage beyond that?

EDIT: The resistor sits between the power supply and the load; if a converter or constant-current source were added, the resistor would sit after the converter/constant-current source and before the load.

The text below is only valid for an ideal resistor (i.e. one without parasitic effects), which should be good enough for the considerations below.

For the answer below I interpreted the question as asking about this schematic:

[Schematic created using CircuitLab]

Since you did not mention any voltages, I'll answer a bit more generically:

It is not possible to have a resistor that does not adhere to Ohm's law (R = U/I). A simple resistor obeys this equation at all times. Therefore, if you want to build a boost converter that outputs e.g. 5 V into a 1 Ohm load, it will have to deliver 25 W of power (P = U²/R). This is certainly possible.
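The arithmetic behind that claim can be sketched in a few lines (the 5 V / 1 Ohm values are the example from the text; an ideal resistor is assumed):

```python
# Power a resistive load draws at a given voltage, from P = U^2 / R,
# plus the current it pulls, from Ohm's law I = U / R.
U = 5.0   # desired output voltage in volts (example from the text)
R = 1.0   # load resistance in ohms

P = U**2 / R   # power the converter must deliver into the load
I = U / R      # current through the load

print(f"P = {P:.0f} W, I = {I:.0f} A")  # P = 25 W, I = 5 A
```

So a 5 V output into 1 Ohm forces 5 A through the resistor, and 25 W of output power, regardless of what the converter's input looks like.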

But: the boost converter would need to be supplied with enough power to do that. If your boost converter has an efficiency of e.g. 85 %, you would need to supply it with ~30 W. In this case, your 9 W power supply would not be sufficient.
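The input-power figure follows directly from the efficiency (a sketch, using the 25 W output and the assumed 85 % efficiency from the text):

```python
# Input power a converter must draw: P_in = P_out / efficiency.
P_out = 25.0   # W, from the 5 V into 1 Ohm example
eta = 0.85     # assumed converter efficiency

P_in = P_out / eta
print(f"P_in = {P_in:.1f} W")  # P_in = 29.4 W, well above the 9 W supply rating
```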

A DC/DC converter cannot produce energy/power out of nothing. It is a tool to change the voltage of a supply to another voltage with acceptable losses, nothing more.

If you only have a 9 W power supply, you can never drive a 1 Ohm resistor at a voltage higher than 3 V (U = sqrt(P·R), the same equation as above, rearranged), and even that only at 100 % efficiency; you simply do not have the power.
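That 3 V ceiling can be checked by rearranging the power equation (a sketch with the 9 W / 1 Ohm values from the question, assuming lossless conversion):

```python
import math

# Maximum voltage a P-watt supply can sustain across a resistor R,
# assuming 100 % conversion efficiency: U_max = sqrt(P * R).
P = 9.0   # W, the supply rating
R = 1.0   # ohm, the load

U_max = math.sqrt(P * R)
print(f"U_max = {U_max:.1f} V")  # U_max = 3.0 V
```

Any converter efficiency below 100 % only lowers this ceiling further.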

What will happen if you try depends on your power supply.

Some of the possibilities:

• The power supply might switch itself off, since you exceeded its specification
• The power supply might go into current limitation (and basically become a constant current source)
• The power supply might oscillate
• The power supply might get too hot and destroy itself (hopefully not, but many cheap and/or badly designed ones do)

Regardless of that, you cannot have a resistor dissipate e.g. 20 W of power on a 9 W supply.

Edit: Further explanation regarding "constant current mode"

Regarding your second comment and my point about the constant-current source: this was just meant as an explanation of what a real-world power supply might do if you try to draw more power than it can deliver. Behaving like a constant current source is one thing that can happen in that case:

Some power sources (e.g. most lab supplies) are built so that they have a set voltage and a set current; whichever of the two is the limiting factor at the moment is the one that applies. Say we set the power supply to 1 V and 1 A and connect a variable resistor. When you turn the variable resistor to its maximum resistance, e.g. 10 kOhm, the power supply will be in constant voltage mode (voltage at 1 V, current at I = U/R = 100 µA). If you turn the resistance down to e.g. 0.5 Ohm, the power supply will go into constant current mode (current at 1 A, voltage at U = I·R = 500 mV). But the power delivered will never exceed 1 W (P = U·I).
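That CV/CC behavior can be modeled in a few lines. This is a sketch of an ideal lab supply; the function name `supply_output` and the 1 V / 1 A settings are just the example values from the paragraph above. The crossover point is R = U_set / I_set: above it the voltage limit binds (CV), below it the current limit binds (CC).

```python
def supply_output(R_load, U_set=1.0, I_set=1.0):
    """Return (mode, U, I) for an ideal CV/CC lab supply driving R_load ohms."""
    if R_load >= U_set / I_set:
        # Constant voltage mode: voltage is pinned at the setting,
        # current follows Ohm's law and stays below the current limit.
        return ("CV", U_set, U_set / R_load)
    # Constant current mode: current is pinned at the setting,
    # voltage follows Ohm's law and stays below the voltage setting.
    return ("CC", I_set * R_load, I_set)

print(supply_output(10_000))  # ('CV', 1.0, 0.0001)  -> 1 V at 100 uA
print(supply_output(0.5))     # ('CC', 0.5, 1.0)     -> 500 mV at 1 A
```

In both branches U·I ≤ U_set·I_set, so the delivered power never exceeds the 1 W limit described above.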