Say I have a 9V supply, and I want to power an LED that requires 3V and 0.02A. From what I understand, I need to add a resistor that will drop 6V and reduce the current to 0.02A, and I can calculate the resistance needed using Ohm's law. So, 6/0.02 = 300Ω – simple enough.
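For reference, here's the arithmetic I'm doing, written out as a few lines of Python (the variable names are just mine):

```python
supply_v = 9.0    # supply voltage
led_v = 3.0       # LED forward voltage
led_i = 0.02      # desired LED current, in amps

drop_v = supply_v - led_v    # voltage the resistor must drop: 6 V
resistance = drop_v / led_i  # Ohm's law, R = V / I
print(resistance)            # 300.0 (ohms)
```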
But how can this be the correct resistor when there are plenty of other equivalent fractions that would produce the same result? 3/0.01 also gives 300, so surely you could just as well conclude that this resistor will only drop 3V and reduce the current to 0.01A? What am I missing here?