Let's say we have a perfect 5 V DC, 0.1 A power supply. Obviously for one LED we would need a resistor (I'm well aware of Ohm's law), BUT what if we connect a bunch of LEDs in series, and the current drain is well above what the power supply can deliver, so the voltage across those LEDs drops to 2.5-3 V? Would you still need to add a resistor if the LEDs' forward voltage is about 3.2 V? It might be a silly question, but I thought it would be cool to avoid the resistor in such a case (because the LEDs would be brighter that way), and it's better to ask and be sure than to be sorry, right!? Thank you in advance!
Is a resistor on LEDs necessary after the voltage drop?
current-limiting led resistors voltage
Related Solutions
You don't want to control LEDs with voltage. It may appear to you that you get better control with voltage because a small voltage change causes a large apparent change in brightness. With current control, the LED brightness will be roughly linearly proportional to current. However, humans perceive brightness logarithmically, which is why it may look to you like current control isn't working as expected.
The voltage to get a particular current and therefore brightness will vary between LEDs and also has a significant temperature dependency. You really want to control LEDs with current, not voltage.
You also seem to be asking about a linear control from a higher voltage such that the extra voltage times the LED current gets burned up as heat. PWM is used because it's more efficient. You design the circuit for reasonably constant current at the maximum the LED can handle, and have just enough voltage to guarantee this current. That means the system is pretty efficient at that max current. PWM then switches between that efficient on state and the off state, so the result is still efficient even when the current is half the maximum, for example.
However, to answer your question, you can control the LED from a voltage that is low-pass filtered from a PWM output. You have to put some kind of buffer or amplifier between the filtered PWM output and the LED. If I had to do it this way, I'd use a pass transistor driven by an opamp. The trick is to put a small low-side current sense resistor between the LED and ground, and have the opamp make the voltage across that resistor proportional to the filtered PWM signal. That will give you true current control.
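The relationship the opamp loop enforces can be sketched numerically. This is a hedged illustration, not a tested design: the sense-resistor value and control voltages below are assumptions, not from the answer.

```python
# The opamp servoes the pass transistor so that the voltage across the
# low-side sense resistor equals the filtered PWM control voltage.
# The loop therefore forces I_LED = V_ctrl / R_sense.

R_SENSE = 1.0  # ohms, assumed sense-resistor value

def led_current(v_ctrl, r_sense=R_SENSE):
    """LED current forced by the opamp feedback loop, in amps."""
    return v_ctrl / r_sense

# A 20 mV control voltage (e.g. a 0-3.3 V PWM filtered to a small
# average) sets 20 mA of true, temperature-independent LED current.
print(led_current(0.02))  # 0.02 A = 20 mA
```

Note the design choice: because the opamp regulates the *sense-resistor* voltage rather than the LED voltage, the LED's Vf spread and temperature drift drop out of the equation.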
Your estimate is off by several orders of magnitude. Wikipedia gives the resistivity of air as being around \$10^{16}\ \Omega \cdot m\$. I'd guess an actual resistance between two points would be at least on the order of teraohms. Assuming \$1\ T\Omega\$, that gives a current of 5 picoamps, which is far too small to measure easily. As pointed out in an answer to another EE.SE question, the material the battery is made of is probably a better conductor than air.
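The back-of-envelope arithmetic behind that estimate, with the 1 TΩ figure explicitly flagged as a guess:

```python
# Ohm's-law estimate of leakage through air between battery terminals.
V = 5.0    # volts across the terminals (example value)
R = 1e12   # ohms: the assumed ~1 teraohm air-gap resistance
I = V / R
print(I)   # 5e-12 A, i.e. 5 picoamps
```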
To actually figure out what's going on in extreme situations, you need a more detailed model of the materials involved. How many electrons and/or ions are available for conduction? An ideal dielectric (insulator) has no free electrons, but a real dielectric might. What's the strength of the electric field? If you have a 40 kilovolt voltage source, you can rip apart air molecules, creating lots of free electrons! A less extreme example would be a vacuum tube, which "conducts" through empty space \$(R = \infty)\$ using electrons liberated from a piece of metal.
Ohm's law is an approximation that works for many materials at low voltages, frequencies, and temperatures. But it is far from a complete description of electrodynamics and physical chemistry, and should not be treated as such.
To answer your question more directly, regardless of whether a tiny current flows through the air, there can definitely be a voltage between the terminals. Voltage is another way of describing the electric field. Wherever there is an electric field, there is a voltage difference, even in a vacuum with no matter at all! HyperPhysics shows what this looks like.
Specifically, the gradient of the voltage field gives you the magnitude and direction of the electric field:
$$\vec E = -\nabla V$$
I don't know whether a tiny current actually flows through the air, but hopefully now you have a better appreciation for the physics of the situation. :-)
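The gradient relation above can be checked numerically in one dimension. This is a minimal sketch with assumed values: a linear potential V(x) = -E0·x should give back the uniform field E0.

```python
# Finite-difference check of E = -dV/dx for a uniform field.
E0 = 100.0   # V/m, assumed field strength
dx = 1e-3    # metres, finite-difference step

def V(x):
    """Linear potential corresponding to a uniform field E0."""
    return -E0 * x

# Central difference for -dV/dx at x = 1.0 m
E_est = -(V(1.0 + dx) - V(1.0 - dx)) / (2 * dx)
print(E_est)  # recovers ~100.0 V/m
```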
Best Answer
Short:
GIVEN:
A Power Supply (psu)
Providing Vcv (V constant voltage)
with maximum current = Icc,
and providing Icc at reduced voltage when Rload < Vcv/Icc
Define: Vpsu = voltage at psu terminals
(1) Placing N LEDs in parallel across the psu with no resistors will not damage any LED if the current in the highest-current LED is <= I_LED_max
(2) Adding resistors can increase light out:
For a load of many LEDs in parallel with no resistors, if Vpsu = Vload < Vcv (so Iload = Icc), then adding a series resistor per LED, chosen so that Vload is still < Vcv, will increase the total light output.
How much depends on several factors - see below.
Assume that means it provides Vout = 5.000 V for Rload >= 50 Ohm, i.e. Iout <= 100 mA. Also assume it provides 100 mA at whatever reduced voltage results when Rload < 50 Ohm.
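That assumed supply behaviour can be written down as a small model. A hedged sketch only: the two-region CV/CC behaviour is exactly as assumed above, applied to a purely resistive load.

```python
# Model of the assumed supply: constant voltage (5 V) for
# Rload >= 50 ohm, constant current (100 mA) below that.
VCV = 5.0    # volts, constant-voltage setpoint
ICC = 0.100  # amps, current limit

def supply_point(r_load):
    """Return (V, I) at the terminals for a resistive load in ohms."""
    if r_load >= VCV / ICC:           # >= 50 ohm: CV region
        return VCV, VCV / r_load
    return ICC * r_load, ICC          # < 50 ohm: CC region, V falls

print(supply_point(100.0))  # (5.0, 0.05) - CV mode, 50 mA
print(supply_point(25.0))   # (2.5, 0.1)  - CC mode, voltage drops
```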
No. That depends on the LED. If the LED's Vf is < 5 V at 100 mA, it will draw 100 mA at under 5 V. If the LED is specified to operate OK at 100 mA, it will be OK.
The lowest common Vf for visible LEDs is about 2 V, for red. (It varies widely in special cases.)
One LED = 2 V
Two in series = 4 V
Three or more in series = too many
So "bunch" <= 2 for proper series operation.
I assume this means Itotal >> 100 mA if all LEDs operated in "normal" Vf range.
Let's put the bunch of LEDs in parallel to make sense of this.
Say 20 LEDs, each specified for 20 mA at 3.2 V, in parallel.
No, from a ratings point of view.
Maybe, from other points of view.
If 20 LEDs rated at 20 mA at 3.2 V were connected to a 100 mA max power supply, then together they would draw 100 mA. The average current per LED = 100/20 = 5 mA.
If the LEDs are reasonably modern and from a reputable manufacturer then, while they will not all draw exactly 5 mA, it is exceedingly unlikely that any will draw over 20 mA when 20 are placed in parallel. So, no LED exceeds its rating.
As above, if the LEDs are in parallel and none exceeds its current rating then they will be safe, BUT light per LED will be uneven and light output will probably NOT be maximised.
This is because LED efficiency in terms of lumens/mA rises as mA falls. The result varies with the LED, but as a guide, 5 to 20% more light per mA may be obtainable at 10% of rated current than at 100% of rated current.
The brighter LEDs therefore make LESS light per mA than the lower-brightness LEDs, so the ones that "hog" the current produce less light per unit of current while taking a larger share of it. If the "spread" of currents is small and/or the efficiency change with current is small, then the light output difference will be small. As the current range rises and as the efficiency range rises, the loss of light due to current hogging rises.
A simplistic worked example shows the likely order of result.
Example: if you have 20 LEDs, the highest-current one draws 20 mA, and each successively dimmer one draws 80.245% of the prior one's current, then the total drain will be 100 mA (actually 99.99936 mA).
i.e. with currents rounded to 0.1 mA, the LEDs draw 20, 16, 12.8, 10.3, 8.2, 6.6, 5.3, 4.2, 3.4, 2.7, 2.2, 1.7, 1.4, 1.1, 0.9, 0.7, 0.5, 0.4, 0.3, 0.3 mA.
A spreadsheet calculation shows that light loss is only about 3%.
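The current series from that worked example is easy to reproduce. A hedged sketch: the 80.245% ratio is from the answer, but checking the ~3% light-loss figure would additionally need the efficiency-vs-current curve the spreadsheet used, which isn't given here.

```python
# Reproduce the worked example: LED n draws 80.245% of LED n-1's
# current, starting from 20 mA at the top.
r = 0.80245
currents_mA = [20.0 * r**n for n in range(20)]
total = sum(currents_mA)
print(total)  # ~99.999 mA, matching the text's 99.99936 mA
```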
i.e. the loss of light from current hogging is not liable to be vast.
Worse is that the brightness differences between individual LEDs can be substantial and very visible.
I have seen lights made with a single common series resistor and 6 x parallel LEDs.
At full power the differences were not visible. At lower powers the differences in brightness were immense.
In the case of a CC supply, ADDING per-LED resistors such that Vpsu is still < Vcv will INCREASE the total LED brightness slightly. This is because input power increases as V rises, LED current is still I_psu_max, V_LED rises, and so the LEDs operate at a more efficient point.
SO
If operating N LEDs in parallel from a current-limited power supply where Isupply_max is much less than N x I_LED_max_rated (100 mA vs 400 mA in the example above), then no LED is liable to exceed its rated maximum current and no damage will be done.