I am turning here for some advice on an LED panel circuit I am trying to build, as I cannot find any concrete answers anywhere else online. My apologies in advance: my understanding of electronics is very limited, and I lack some of the basics around voltage, current, and how components consume them. This is why I am reaching out to the experts.
The complete circuit that I want to build will have 100 LEDs. The specs of the LEDs are as follows from the datasheet:
Forward Voltage (VF) – Typ: 3.30V – Max: 3.80V
Forward Current (IF) – Typ: 20mA – Max: 25mA
The datasheet can be found here if it helps at all:
My source voltage is a 5V 7A power supply.
I'm not sure I have the right forward voltage value from the datasheet, though. I built a test circuit with 10 of these LEDs in parallel and they all lit up quite brightly. The test circuit got its power from the 5V pin on a Raspberry Pi, which supplies 5V at about 200mA. Measuring that pin with a multimeter, I read 5.22V before connecting the LEDs and 5.17V after connecting the 10 LEDs in parallel, a drop of about 0.05V for 10 LEDs. This is where I am a bit uneducated: the way I understand it, with 5 volts I should only be able to power two LEDs with a forward voltage of 2.5V each, so it baffles me why a circuit of 10 LEDs with a 3.30V forward voltage lights up so brightly on a 5V power supply.
What I would like to accomplish is to split the 100 LEDs into 5 rows of 20, where each row runs in series on its own single resistor, because the space I have on the circuit is very limited. I then take that one row of 20 series LEDs and duplicate it 5 times, which gives me the 100 LEDs. I guess you could say I have 5 isolated LED circuits all connected to the same power supply.
I have tried calculating the resistor value I need to run all 20 LEDs using online calculators, but they all give me a parallel diagram where each individual LED has its own resistor, which is not what I want. I want to run all 20 LEDs in series on one resistor, so I need to work out which resistor value that takes.
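For reference, here is the single-resistor series formula as I understand it, sketched in Python with the typical values from the datasheet above plugged in. When I run it for my row of 20, I get a negative resistance, which I assume means either my setup or my understanding is wrong:

```python
# Series-string resistor formula as I understand it:
#   R = (V_supply - n * V_forward) / I_forward
v_supply = 5.0     # my 5V supply
v_forward = 3.3    # typical Vf from the datasheet
i_forward = 0.020  # typical If, 20mA
n_leds = 20        # one row of 20 LEDs in series

required_voltage = n_leds * v_forward          # voltage the whole string needs
r = (v_supply - required_voltage) / i_forward  # series resistor in ohms

print(f"String needs {required_voltage:.1f}V")   # 66.0V
print(f"Calculated resistor: {r:.0f} ohms")      # comes out negative on 5V
```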
My questions basically sum up to this:
- How can I calculate the resistor needed to drive the 20 LEDs in series with all 20 LEDs using only one resistor?
- I have a 5V 7A power supply available in my project, and at 25mA per LED the entire circuit should draw 2500mA, or 2.5A, which should be fine. Will 5V be enough to drive the 100 LEDs at 3.3V to 3.8V forward voltage per LED, though? This is where I lack a basic understanding of how it works. I do have access to a 12V 3A power supply in the same project if that is needed.
- Will it be a problem if I build the 20-LED circuit, duplicate it 5 times, and connect them all to the same power supply?
- Is it necessary to use a resistor at all with all these LEDs running on the same supply?
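To show the current-budget arithmetic from my second question as I worked it out (a rough sanity check using the datasheet maximum; I am not sure whether LEDs in the same series string each draw this separately, so this may be an over-count):

```python
# Rough current-budget check for both supplies I have available,
# assuming worst-case 25mA per LED (datasheet maximum).
i_led_max = 0.025  # 25mA maximum If per LED
n_leds = 100

total_current = n_leds * i_led_max  # if every LED drew the maximum
print(f"Total worst-case current: {total_current:.1f}A")       # 2.5A
print(f"5V/7A supply headroom:  {7.0 - total_current:.1f}A")   # 4.5A
print(f"12V/3A supply headroom: {3.0 - total_current:.1f}A")   # 0.5A
```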
Thanks for the assistance.