Current to power LED

Tags: current, led, transistors

[Two schematic images of the flasher circuit, drawn in CircuitLab, were attached here.]

The circuit consists of two alternately flashing sets of LEDs (BXRA-C0402, maximum current 500 mA each).
Each of the transistors (TIP120) powers 15 LEDs, which switch on and off together. How do I find out whether the transistors can supply the current required by the load (15 LEDs each, i.e. 500 mA × 15)? If not, how much current is the circuit short of, and how can I overcome this?

Best Answer

The circuit consists of two alternately flashing sets of LEDs

Your circuit makes no sense to me. What is the principle of its operation?

  • Your two (sets of) LEDs are in series; they should be independent.
  • Why use an op-amp as an inverter?
  • Why don't you need resistors to limit base current to the Darlington pairs?
  • What is the voltage level of the square-wave control signal?
  • What are the values of the resistors?

Your BXRA-C0402 LEDs have a forward voltage of 9.6 V, and I don't see where you are supplying that. You seem to have a supply voltage of 1 V or 5 V, depending on which diagram I look at.

How do I find out whether the transistors can supply the current required by the load (15 LEDs each)

You

  • calculate the current required to drive your LEDs (this depends on how they are arranged in the circuit), then
  • look up the maximum collector current in the data sheet for the transistor concerned (a worked check follows this list).
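Concretely, with N LEDs switched at once by one transistor, each drawing I_LED, the transistor is adequate only if the total load current stays within its rated continuous collector current:

$$ I_\text{load} = N \times I_\text{LED} \;\le\; I_{C(\max)} $$

For the TIP120, the data sheet gives a continuous collector current of 5 A.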

If the LEDs are in series you only need 500 mA, but you need a much higher supply voltage for the LEDs (15 × 9.6 V = 144 V).
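As a rough sketch of what the series arrangement demands, assuming a Darlington saturation voltage of about 2 V and a few volts across the current-setting resistor (both assumptions, to be checked against the data sheet):

$$ V_\text{supply} \;\gtrsim\; 15 \times 9.6\,\text{V} + V_{CE(\text{sat})} + V_R \;\approx\; 144\,\text{V} + 2\,\text{V} + 3\,\text{V} \;\approx\; 149\,\text{V} $$

which is impractical for most low-voltage designs.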

If the LEDs are in parallel (each with its own current-limiting resistor), you only need about a 12 V supply, but the transistor has to handle 15 × 0.5 A = 7.5 A. A TIP120 has an absolute maximum collector current of 5 A; that is not enough.
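So, taking the TIP120 at its absolute maximum rating, the parallel arrangement is short by at least

$$ 15 \times 0.5\,\text{A} - 5\,\text{A} = 2.5\,\text{A} $$

and since you should not run a transistor continuously at its absolute maximum, the practical shortfall is larger.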

if not, how much current is the circuit short of?

See above: at least 2.5 A in the parallel case. In the series case there is no current shortfall, only a supply-voltage problem.

How can I overcome this?

Change the arrangement of the LEDs, or use higher-powered transistors (or FETs).

For example, you could use 15 TIP120s, one to drive each of your 15 LEDs, or you could look for a single, more powerful alternative. A sketch of the per-LED option follows.
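As a sketch of the one-TIP120-per-LED option, assuming a 12 V supply and a saturation voltage of roughly 1 V at 500 mA (an assumption; read the actual figure off the TIP120's VCE(sat) curve), each LED's ballast resistor works out to about

$$ R = \frac{12\,\text{V} - 9.6\,\text{V} - 1\,\text{V}}{0.5\,\text{A}} = 2.8\,\Omega, \qquad P_R = I^2 R \approx 0.7\,\text{W} $$

Round up to the next standard value (e.g. 3.3 Ω) so the 500 mA maximum is never exceeded, and use a resistor rated for at least 1 W.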