LED current driver

constant-current, led, led-driver

I am trying to regulate the current through an LED so that I can tightly control its intensity, since I will be using it as a source of illumination for a sensor.

I am using a CAT4002A-D LED driver, in the configuration shown in the schematic below.

LED driver circuit

I am powering it with 5 V from a lab bench power supply, so I used three 100 ohm resistors in series (300 ohms total) to drop the excess voltage.

Now, with Rset set to 3.74 kohm, the datasheet says I should be getting 20 mA, but I only get around 4 mA. This is strange, since current is the same everywhere in a series circuit, so the resistors shouldn't affect it!

But when I remove those resistors and use 3.3 V as VDD (to prevent the LED from blowing up), I get the expected current.

I don't understand why a series resistor should cause the current to drop.

Best Answer

Consider what you are doing from a simple perspective:

  1. Your power supply is 5 volts.
  2. Your series resistance is 300 ohms.

That means the most current you could possibly have through the LED is:

V = I × R
I = V / R
I = 5 V / 300 Ω
I ≈ 16.7 mA

This is already below your set point of 20 mA, and it is the absolute best case: it ignores the LED's forward voltage and the headroom the driver itself needs. Subtract those and the available current falls further still, which is likely why the constant-current driver is behaving unexpectedly and you only measure about 4 mA.
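To make the voltage budget concrete, here is a rough sketch in Python. The LED forward voltage (~3.2 V) and the minimum headroom the driver needs (~0.5 V) are assumed values for illustration only, not figures from the CAT4002A datasheet; substitute your own LED's forward voltage.

    def max_led_current(v_supply, r_series, v_led=0.0, v_driver=0.0):
        """Upper bound on LED current: whatever voltage is left over after
        the LED and the driver is all the series resistance has to work with."""
        v_available = v_supply - v_led - v_driver
        return max(v_available, 0.0) / r_series

    # Ideal upper bound, ignoring the LED and the driver entirely (the 16.7 mA above):
    print(f"{max_led_current(5.0, 300) * 1e3:.1f} mA")

    # More realistic: assume ~3.2 V across the LED and ~0.5 V of driver headroom
    # (illustrative values only -- check your LED and the CAT4002A datasheet):
    print(f"{max_led_current(5.0, 300, v_led=3.2, v_driver=0.5) * 1e3:.1f} mA")

With those assumed numbers the budget works out to roughly 4 mA, which is in the same neighborhood as what you measured.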

Also, the constant-current chip's datasheet states it operates at up to 5 volts, and there are no LED current-limiting resistors in the datasheet's example application circuits. It follows that you do not need them in your circuit when powering it with 5 volts.
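For comparison, a quick check of the no-resistor case at 5 V, again with assumed values: the driver gets the full supply minus the LED's forward voltage, which should be comfortably more than it needs to regulate the programmed 20 mA.

    # No-resistor case at 5 V (assumed values, not datasheet figures):
    V_SUPPLY = 5.0        # bench supply, V
    V_LED = 3.2           # assumed LED forward voltage, V
    V_HEADROOM_MIN = 0.5  # assumed minimum voltage the driver needs to regulate, V

    headroom = V_SUPPLY - V_LED   # roughly 1.8 V left across the driver
    print(f"Driver headroom: {headroom:.1f} V -> "
          f"{'enough' if headroom >= V_HEADROOM_MIN else 'not enough'} to hold the 20 mA set point")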