Voltage controlled voltage source

led, microcontroller, operational-amplifier, pwm, transistors

I am designing a high powered LED driver controlled by a microcontroller.

Now, it would be very easy to make a constant-current limiter so I don't burn out my LED. I have a bunch of LM317s, and putting a resistor between ADJ and OUT gives me a circuit that limits the current.
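
For reference, this is how I've been sizing that resistor: the LM317 holds roughly 1.25 V between OUT and ADJ, so the set resistor forces I ≈ 1.25 V / R through the load. A quick sketch (the 350 mA target is just an example):

```python
# Sizing the LM317 current-set resistor: the regulator holds ~1.25 V
# between OUT and ADJ, so a resistor R between those pins forces
# I = 1.25 / R through the load.  The 350 mA target is only an example.
V_REF = 1.25  # V, nominal OUT-to-ADJ reference voltage of the LM317

def lm317_set_resistor(i_target):
    """Return (resistance in ohms, resistor dissipation in watts) for a target current."""
    r = V_REF / i_target
    p = V_REF * i_target  # heat in the set resistor itself
    return r, p

r, p = lm317_set_resistor(0.35)  # e.g. a 350 mA LED
print(f"R = {r:.2f} ohm, dissipating {p:.2f} W")  # ~3.57 ohm, ~0.44 W
```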

I can then hook this up to a transistor in series, and use my microcontroller pin to switch it via PWM and get my LED controlled that way.

However, I wanted to go a little further and see if I can run my LEDs directly rather than via PWM (i.e. switching them full on and then full off). This is likely to introduce more complexity into my circuit, but I think I would learn something if I can get it to work.

Initially I was designing a variable current source (see here), but I have my LEDs on hand now, and after some testing I've discovered that they are quite bright even at very low current. It seems to me that linearly controlling voltage via the microcontroller would give me better control over brightness.

How can I provide my LEDs with a PWM-controllable voltage? For instance, I can get a precisely controlled steady voltage from PWM with a low-pass filter, but I obviously can't supply 1 A or more that way. If I make a voltage follower with an op-amp, would I need a specific op-amp rated for that current? (I have some op-amps on order, but they are "general purpose".)
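
For what it's worth, this is roughly how I've been estimating the residual ripple from that low-pass filter; the 20 kHz PWM frequency and the RC values are just my working assumptions:

```python
# Estimating the ripple left over after low-pass filtering a PWM pin.
# For an RC whose time constant is much longer than the PWM period,
# ripple ~ Vsupply * D * (1 - D) / (f_pwm * R * C).
# The 5 V, 20 kHz, 10 kohm and 1 uF figures are just working assumptions.
def pwm_rc_ripple(v_supply, duty, f_pwm, r, c):
    return v_supply * duty * (1.0 - duty) / (f_pwm * r * c)

worst = pwm_rc_ripple(v_supply=5.0, duty=0.5, f_pwm=20e3, r=10e3, c=1e-6)
print(f"worst-case ripple ~ {worst * 1000:.1f} mV")  # ~6.3 mV at 50% duty
```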

Could I put the low-pass filter on the side with the load current? (I had always been thinking about connecting the low-pass filter to the PWM signal rather than to the other side of the power transistor.)

Any advice related to this topic is welcome.

Best Answer

You don't want to control LEDs with voltage. It may appear that you get better control with voltage because a small voltage change causes a large apparent change in brightness. With current control, the LED brightness is roughly proportional to current. However, humans perceive brightness logarithmically, which is why it may look to you like current control isn't working as expected.
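
To get a feel for how strong that logarithmic effect is, here is a rough sketch that uses the CIE 1976 lightness curve as a stand-in for perceived brightness (the exact curve is an assumption; the shape is the point):

```python
# How non-linear perception is: using the CIE 1976 lightness curve as a
# stand-in for perceived brightness, map a desired lightness (0..100) back
# to the relative drive current that produces it.
def lightness_to_relative_current(l_star):
    """CIE 1976 inverse: lightness L* (0..100) -> relative luminance/current (0..1)."""
    if l_star <= 8.0:
        return l_star / 903.3
    return ((l_star + 16.0) / 116.0) ** 3

for l in (10, 25, 50, 75, 100):
    print(f"{l:3d}% perceived -> {lightness_to_relative_current(l):.3f} of full current")
# Half of the perceived brightness needs only about 18% of the full current.
```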

The voltage to get a particular current and therefore brightness will vary between LEDs and also has a significant temperature dependency. You really want to control LEDs with current, not voltage.
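
A quick sketch of why a fixed voltage is so touchy, using assumed but typical-looking numbers for the LED's dynamic resistance and forward-voltage tempco:

```python
# Why a fixed voltage is fragile: near its operating point the LED looks
# like a small dynamic resistance, so a small shift in forward voltage
# (part-to-part spread, or roughly -2 mV/C as the die warms) moves the
# current a lot.  Both numbers below are assumed, typical-ish values.
R_DYNAMIC = 0.8    # ohm, assumed slope of the LED's V-I curve near 1 A
TEMPCO = -2e-3     # V/C, assumed forward-voltage temperature coefficient

def extra_current(delta_t_c):
    """Extra LED current if the applied voltage stays fixed while the die warms."""
    dv = TEMPCO * delta_t_c       # forward voltage falls as the LED heats up
    return -dv / R_DYNAMIC        # that shift appears across the dynamic resistance

print(f"~{extra_current(40) * 1000:.0f} mA of extra current after a 40 C rise")  # ~100 mA
```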

You also seem to be asking about a linear control from a higher voltage such that the extra voltage times the LED current gets burned up as heat. PWM is used because it's more efficient. You design the circuit for reasonably constant current at the maximum the LED can handle, and have just enough voltage to guarantee this current. That means the system is pretty efficient at that max current. PWM then switches between that efficient on state and the off state, so the result is still efficient even when the current is half the maximum, for example.
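
As a rough illustration of that efficiency argument, with assumed supply and LED voltages:

```python
# Illustrative numbers only: heat in the control element, linear vs PWM.
# Assumed: a 5 V raw supply, an LED that drops about 3.2 V, 1 A maximum.
V_SUPPLY = 5.0   # V, assumed raw supply for the linear case
V_LED = 3.2      # V, assumed LED forward drop (treated as constant here)

# Linear (analog) dimming from the 5 V supply: the pass element drops the
# excess voltage continuously, at whatever current the LED is running.
for i in (0.25, 0.5, 1.0):
    p_pass = (V_SUPPLY - V_LED) * i
    print(f"linear at {i:.2f} A: {p_pass:.2f} W burned in the pass element")

# PWM from a supply with just enough headroom (say 3.6 V): the loss fraction
# is set by that headroom alone, independent of the duty cycle.
V_MATCHED = 3.6
print(f"PWM from a matched {V_MATCHED} V supply: ~{V_LED / V_MATCHED:.0%} efficient at any duty")
```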

However, to answer your question: you can control the LED from a voltage that is low-pass filtered from a PWM output. You have to put some kind of buffer or amplifier between the filtered PWM output and the LED. If I had to do it this way, I'd use a pass transistor driven by an op-amp. The trick is to put a small low-side current sense resistor between the LED and ground, and have the op-amp make the voltage across that resistor proportional to the filtered PWM signal. That will give you true current control.
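
As a rough sizing sketch for that sense resistor (all component values below are assumptions, not a specification):

```python
# Sizing the low-side sense resistor for that circuit: the op-amp drives the
# pass transistor until the sense-resistor voltage equals the filtered PWM
# control voltage, so I_LED = V_control / R_sense.  All values are assumed.
I_MAX = 1.0        # A, target full-scale LED current
V_CTRL_MAX = 0.5   # V, control voltage chosen for full scale (kept small so
                   # the sense resistor wastes little headroom and power)

r_sense = V_CTRL_MAX / I_MAX   # 0.5 ohm
p_sense = V_CTRL_MAX * I_MAX   # 0.5 W at full current -> use a 1 W or bigger part
print(f"R_sense = {r_sense:.2f} ohm, {p_sense:.2f} W at full current")

# With 8-bit PWM scaled into 0..0.5 V, each count is ~2 mV of control voltage,
# i.e. roughly 4 mA of LED current per step.
step = V_CTRL_MAX / 255 / r_sense
print(f"resolution ~ {step * 1000:.1f} mA per PWM count")
```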