Calibrate RGB LEDs with the correct resistors
Playing around with some microcontrollers and multiplexed LEDs, I noticed that I always need to correct the colors of each RGB LED in software… that's a pain…
A microcontroller's PWM output typically has 256 different levels (0–255).
To mix a color with RGB LEDs you do some math… I like HSL (HSV on microcontrollers) and convert it to RGB values.
So if I want yellow, in theory I need to turn on the red and the green at 100%…
Nope… it's not yellow…
- Here the lightness is not considered… the real RGB values should be below 255 PWM. All channels set to 255 should return white.
- The blue and the green LEDs are brighter than the red one.
In my code I have something like this:
red*1, green*0.2, blue*0.15
i.e. I use only 15% of the blue LED and 20% of the green one when mixing the colors.
The maximum PWM value actually used on the blue channel is 38 — 38 of 255 possible steps. A waste!
These are the resistors I need to properly power the LEDs:
Red: 2 V @ 20 mA = 150 Ω
Green: 3 V @ 20 mA = 100 Ω
Blue: 3.1 V @ 20 mA = 100 Ω (corrected error: 2.1 V vs 3.1 V)
How can I calibrate the LEDs hardware-side?
I know I need only 15% of the light coming from the blue LED… what resistor should I use?
Is there some sort of calculation that allows me to pick the correct resistor based on the wavelength or other characteristics in the datasheet?
Would a simple LDR help to calibrate those LEDs?
If I find the correct resistors for a nice hue based on a lightness of 50%, what would rgb(255,255,255) return? White or not?
What do you do to get a nice visual hue on RGB LEDs?
This would for sure also help solve some issues in another question I asked some time ago.