Electronics – Does color rendering index increase with brightness

ledlighting

I've been reading about LED strips and thinking about switching to them from my series-wired incandescent lamp setup, and I can't figure out whether CRI (color rendering index) is tied to some standard brightness value (it has to be measured at standard conditions, right?) or whether it stays the same no matter how dim or bright the light is.

So, for example, if you take one strip section rated at 2000 lumens with a CRI of 80, will the color rendering of objects in a room improve if you use several of these strips?

I haven't had an opportunity to experiment and compare the strips in a natural environment, so excuse me if the answer to this is very obvious.

Best Answer

CRI doesn't depend on brightness directly.

However, LEDs optimized for high CRI usually have lower lumen/W efficiency, and vice-versa. Very high lumen/W LEDs tend to be of the cool white variety, which doesn't have good color rendition. But high efficiency is great for flashlights.

I used CRI 95 LEDs for my kitchen lights and I'm very pleased with the result; it's much easier on the eyes than the weird tints produced by low-CRI LEDs, although those would be OK for a corridor or a parking lot, I guess.


EDIT (to complement Misunderstood's answer):

Color Rendering Index is about... color rendition. Color is the result of what the eye/sensor perceives. So it is the wavelength-by-wavelength product of three spectra:

  • Spectrum of incident light
  • Reflection spectrum of object
  • Spectral sensitivity of eye/sensor.

If your light has a discontinuous/spiky spectrum (say, fluorescent or LED) and your object also has a ragged reflection spectrum, the resulting color will depend a lot on whether the peaks and valleys in the two spectra coincide. Results can be good, or completely off, depending on luck.
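Here's a sketch of that three-way product with toy numbers (all curves below are invented Gaussians, not real CIE data): multiply illuminant, reflectance, and sensitivity per wavelength, integrate, then white-balance against a neutral patch. An object whose reflection peak falls between a spiky light's emission peaks comes out visibly different from how it looks under a smooth light:

```python
import numpy as np

# Toy model of "color = incident spectrum x reflectance x sensor sensitivity".
wl = np.linspace(400, 700, 301)          # wavelength grid in nm
dwl = wl[1] - wl[0]

def gauss(center, width):
    return np.exp(-((wl - center) / width) ** 2)

# Hypothetical R/G/B sensor sensitivity bands (not real cone curves)
sens = {"R": gauss(600, 40), "G": gauss(550, 40), "B": gauss(450, 30)}

smooth = np.ones_like(wl)                                  # flat full-spectrum light
spiky = gauss(450, 10) + gauss(545, 10) + gauss(610, 10)   # three narrow emission peaks

# Object whose reflection peak (575 nm) falls between the spiky light's peaks
obj = 0.2 + 0.8 * gauss(575, 15)
white = np.ones_like(wl)                                   # neutral reference patch

def balanced(light, refl):
    # Sensor response = integral of light * reflectance * sensitivity,
    # then white-balanced against the neutral patch (what the eye/camera does).
    resp = {k: np.sum(light * refl * s) * dwl for k, s in sens.items()}
    ref = {k: np.sum(light * white * s) * dwl for k, s in sens.items()}
    return {k: resp[k] / ref[k] for k in sens}

print("under smooth light:", balanced(smooth, obj))
print("under spiky light :", balanced(spiky, obj))
# Same object, both lights white-balanced, yet the rendered color differs:
# the spiky light barely samples the object's 575 nm reflection peak.
```

The numbers are arbitrary, but the mechanism is the one described above: whatever the spiky light doesn't emit, the object can't reflect.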

A white-looking "5600K daylight CCT" HMI lamp with a spectrum like this:

[image: spectrum of the HMI lamp]

...will have TERRIBLE color rendition. If the object's reflection peaks fall in between the light's emission peaks, the color will be completely off.

Eyes and cameras are good at calibrating out color temperature shifts. People still look like people under skylight (which is outrageously blue, above 7000K depending on time of day and other factors), direct sunlight (slightly yellow), a mix of both, halogen (yellow), an overcast day, etc.

HOWEVER, this calibration only works for full/smooth-spectrum lights. With spiky-spectrum lights, the missing wavelengths cannot be resurrected by color balance calibration. It will always look weird.

CCT is about the perceived color of the light itself. It is completely different and unrelated to CRI, which is about the color rendition of objects under said lights. For the same CCT, color rendition can be excellent or dismal depending on the actual light spectrum.
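To make that concrete with toy numbers (invented Gaussian curves, a made-up three-channel sensor): you can solve for emitter amplitudes so that a three-peak light excites the sensor exactly like a smooth light does off a neutral white patch, i.e. the two lights themselves look like the same white, yet they still render the same object differently:

```python
import numpy as np

# Two lights engineered to look like the *same* white to the sensor
# (a stand-in for "same CCT"), yet they render an object differently.
wl = np.linspace(400, 700, 301)
dwl = wl[1] - wl[0]

def gauss(center, width):
    return np.exp(-((wl - center) / width) ** 2)

# Hypothetical R, G, B sensitivity rows (not real colorimetric data)
sens = np.stack([gauss(600, 40), gauss(550, 40), gauss(450, 30)])

smooth = np.ones_like(wl)                                           # full spectrum
peaks = np.stack([gauss(610, 10), gauss(545, 10), gauss(450, 10)])  # narrow emitters

# Pick peak amplitudes so the spiky light excites the sensor exactly like
# the smooth light does off a neutral patch: solve M @ amps = target.
M = sens @ peaks.T * dwl             # M[k, i] = channel k response to peak i
target = sens.sum(axis=1) * dwl     # channel responses to the smooth light
amps = np.linalg.solve(M, target)
spiky = amps @ peaks

obj = 0.2 + 0.8 * gauss(575, 15)    # reflection peak between the emitters

white_smooth = sens @ smooth * dwl
white_spiky = sens @ spiky * dwl    # equal by construction: same "light color"
obj_smooth = sens @ (smooth * obj) * dwl
obj_spiky = sens @ (spiky * obj) * dwl

print("light itself :", white_smooth, "vs", white_spiky)   # identical
print("object under :", obj_smooth, "vs", obj_spiky)       # not identical
```

Same perceived color of the light, wildly different rendition of the object, which is exactly why CCT tells you nothing about CRI.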

Here's a terrible CCFL light, officially 3200K:

[image: spectrum of the CCFL lamp]

Green and red stuff will look bad. A nice test for these kinds of lights (if you're light-skinned) is to look at your wrist. Under a full-spectrum light, it looks OK. Under these lights, the blue peaks emphasize the big blue veins, and the skin takes on a sickly color due to the lack of red.

Now, LEDs... If they use RGB or several mostly monochromatic LEDs, the spectrum will be a collection of peaks, and CRI will suck. But you can adjust the color temperature, which means you can have any coloration you want, as long as it's fugly! There's a sushi bar around here which uses RGB lights. I guess the owner must be colorblind, because they do truly evil things to the salmon sushi; it looks like some kind of glow-in-the-dark pink stuff straight out of Ghostbusters.

More bad lights:

[image: spectra of more bad lights]

Note that CRI is a flawed measurement which uses only a few color samples, and LEDs tend to score better than they actually look, because the standard CRI samples miss the usual spectrum dip in the greens, between the blue emitter peak and the phosphor. Also, high-CCT LEDs have a taller blue peak, and thus worse color rendition (but more lumens) than 2700K LEDs.
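Here's a toy version of that sampling problem (invented spectra; a handful of broad "pastel" chips standing in for the official CRI samples): broad, desaturated samples barely notice a dip near 500 nm that clobbers a saturated green object.

```python
import numpy as np

# Why a few broad pastel test samples can miss an LED's green/cyan dip.
# All spectra are made-up Gaussians, not real measurement data.
wl = np.linspace(400, 700, 301)

def gauss(center, width):
    return np.exp(-((wl - center) / width) ** 2)

# Hypothetical R/G/B sensor bands
sens = {"R": gauss(600, 40), "G": gauss(550, 40), "B": gauss(450, 30)}

reference = np.ones_like(wl)                   # smooth full-spectrum reference
led = gauss(450, 12) + 0.9 * gauss(600, 55)    # blue emitter + phosphor hump,
                                               # with a deep dip near 500-520 nm

def rendered(light, refl):
    # white-balanced sensor response to a reflectance under a light
    return {k: np.sum(light * refl * s) / np.sum(light * s)
            for k, s in sens.items()}

def error(refl):
    # worst-channel rendering error of the LED vs the smooth reference
    a, b = rendered(led, refl), rendered(reference, refl)
    return max(abs(a[k] - b[k]) for k in sens)

# Broad pastel samples: stand-ins for CRI-style test chips
pastels = [0.5 + 0.3 * gauss(c, 60) for c in (460, 530, 600, 650)]
# Saturated green object sitting right in the LED's dip
green = 0.1 + 0.9 * gauss(505, 20)

print("pastel errors:", [round(error(p), 3) for p in pastels])
print("green error  :", round(error(green), 3))
```

The pastel chips average out to a decent "score" while the saturated green shows a much larger error, which is the same failure mode as scoring LEDs with only the standard CRI samples.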

Until a more accurate standardized measurement is widely used, the best way to avoid bad colors from LEDs is to look at the spectrum and buy the ones which minimize the blue peak and the green/cyan dip.

I would buy this one just by looking at the spectrum:

[image: spectrum of a high-CRI LED]

Now, on to the specialists:

http://www.cinematography.com/index.php?showtopic=52551&p=354505