What’s the difference between a thermal imaging sensor and an infrared CCD?

camera, ccd, thermal

I was always under the impression that thermal radiation is essentially infrared energy, so what's the difference between a full thermal imaging camera (e.g. an FLIR) and a standard infrared CCD chip? Surely the latter should function as the former? Yet the price difference is astronomical.

Best Answer

CCDs are made from silicon, which has a bandgap of 1.12 eV. That means they can only sense radiation at about 1 µm wavelength or shorter, the near infrared (NIR). Thermal imagers, by contrast, sense thermally emitted radiation at roughly 10–14 µm wavelength (the radiation emitted by a warm body at ~300 K). The photons in a thermal IR scene have about ten times the wavelength and therefore one tenth the energy of a NIR photon (10 µm vs. 1 µm), which puts them far below silicon's bandgap.

A direct-bandgap detector for that band, such as MCT (mercury cadmium telluride), must be cryogenically cooled to about 77 K or the detector gets swamped by its own heat. There are bolometer-based sensors that avoid cooling but are less sensitive; a FLIR, however, is specifically an MCT detector. MCT is a very expensive detecting material with low manufacturing yields.
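A minimal sketch (plain Python, standard physical constants only) showing where the numbers in this answer come from: silicon's 1.12 eV bandgap puts its cutoff near 1.1 µm, Wien's law puts a 300 K body's emission peak near 10 µm, and the photon-energy ratio between those two wavelengths is about 10×.

```python
# Constants
h = 6.626e-34      # Planck constant, J*s
c = 2.998e8        # speed of light, m/s
eV = 1.602e-19     # joules per electron-volt
b_wien = 2.898e-3  # Wien displacement constant, m*K

def cutoff_wavelength_um(bandgap_eV: float) -> float:
    """Longest wavelength a detector with this bandgap can absorb."""
    return h * c / (bandgap_eV * eV) * 1e6

def photon_energy_eV(wavelength_um: float) -> float:
    """Energy of a single photon at the given wavelength."""
    return h * c / (wavelength_um * 1e-6) / eV

si_cutoff = cutoff_wavelength_um(1.12)   # ~1.11 um: silicon CCDs stop in the NIR
peak_300K = b_wien / 300 * 1e6           # ~9.7 um: emission peak of a ~300 K body
ratio = photon_energy_eV(1.0) / photon_energy_eV(10.0)  # ~10x energy difference

print(f"Si cutoff wavelength:   {si_cutoff:.2f} um")
print(f"300 K emission peak:    {peak_300K:.2f} um")
print(f"NIR/LWIR photon energy: {ratio:.1f}x")
```

Running it gives roughly 1.11 µm, 9.66 µm, and 10×, which is why a silicon CCD physically cannot image a room-temperature thermal scene: a 10 µm photon carries only ~0.12 eV, nowhere near enough to cross silicon's 1.12 eV bandgap.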