Connecting a 10-bit camera to a CPU with an 8-bit camera interface

camera

I have a 10-bit camera (MT9V032) that I need to attach to a processor that has only an 8-bit camera data interface (no LVDS). What is the best way to connect the two?

My current plan is to leave the data 0/1 pins floating, or pull them to ground through 10 kΩ resistors. The image will lose these two bits, but I suspect the quality will still be pretty good. I want to ask the experts: am I missing something?
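For reference, wiring only the upper eight data lines is numerically equivalent to a right shift by two, i.e. truncating the two least-significant bits of each 10-bit sample. A minimal sketch of that mapping (the function name is mine, for illustration):

```python
def truncate_10_to_8(sample_10bit: int) -> int:
    """Drop the two LSBs of a 10-bit sample.

    This is what wiring D2..D9 of the sensor to an 8-bit camera
    bus does in hardware: the result is sample_10bit >> 2.
    """
    return (sample_10bit >> 2) & 0xFF
```

Full-scale 10-bit (1023) maps to full-scale 8-bit (255), and codes 0-3 all collapse to 0, which is why only the least-significant detail is lost.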

Best Answer

With the parallel data interface, dropping bits really is the only choice you have, and the bits to drop are the low ones. If you drop the higher bits instead, you'll get severe scene degradation.

If you have additional resources (say, in an FPGA or the like), you could apply a compressive tone curve to the data and reduce the bit depth from 10 bits to 8. Since shot noise roughly follows a scaled square-root curve, you should be able to apply a fractional power and not notice the difference in the final result.
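A compressive curve like this is usually implemented as a small lookup table, which also maps directly onto an FPGA block RAM. Here is a sketch assuming a square-root-style curve; the exponent of 0.5 is my example value, chosen to match the shot-noise scaling mentioned above, not something mandated by the sensor:

```python
import numpy as np

GAMMA = 0.5  # fractional power; ~sqrt tracks the shot-noise curve (assumed value)

# 1024-entry lookup table mapping each 10-bit code to an 8-bit code.
lut = np.round(255.0 * (np.arange(1024) / 1023.0) ** GAMMA).astype(np.uint8)

def compress_10_to_8(frame_10bit: np.ndarray) -> np.ndarray:
    """Apply the compressive tone curve per pixel via the LUT."""
    return lut[frame_10bit]
```

Compared with plain truncation, this spends more of the 8-bit output range on the dark codes, where the eye (and the shot-noise floor) can still resolve differences, at the cost of coarser steps in the highlights.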