Why do displays have limited bit-depth?

computer-architecture, digital-communications, display, gpu

As far as I am aware,

  1. HDMI 2.1 does support 12-bit 4K at 60 fps. It also doesn't use TMDS but FRL, and carries up to 48 Gbps (a rough bandwidth check is sketched after this list).

  2. GPUs do their calculations in fp32, and from some references I believe they can output 16 bpc (48-bit-per-pixel deep color), assuming a high-end Quadro GPU. (fp16 had the capability to send 12 bpc / 36-bit deep color, right?)
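
As a rough sanity check of the bandwidth claim in point 1 (the numbers below are approximate and ignore blanking intervals and FRL coding overhead), here is a small sketch of the raw pixel data rate for 4K 60 Hz at 12 bpc:

```python
# Rough sanity check: raw pixel data rate for 4K 60 Hz at 12 bits per channel (RGB),
# compared against HDMI 2.1 FRL's 48 Gbps aggregate link rate.
# Figures are approximate; a real link adds blanking, audio and FRL coding overhead.

width, height, fps = 3840, 2160, 60
bpc = 12                     # bits per colour channel
bits_per_pixel = 3 * bpc     # RGB, no chroma subsampling -> 36 bits per pixel

active_rate = width * height * fps * bits_per_pixel
print(f"Active pixel data rate: {active_rate / 1e9:.1f} Gbps")  # ~17.9 Gbps, well under 48 Gbps

# Bits-per-pixel for the channel depths mentioned above:
for channel_bits in (10, 12, 16):
    print(f"{channel_bits} bpc -> {3 * channel_bits}-bit deep color")
```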

But there are monitors that don't support more than 10 bpc (30-bit deep color).

Why is that? Why do displays have limited bit-depth?

Can you give me some idea about the science behind this? Why does it cost more to produce higher bit-depth monitors?

Isn't it just that the RGB pixels on a display emit photons? Then how do different displays differ in terms of emitting photons?

Best Answer

> Can you give me some idea about the science behind this? Why does it cost more to produce higher bit-depth monitors? Isn't it just that the RGB pixels on a display emit photons? Then how do different displays differ in terms of emitting photons?

One of the things no one else has mentioned so far is the effect of gamma. You mentioned that you use 32-bit linear floating point, but the values that HDMI carries are gamma corrected: they are spread non-uniformly in amplitude, so the difference between adjacent brightness levels grows as you approach maximum brightness. As a result, if you want to show 10 real bits, you need a panel that can vary the number of photons it emits over a range far greater than 1023:1.
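
To make the gamma point concrete, here is a minimal sketch, assuming a plain power-law gamma of 2.2 rather than the exact sRGB or PQ transfer functions, that computes the linear-light range a panel would need to resolve every step of a 10-bit signal:

```python
# How much light-output range does a panel need to show every step of a
# 10-bit gamma-encoded signal? Assumes a simple power-law gamma of 2.2
# (the exact sRGB / PQ transfer functions differ, but the conclusion holds).

gamma = 2.2
top_code = 1023                          # highest code of a 10-bit signal

darkest = (1 / top_code) ** gamma        # linear light of the darkest non-zero code
brightest = 1.0                          # linear light of the top code

print(f"Required light-output ratio: {brightest / darkest:,.0f} : 1")
# ~4,200,000 : 1 -- far more than the naive 1023 : 1, because gamma packs
# the code values very tightly near black in linear-light terms.
```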

That is a really tough thing to do: it means each pixel must be able to get very bright and also very dim, in a precisely controlled way. If you look into HDR video specifications, they either require very expensive panel technologies or rely on tricks like locally dimmable backlights. Even then, at a per-pixel level, many of those technologies still struggle to show even 9 or 10 real bits.
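
As a rough illustration of why per-pixel contrast matters, the sketch below uses the same simplified gamma-2.2 model and a hypothetical panel with a 1000:1 native contrast ratio to count how many of the darkest 10-bit codes fall below the panel's black floor:

```python
# How many 10-bit code values collapse into a panel's black floor?
# Assumes gamma 2.2 and a hypothetical panel with a 1000:1 native contrast
# ratio, i.e. its dimmest achievable level is 1/1000 of its peak output.

gamma = 2.2
top_code = 1023
panel_contrast = 1000                  # hypothetical native contrast ratio

black_floor = 1 / panel_contrast       # minimum light output, relative to peak

# Codes whose ideal linear-light value falls below the black floor all render
# as (roughly) the same black, so the shadow detail they encode is lost.
crushed = sum(1 for c in range(top_code + 1) if (c / top_code) ** gamma < black_floor)
print(f"{crushed} of {top_code + 1} codes fall below the panel's black floor")
# ~45 codes: the bottom of the range is indistinguishable, so the panel cannot
# reproduce all 10 "real" bits even if the link delivers them.
```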