HDMI and I²C

Tags: digital-communications, hdmi, i2c

I was having a look at the HDMI pinout and I thought: why would they use I²C for the display-host communication? My question here is about the design metrics that led to this choice.

HDMI is a fairly recent standard, while I²C has been around since 1982. I²C is meant for on-board, chip-to-chip communication, and the standard allows multiple devices to be attached to the same bus. An HDMI cable can be some 15 m long, so the I²C signal should probably use higher-than-normal voltages to avoid too much noise, adding the need for transceivers on both sides. As for the multi-device aspect, I can't really think of how you would attach more than one monitor to a single HDMI port unless you are being very, very non-standard.

I'm really not an expert in communication protocols, but I think that RS-485, CAN, or some other point-to-point, full-duplex, higher-SNR protocol would have been a better fit.

So why would they choose I²C?

Note: I know this might be flagged as "opinion-based"; I am hoping that somebody here can think of, or knows about, some objective reasons.

Best Answer

The DDC history in HDMI goes via DVI all the way back to VGA. It is implemented in such a way that you can simply hook up a standard I²C EEPROM chip on the monitor side to hold the EDID, and such chips (AT24C01 and compatibles) are dirt cheap.
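To make that concrete, here is a minimal sketch of what the host side of that transaction looks like: reading the 128-byte EDID block straight out of the monitor's EEPROM over plain I²C. It assumes a Linux host that exposes the DDC wires through the i2c-dev interface; the bus number /dev/i2c-1 is an assumption for illustration, while the 7-bit address 0x50 is the standard DDC/EDID address.

```c
/* Sketch: read a monitor's EDID block over DDC, which is just I2C.
 * Assumes a Linux host with the i2c-dev interface; /dev/i2c-1 is an
 * assumed bus number, 0x50 is the standard DDC/EDID EEPROM address. */
#include <stdio.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

int main(void)
{
    const char *bus = "/dev/i2c-1";   /* pick the adapter wired to the DDC pins */
    int fd = open(bus, O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    /* The DDC/EDID EEPROM answers at 7-bit address 0x50 */
    if (ioctl(fd, I2C_SLAVE, 0x50) < 0) { perror("ioctl"); return 1; }

    /* Set the EEPROM's internal read pointer to offset 0,
     * then read the first 128-byte EDID block. */
    uint8_t offset = 0x00;
    if (write(fd, &offset, 1) != 1) { perror("write"); return 1; }

    uint8_t edid[128];
    if (read(fd, edid, sizeof edid) != (ssize_t)sizeof edid) { perror("read"); return 1; }

    /* A valid EDID block starts with the fixed header 00 FF FF FF FF FF FF 00 */
    printf("EDID header: ");
    for (int i = 0; i < 8; i++) printf("%02X ", edid[i]);
    printf("\n");

    close(fd);
    return 0;
}
```

The point of the sketch is that the whole exchange is an ordinary EEPROM read: set a pointer, clock out bytes. That is something a dumb, cheap slave chip can serve with no firmware at all, which is a big part of why a plain I²C bus was good enough.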

I²C signal should probably use higher-than-normal voltages to avoid too much noise

Nope. The +5 V on the DDC lines tells you a different story. What they might do is use a lower clock frequency on the bus. HDMI cables are usually well shielded, too.

So why would they choose I²C?

It was already there in DVI (with which HDMI is compatible), it works, and it is cheap.