Electronic – How does a VGA monitor detect video resolution?

vga

How does a VGA monitor detect the video resolution? I'm asking because with different VSYNC and HSYNC intervals, different dot clocks are possible.

Best Answer

VGA monitors have a small serial EEPROM embedded in their circuit boards. The chip holds a data structure known as the EDID (Extended Display Identification Data) and connects to two pins on the HD15 connector. These two pins form an I2C bus (the Display Data Channel, or DDC) that lets the driver software query the monitor and find out which video resolutions the monitor supports.
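As a rough illustration of what the host reads over that I2C bus, here is a minimal sketch of decoding the start of an EDID block. The byte layout (fixed 8-byte header, packed 3-letter manufacturer ID in bytes 8-9, checksum over 128 bytes) follows the published EDID 1.x structure; the function names are my own, and actually fetching the bytes from the monitor (e.g. via a kernel I2C interface) is assumed to have happened already.

```python
# Sketch: decoding the first fields of a 128-byte EDID block that a host
# would read over DDC/I2C (the monitor's EEPROM answers at address 0x50).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def is_valid_edid(edid: bytes) -> bool:
    """A base EDID block is 128 bytes, starts with the fixed header,
    and all bytes sum to 0 mod 256 (the last byte is the checksum)."""
    return (len(edid) == 128
            and edid[:8] == EDID_HEADER
            and sum(edid) % 256 == 0)

def decode_manufacturer(edid: bytes) -> str:
    """Bytes 8-9 pack three 5-bit letters ('A' = 1), big-endian."""
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                   for shift in (10, 5, 0))
```

For example, an EDID whose bytes 8-9 are `0x10 0xAC` decodes to the manufacturer ID "DEL".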

The intended scheme is that the driver software on the host computer allows selection only of video resolutions that the monitor supports. Once the monitor receives a video signal, it can infer the operating resolution by measuring the HSync frequency and counting the number of HSync pulses per VSync period (the analog VGA cable carries no dot clock, so the monitor works from the sync signals). Once the operating resolution is inferred, the monitor switches itself to its best-known method for displaying that video mode.
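The monitor-side inference described above can be sketched as a simple lookup: measure the HSync rate, count lines per frame, and match against a table of modes the monitor knows. The table entries below use nominal VESA timings, but the table, tolerance, and function names are illustrative assumptions, not any particular controller's firmware.

```python
# Sketch: infer the video mode from measured sync timing.
# Keys are (HSync frequency in kHz, active lines per frame).
KNOWN_MODES = {
    (31.5, 480): "640x480 @ 60 Hz",
    (37.9, 600): "800x600 @ 60 Hz",
    (48.4, 768): "1024x768 @ 60 Hz",
}

def detect_mode(hsync_khz: float, lines_per_frame: int,
                tol_khz: float = 0.5):
    """Return the best-matching known mode, or None if nothing is close.

    Real oscillators drift, so the HSync frequency is matched within a
    tolerance; the line count is exact because it is a digital count.
    """
    for (h, lines), name in KNOWN_MODES.items():
        if lines == lines_per_frame and abs(h - hsync_khz) <= tol_khz:
            return name
    return None
```

A measured HSync of 31.47 kHz with 480 lines per frame would match the 640x480 entry, while an unknown timing combination falls through to None (a real monitor would typically display an "out of range" message in that case).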

Older monitors, back in the days of "MultiSync" CRT monitors, may have supported only a couple of video resolutions. In some cases the video-mode detection may have been as simple as an R/C filter that detected changes in the HSYNC frequency.

Newer monitors, including the plethora of LCD screens in use now, all have digital controllers with built-in circuits to detect the video resolution, almost certainly by counting the pixels per line and/or lines per frame.