How does a VGA monitor detect the video resolution? I'm asking because with different VSYNC and HSYNC intervals it's possible to have different dot clocks.
Related Solutions
EDID is used, among other things, to query the monitor for the timings it would like. It isn't used to transmit any video information.
I guess it's been so long that no one remembers, but VGA monitors back in the day had adjustments for the image location on the CRT. You could move it left or right, up or down, or scale it horizontally or vertically. More advanced monitors had additional adjustments. These allowed you to compensate for whatever timing was in use, whatever local magnetic field might be distorting the picture, etc.
Of course, what these adjustments are doing behind the scenes is adjusting the timing parameters. They were necessary because the VGA signal doesn't explicitly say how much blanking time there is. As you've noticed, there are some general conventions, and the monitor can (through EDID) advertise support for particular timings, but there's no requirement that those timings are what will be sent to the monitor.
What you can do, and what LCD monitors that still have VGA inputs do when you press the "auto adjust" button, is guess at the blanking time simply by looking at the R, G, and B signals. These should be black during the blanking time, and probably aren't black otherwise. Of course, if the picture content near the edges happens to be black, this approach fails.
Similarly, I don't think there's anything in the VGA signal that tells you what the resolution is. You can guess at it by looking for the edges between pixels and timing them, which is again what LCD monitors do. But remember, VGA is an analog signal, designed to be displayed on an analog device. It has no concept of "pixels".
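As an illustration of the auto-adjust idea, here is a minimal sketch: given one digitized line of the video signal, it finds the first and last non-black samples to estimate where the active region sits. The buffer size, black threshold, and the fake test pattern are all assumptions chosen for illustration, not values from any real monitor.

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical parameters: 800 samples per line (including blanking)
 * and an 8-bit "black" threshold - both assumptions for illustration. */
#define SAMPLES_PER_LINE 800
#define BLACK_THRESHOLD  8

/* Given one line of samples, estimate where the active (non-blanked)
 * region starts and ends, as an auto-adjust routine might.
 * Returns 0 if the whole line looks black. */
static int find_active_region(const uint8_t line[SAMPLES_PER_LINE],
                              int *first, int *last)
{
    int lo = -1, hi = -1;
    for (int i = 0; i < SAMPLES_PER_LINE; i++) {
        if (line[i] > BLACK_THRESHOLD) {
            if (lo < 0) lo = i;  /* first non-black sample */
            hi = i;              /* keep updating the last one */
        }
    }
    if (lo < 0)
        return 0;  /* all black: cannot tell blanking from picture */
    *first = lo;
    *last  = hi;
    return 1;
}

int main(void)
{
    uint8_t line[SAMPLES_PER_LINE] = {0};
    for (int i = 144; i < 784; i++)  /* fake 640-sample active area */
        line[i] = 128;

    int first, last;
    if (find_active_region(line, &first, &last))
        printf("active region: samples %d..%d\n", first, last);
    else
        printf("line is all black\n");
    return 0;
}
```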
It is easy enough to calculate all you need from just the basic provided information.
For instance, the site I use most as a reference is this one: http://tinyvga.com/vga-timing/640x480@60Hz and it has all you need for 640x480 @ 60Hz (the site covers most common resolutions, but that one is the simplest to work with).
It specifies everything in pixels and lines, and it provides a pixel clock frequency as well as refresh frequencies. All you need, though, is the pixel clock and the pixel count for each region.
For instance, it gives a pixel clock of 25.175 MHz. That is not easy for most microcontrollers to generate, since it is both a high frequency and a very precise one - in general you can have one or the other. However, 25 MHz is usually easy enough to generate, and is "close enough" for most monitors to cope with.
So we have a 25 MHz pixel clock. We also have a "whole line" size of 800 pixels. That size includes the porches, sync and visible area. So a line of 800 pixels, at a 25 MHz clock, would be running at (25,000,000 / 800) = 31,250 Hz, or one line every 32 µs.
The horizontal sync pulse - 96 pixels - would be (96 / 25,000,000) = 3.84 µs long.
We know that a line takes 32 µs, and there are 525 lines in a "whole frame", so 0.000032 × 525 = 0.0168 s per frame, or 59.524 Hz. That's pretty close to the specification's 60 Hz.
So given a pixel clock, and a set of pixel periods, you can calculate anything. Of course, you can also go backwards. Given a frame rate and a resolution you can work out:
$$ 60\,\mathrm{Hz} \times 525\ \mathrm{lines} = 31\,500\,\mathrm{Hz} $$
$$ 31\,500\,\mathrm{Hz} \times 800\ \mathrm{px} = 25.2\,\mathrm{MHz} $$
So that shows that even the given specifications aren't 100% exact, but there is a bit of flexibility in the VGA timings, so you can bend your clock to suit you within certain bounds.
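To make the arithmetic concrete, here is a small sketch that runs both directions of the calculation above, using the 640x480@60 totals quoted from the tinyvga table and the "close enough" 25 MHz clock:

```c
#include <stdio.h>

int main(void)
{
    /* 640x480@60 totals from the tinyvga table quoted above. */
    const double pixel_clock     = 25000000.0; /* the easy 25 MHz, not 25.175 MHz */
    const double pixels_per_line = 800.0;      /* visible + porches + sync */
    const double lines_per_frame = 525.0;
    const double hsync_pixels    = 96.0;

    /* Forward: from pixel clock to line and frame rates. */
    double line_rate  = pixel_clock / pixels_per_line;    /* Hz  */
    double line_time  = 1e6 / line_rate;                  /* us  */
    double hsync_time = 1e6 * hsync_pixels / pixel_clock; /* us  */
    double frame_rate = line_rate / lines_per_frame;      /* Hz  */

    printf("line rate : %.0f Hz (%.2f us per line)\n", line_rate, line_time);
    printf("hsync     : %.2f us\n", hsync_time);
    printf("frame rate: %.3f Hz\n", frame_rate);

    /* Backward: from a target 60 Hz frame rate to the required clock. */
    double required_clock = 60.0 * lines_per_frame * pixels_per_line;
    printf("clock for exactly 60 Hz: %.1f MHz\n", required_clock / 1e6);
    return 0;
}
```

Running it prints the 31,250 Hz line rate, 3.84 µs sync pulse, 59.524 Hz frame rate and 25.2 MHz "exact" clock worked out above.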
And while we're at it, generating VGA purely in software takes a lot of processing and often leaves you starved of CPU cycles to do anything else. One of the most common "tricks" for generating a VGA signal on a CPU is to use SPI to produce the pixel data stream. Even better if your microcontroller has DMA, so an entire line of data can be output without the CPU having to do anything. The CPU is then just responsible for generating the sync pulses and loading the DMA system with the right addresses - the rest happens in the background. Of course, that leaves you with just a 1-bit monochrome display. If you happen to have an SQI interface, and enough RAM, you could make a 4-bit (16-colour) display easily enough.
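A rough sketch of that trick, in pseudo-C for a hypothetical microcontroller: every `hal_*` function, the pin numbers, and the second timer compare mentioned in the comments are placeholders, since the real SPI, DMA and timer setup is entirely part-specific.

```c
#include <stdint.h>

/* Hypothetical HAL - each hal_* call stands in for whatever your
 * part's SPI, DMA and GPIO drivers actually provide. */
extern void hal_dma_start(const void *src, unsigned len); /* feeds the SPI TX */
extern void hal_gpio_write(int pin, int level);

#define PIN_HSYNC     1
#define PIN_VSYNC     2
#define VISIBLE_LINES 480
#define TOTAL_LINES   525
#define LINE_BYTES    (640 / 8)   /* 1 bit per pixel, shifted out over SPI */

static uint8_t framebuffer[VISIBLE_LINES][LINE_BYTES];
static volatile unsigned line;    /* current scanline, 0..524 */

/* Fired once per scanline by a timer set to the ~31.25 kHz line rate.
 * The CPU only toggles the syncs and points the DMA at the next line;
 * the pixel stream itself is shifted out by SPI+DMA in the background. */
void line_timer_irq(void)
{
    hal_gpio_write(PIN_HSYNC, 0);   /* negative-polarity sync pulse; a
                                       second timer compare would raise
                                       it again 3.84 us later */

    /* Back-porch delay before pixels is glossed over here. */
    if (line < VISIBLE_LINES)
        hal_dma_start(framebuffer[line], LINE_BYTES);

    /* VSYNC is asserted during two lines of the vertical blanking. */
    hal_gpio_write(PIN_VSYNC, (line == 490 || line == 491) ? 0 : 1);

    line = (line + 1) % TOTAL_LINES;
}
```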
Best Answer
VGA monitors have a small serial EEPROM embedded in their circuit boards. The chip holds the EDID (Extended Display Identification Data) and connects to two pins on the HD15 connector. These two pins operate as an I2C bus that permits the driver software to query the monitor and find out what range of video resolutions the monitor supports.
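As an example of that query, on a Linux host with the i2c-dev driver the EDID EEPROM can usually be read as a plain I2C device at the conventional address 0x50. A minimal sketch (the bus number in /dev/i2c-1 is an assumption; it varies per machine):

```c
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

int main(void)
{
    /* The bus number is machine-specific - an assumption here. */
    int fd = open("/dev/i2c-1", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    /* EDID EEPROMs conventionally answer at I2C address 0x50. */
    if (ioctl(fd, I2C_SLAVE, 0x50) < 0) { perror("ioctl"); return 1; }

    unsigned char offset = 0, edid[128];
    if (write(fd, &offset, 1) != 1 ||      /* set the read offset to 0 */
        read(fd, edid, sizeof edid) != sizeof edid) {
        perror("read");
        return 1;
    }

    /* Dump the 128-byte base EDID block as hex. */
    for (int i = 0; i < 128; i++)
        printf("%02x%c", edid[i], (i % 16 == 15) ? '\n' : ' ');

    close(fd);
    return 0;
}
```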
The intended scheme is that the driver software on the host computer side will only allow selection of a video resolution that the monitor supports. Once the monitor gets a video signal, it can infer the operating resolution by counting dot clocks per HSYNC and the number of HSYNC pulses per VSYNC. Once the operating resolution is inferred, the monitor switches to the best method it knows for displaying that video mode.
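A sketch of that inference, assuming the controller can measure the HSYNC rate and count HSYNC pulses per VSYNC; the two-entry mode table is purely illustrative, where a real controller would carry many more entries:

```c
#include <stdio.h>

struct vga_mode {
    const char *name;
    double hsync_khz;     /* nominal horizontal rate        */
    int lines_per_frame;  /* total lines, including blanking */
};

/* Two illustrative entries from standard VGA/SVGA timings. */
static const struct vga_mode modes[] = {
    { "640x480@60", 31.469, 525 },
    { "800x600@60", 37.879, 628 },
};

/* Match the measured timings against the table, with some tolerance,
 * much as a monitor's controller infers the incoming mode. */
static const struct vga_mode *detect(double hsync_khz, int lines)
{
    for (unsigned i = 0; i < sizeof modes / sizeof modes[0]; i++) {
        if (lines == modes[i].lines_per_frame &&
            hsync_khz > modes[i].hsync_khz * 0.98 &&
            hsync_khz < modes[i].hsync_khz * 1.02)
            return &modes[i];
    }
    return NULL;
}

int main(void)
{
    /* Pretend we measured a 31.25 kHz line rate and 525 lines per
     * frame (the "close enough" 25 MHz clock discussed above). */
    const struct vga_mode *m = detect(31.25, 525);
    printf("%s\n", m ? m->name : "unknown mode");
    return 0;
}
```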
Older monitors, back in the days of "MultiSync"-style CRTs, may have supported only a couple of video resolutions. In some cases the video mode detection may even have been a simple R/C filter that detected changes in the HSYNC frequency.
Newer monitors, including the plethora of LCD screens in use now, all have digital controllers with built-in circuits to detect the video resolution - almost certainly by counting the pixels per line and/or lines per frame.