Electronic – Horizontal overscan on NTSC

Tags: crt, ntsc, tv

How much horizontal overscan is there on a typical NTSC CRT television set?

For vertical overscan: there are nominally 240 scan lines per field, of which you can reasonably count on slightly more than 200 being visible. 200/240 = 0.83, so you can use about 83% of the nominal screen height.

For horizontal overscan: there are nominally 228 color clocks per scan line. If the same ratio applied, 228*0.83 = 189, so you could count on having about 190 color clocks per scan line.
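
For reference, here's that back-of-envelope arithmetic as a quick sketch (using only the nominal figures above):

```python
# Back-of-envelope version of the estimate above.
NOMINAL_LINES = 240     # nominal scan lines per NTSC field
VISIBLE_LINES = 200     # lines you can reasonably count on seeing
NOMINAL_CLOCKS = 228    # nominal colour clocks per scan line

ratio = VISIBLE_LINES / NOMINAL_LINES                          # ~0.83
print(f"usable height: {ratio:.0%}")
print(f"estimated safe clocks: {NOMINAL_CLOCKS * ratio:.0f}")  # ~190
```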

However, https://en.wikipedia.org/wiki/Atari_8-bit_family#Playfield_graphics_capabilities says that one family of computers highly regarded for its graphics capabilities in the NTSC age considered 192 color clocks per scan line to be into overscan, and 160 to be the normal safe value.

Which figure is correct?

Best Answer

This is a pretty easy one to answer. The "active window" of an NTSC video line is defined as 53.33 microseconds (at least, that's the most commonly stated figure I've seen; other sources state 52.5, 53, or 53.5). To get your maximum displayable horizontal pixels (e.g. if you have the computer connected to a monitor that allows adjustment of the horizontal width), you just multiply that by the machine's pixel clock in MHz.

For example, the Amiga runs at 7.159MHz in low-res (the mode commonly used for games). This means that, theoretically, you could display 382 pixels on a maximally overscanned line. In reality, the machine's own hardware limitations cap out somewhere around 372 to 376 pixels depending on model: the actual output hardware is turned on and off within the stated NTSC limits, and by that point you've consumed all the "spare" DMA slots for hardware sprites, so there's no memory bandwidth left after accounting for audio, memory refresh, and disk access. But you were never really meant to go that high anyway, because no regular screen was expected to be able to show that much. The amount it can scan still adds up to 52.0 to 52.5 microseconds, which nudges the limits of the standard.
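
A minimal sketch of that calculation (the 53.33us window is the figure stated above; the exact 315/88 MHz NTSC colourburst value is assumed):

```python
# Maximum displayable pixels = active window (us) x pixel clock (MHz).
NTSC_COLORBURST_MHZ = 315 / 88        # 3.579545... MHz, the exact NTSC value
ACTIVE_WINDOW_US = 53.33              # most commonly stated active window

lowres_clock_mhz = 2 * NTSC_COLORBURST_MHZ       # ~7.159 MHz, Amiga low-res
print(f"max pixels: {ACTIVE_WINDOW_US * lowres_clock_mhz:.0f}")   # ~382

# The real hardware caps out around 372..376 px; see how much line that uses:
for pixels in (372, 376):
    print(f"{pixels} px -> {pixels / lowres_clock_mhz:.1f} us")   # 52.0..52.5
```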

The amount you can actually see within underscan, on any CRT I've plugged my A600 into, is something short of 336 pixels at the widest part of the tube (measured using a Workbench Control Panel gadget that let you adjust how much overscan the desktop, and any apps running under it, could use, and how they were centred). The vanilla standard 320 pixels fit fairly neatly into the curved corners of the tube at the top and bottom of the scan. These equate to about 46.9 and 44.7 microseconds. So your "safe" scan area is about 45us, and the minimum to guarantee complete overscan with a properly centred image is about 47us.

The amount of "available" overscan in that case could be defined as anywhere from about 5us to 8.33us, or 36 to 60 pixels... which isn't guaranteed to be centred, fwiw.
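
Translating those figures back and forth between pixels and microseconds, as a quick sketch (the 7.15909 MHz clock is the 2x-colourburst low-res value from above):

```python
PIXEL_CLOCK_MHZ = 7.15909    # Amiga low-res, 2x NTSC colourburst

def pixels_to_us(pixels):
    """Line time consumed by a given number of pixels."""
    return pixels / PIXEL_CLOCK_MHZ

def us_to_pixels(us):
    """Pixels that fit into a given slice of line time."""
    return us * PIXEL_CLOCK_MHZ

print(f"336 px -> {pixels_to_us(336):.1f} us")     # ~46.9 us, widest visible
print(f"320 px -> {pixels_to_us(320):.1f} us")     # ~44.7 us, vanilla width
print(f"5.00 us -> {us_to_pixels(5.00):.0f} px")   # ~36 px of run-off
print(f"8.33 us -> {us_to_pixels(8.33):.0f} px")   # ~60 px of run-off
```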

This example is applicable to quite a lot of other classic machines, including the mentioned Ataris, the Apple II, etc., as they all share with the Amiga the idea of using the NTSC colourburst clock as the source for all their system timings, including the pixel engine. High-res on the Atari 8-bits is essentially the same as on the Amiga. The narrow 128/256-pixel mode guarantees fitting within the underscan of even the most poorly adjusted TV (and also increases overall system performance somewhat, by reducing memory contention and the amount of video data to be modified in high-action games). The middle, most commonly used 160/320-pixel mode gives good resolution and should fit neatly within the visible limits of most properly adjusted TVs. And the widest 192/384-pixel mode completely fills the available "active window", giving you the maximum amount of horizontal data on an adjustable monitor (and, on TVs, absolutely guaranteed overscan). Technically, I don't think it even displays the full 384, for much the same reasons as the Amiga, and also to avoid disrupting the sync part of the composite signal, but it comes close.
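
A sketch of how those three widths sit against the active window (assuming, per the above, that a mode's colour-clock count maps directly to line time at the colourburst rate):

```python
NTSC_COLORBURST_MHZ = 315 / 88   # 3.579545... MHz

# (colour clocks, hi-res pixels at 2x clock, note); both variants of a mode
# span the same slice of the scan line.
MODES = [
    (128, 256, "narrow: fits even poorly adjusted TVs"),
    (160, 320, "normal: fits most properly adjusted TVs"),
    (192, 384, "wide: fills (slightly overruns) the 53.33us window"),
]

for clocks, pixels, note in MODES:
    width_us = clocks / NTSC_COLORBURST_MHZ
    print(f"{clocks:3d} clocks / {pixels:3d} px = {width_us:5.2f} us  ({note})")
```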

(The other reason for using the NTSC clock is to make colour generation simpler and to keep the visible pixels aligned and in sync with the actual composite colour cycles. That's vital to the colour production of the Apple II, at least in its higher-resolution 280- and 560-pixel (39.1us) wide modes, which are revealed as actually being patterned black-and-white when viewed on a monochrome monitor, the same as with CGA hi-res on the IBM PC. And the Amiga's high-res mode may show colour artefacts on a composite screen, but they at least remain static apart from a 30Hz, half-clock/2-pixel-aligned flip-flopping from side to side, so they aren't as intrusive as you might first expect.)

On LCDs, the visible width is somewhat wider, but I believe the overscan remains the same. My own LCD TV can display somewhere around 352~360 pixels from the A600 in low-res, so a good 49.2 to 50.3 microseconds (probably exactly 50?). That means the useful underscan is increased by about an eighth, but the available overscan run-off (e.g. for large sprites) is considerably reduced, to maybe as little as 12 pixels, and those could be divided unevenly between left and right.

(Mind you, that was running the machine in PAL mode; the active window there is definitely just 52us (or in some sources, 52.5), and the line rate is slightly lower. The pixel clock is also slightly reduced, by about 0.9%, meaning it can technically only display 368~372 pixels within the "legal" limits. So you may want to increase the above observation-derived (rather than mathematically derived) pixel counts by about that much to get the likely NTSC-clock figure... but that only means about 3 extra pixels across the entire width, and the counts were approximate to start with. FWIW, switching between PAL and NTSC modes when connected to a CRT didn't change the width or horizontal positioning at all, just the vertical height and how much it flickered.)
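
A sketch of that PAL clock reduction (assuming the 8/5ths-of-colourburst derivation described further down, and the exact 4.43361875 MHz PAL subcarrier):

```python
PAL_COLORBURST_MHZ = 4.43361875       # exact PAL subcarrier
NTSC_LOWRES_MHZ = 2 * (315 / 88)      # ~7.15909 MHz, 2x NTSC colourburst

pal_lowres_mhz = PAL_COLORBURST_MHZ * 8 / 5     # ~7.09379 MHz
reduction = 1 - pal_lowres_mhz / NTSC_LOWRES_MHZ
print(f"PAL pixel clock: {pal_lowres_mhz:.5f} MHz, {reduction:.1%} slower")

# Pixels fitting the 52..52.5 us PAL active window at that clock:
for window_us in (52.0, 52.5):
    print(f"{window_us} us -> {window_us * pal_lowres_mhz:.0f} px")  # ~369..372
```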

Another, shakier proof of this is the maximum amount of overscan that can be produced by an Atari ST. The clock in that case is a hair over 8MHz. The cleverest demo routines that trick the hardware into showing more than the default 320 pixels (the machine lacks any built-in overscan abilities) keep the output turned on as long as possible into the right-hand border, but are limited in how early they can glitch it into turning on in the left border; they output about 410 useful pixels that can be revealed on an adjustable monitor (or a really well-programmed emulator). That makes for an absolute minimum of 51.2us of useful line length, and that's not counting the background-colour border that already appears a few pixels to the left of the "useful" active area.
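
The same conversion for the ST case, as a sketch (a flat 8 MHz is assumed here, where the real clock is a hair over):

```python
ST_CLOCK_MHZ = 8.0      # assumed flat 8 MHz; the real clock is slightly higher

useful_pixels = 410     # best demo-trick output on an adjustable monitor
print(f"{useful_pixels} px -> {useful_pixels / ST_CLOCK_MHZ:.2f} us")  # 51.25 us
```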

Of course, what we come down to at the end of it all is that your initial calculation, based on a rather incorrect yet popular statement of vertical overscan, is actually quite close to the truth. 53.33 microseconds, producing 382 pixels at 2x the NTSC clock, means... 191 colour clocks. And 190 clocks would be 53.1us, close enough to "53us" that the slightly narrower width would still sort-of count. The "text safe" area of the screen is about 160 clocks, and you will almost certainly hit overscan if you scan 168 well-centred clocks. (The VIC-20 ran a direct colour clock for its pixel output, and its default 176-pixel / 22-character-wide output tended to lose the sides of the leftmost and rightmost characters.)
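
In colour-clock terms, as a sketch (again assuming the exact 315/88 MHz colourburst):

```python
NTSC_COLORBURST_MHZ = 315 / 88    # 3.579545... MHz, exact

print(f"53.33 us window -> {53.33 * NTSC_COLORBURST_MHZ:.0f} clocks")  # ~191
print(f"190 clocks -> {190 / NTSC_COLORBURST_MHZ:.1f} us")             # ~53.1
print(f"160 clocks -> {160 / NTSC_COLORBURST_MHZ:.1f} us")  # 'text safe', ~44.7
```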

FWIW, the vertical figure is open to interpretation. 200 lines is what's considered "safe" in a lot of applications, though some manufacturers went for 184, 192, 208 or 212 instead, depending on how confident they were. Most of those were generally readable, at least in the middle of the top and bottom lines, though the edges might have gone missing in one or more corners. It's generally considered that you need a minimum of 216 lines to guarantee hitting overscan at both ends of an NTSC screen (much like adding 16 pixels to the width), and 224 is more consistently reliable for that. This is demonstrated by several games consoles that attempted to, and generally succeeded in, filling the entire screen to or just past the edges with a 256x224 or 320x224 screen mode (with an appropriately reduced pixel clock, below 7MHz), whose very outer limits can often be shown to contain garbage or "hidden" sprites in emulators, or on monitors with the image size crammed down.

But in any case, the calculation was formed on false data, so it's really only a coincidence that it came out "correct" for the horizontal scan. The full size of an NTSC field is 262.5 lines (or, for the kind of hardware we're talking about, 262 or 263 lines, as interlace, with its half-lines, isn't used), of which 240~243 lines per field (480~486 in interlace) can be filled with active image data, forming the overscanned picture. That active picture takes up about 92% of the entire field, and the visible area about 79%. Obviously, this leaves 8% of the field completely unusable for either visible or hidden data, and the same has to be accounted for in the horizontal, which is maybe where the initial confusion between "available" and "used" colour clocks came from.
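
Worked through as a sketch (the 207 visible lines is my own assumption for "slightly more than 200"):

```python
TOTAL_LINES = 262.5       # full NTSC field, including blanking
MAX_ACTIVE = 243          # top of the fillable 240~243 range
VISIBLE_LINES = 207       # assumed: "slightly more than 200" actually shown

print(f"active:  {MAX_ACTIVE / TOTAL_LINES:.1%}")       # ~92.6%, call it 92%
print(f"visible: {VISIBLE_LINES / TOTAL_LINES:.1%}")    # ~78.9%, call it 79%
print(f"dead:    {1 - MAX_ACTIVE / TOTAL_LINES:.1%}")   # ~7.4%, the ~8% band
```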

Because the line rate is so much higher than the field rate, proportionally more of each line has to be given over to "useless" blanking/sync, where the TV hardware is returning the beam to the left-hand side of the image. (Nothing physically flies back, but there's still a limit on how fast it can happen, as it involves completely reversing the polarity of a fairly high-voltage circuit holding a lot of charge - which is why the "flyback transformer" is usually by far the heftiest thing in a CRT outside of the tube itself.) So out of a total 63.555us per line, 10.222us - or 16% - is unavailable. That means about 84% of the total line time is available for transmitting "useful" image data, and only about 72% is even visible on a normal TV. The colour clock keeps running through all of this regardless; it's useful as something to base the line sync on, as it reduces circuit complexity and keeps the clock cycle positions along each successive line dovetailed in chequerboard fashion, reducing the obviousness of their regularity and making everything look slightly higher fidelity. Multiply those percentages by the total number of colour clocks per complete line - 227.5, NOT 228, unless you're using a very rudimentary machine that doesn't produce an entirely correct HSync rate - and that gives us... 191, and 164. Quite neat.
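
And the horizontal budget, worked through the same way (a sketch using the figures just stated):

```python
LINE_US = 63.555          # total NTSC line period
BLANKING_US = 10.222      # sync/blanking time, unavailable
CLOCKS_PER_LINE = 227.5   # colour clocks per complete line (NOT 228)
VISIBLE_FRACTION = 0.72   # roughly what a normal TV actually shows

usable_us = LINE_US - BLANKING_US          # ~53.33 us
usable_fraction = usable_us / LINE_US      # ~84%
print(f"usable: {usable_us:.3f} us = {usable_fraction:.0%} of the line")
print(f"usable clocks:  {CLOCKS_PER_LINE * usable_fraction:.0f}")   # ~191
print(f"visible clocks: {CLOCKS_PER_LINE * VISIBLE_FRACTION:.0f}")  # ~164
```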

(FWIW, 79%/92% = 86% of the usable vertical lines (NOT "total" lines) should be visible, and 72%/84% = 86% of the usable horizontal width (NOT "total" width). It's probably no accident that these two coincide; it's likely meant to be a nominal 85%, or about a 7.5% overscan border on each edge. That allows for quite a lot of variation in how a screen is adjusted, for things that shouldn't be visible momentarily encroaching on the edge of what the cameraman and editor can see in the studio, and for cheaply made screens whose image shrinks and grows with the average brightness of the picture, all without letting too many of the cracks show, without things from the even more conservative "safe" area (about 75% of usable height/width) disappearing off the edge, and without sync being lost. Things are rather different in the digital age: often a flatscreen will show all 720x480 pixels of a broadcast image, or where it doesn't, the overscan is quite thin, maybe on the order of 5%, because the decoded picture can be very precisely mapped to the actual display panel, and the image can be made somewhat larger on the studio side and easily digitally cropped and resized (something much harder to achieve for analogue video) before being encoded for broadcast. So there's less need for overscan to hide studio equipment or to leave expansion room that stops text from running off the edge. In fact, for Hi-Def it's exactly 5% - 2.5% on each edge - with at least some equipment; I've had to fiddle with the service menus of a Full HD projector before, to stop it trimming off the edges of a 1080p computer display in a lecture theatre, and the size it was preset to scale to full screen was an otherwise rather strange 1824x1026... which is 1920 x 95%, and 1080 x 95%.)
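
That projector preset is easy to verify (a trivial sketch):

```python
FULL_HD = (1920, 1080)

# 95% of each dimension reproduces the projector's odd-looking preset.
scaled = tuple(round(d * 0.95) for d in FULL_HD)
print(scaled)   # (1824, 1026)
```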

(Double-FWIW, the "256"-useful-line claim for PAL on the Amiga seems slightly closer to the limits; really, 240 would have been safer, as well as making more sense in terms of the comparative line counts and frame rates, as 256 disappears slightly off the top and bottom of a CRT - though I can use almost the maximum 283 lines on an LCD. The official usable line count is either 288 or 287.5 (i.e. 575 in interlace), depending on how old a standard you comply with, out of 312.5 total lines (thus 312/313 progressive). So the nominal "usable" height is again 92% of the total, and the fully visible height again about 79%, making the same 86% visible out of the usable scan. Line width is, funnily enough, slightly longer at exactly 64us, even though the active window is narrower... and the colour clock for true PAL encoding is completely different, at 4.43MHz instead of 3.58MHz, but very few if any machines make use of that. The Amiga PAL clock, for example, is just the closest frequency to the original NTSC one that could be derived from the PAL colourburst, ending up as 8/5ths of the actual composite colour rate, in contrast with the NTSC system clock being 2x colourburst. So the close relationship between the clock and the pixel alignment is largely lost, and the RF output of PAL machines in particular tends to look pretty fuzzy anywhere there's strong colour, because the clock transition happens partway through a pixel rather than in between two of them. If they actually used the clock properly, and ran at e.g. 2x PAL colourburst, you could fit about 464 pixels into the active window, as there are, to be exact, 283.7516 colour clocks per line. That odd figure is constructed from an NTSC-like 283.5, modified with an additional quarter-clock to produce the line-by-line "Phase Alternation" that's the PA of PAL, plus a 25Hz full-frame phase inversion, all of which adds up to a colour system that inherently averages out the phase differences that cause weird colour shifts on NTSC. But it does mean you can't get an exact 1:1 pixel-to-clock relationship anyway, because it would shift by half a pixel per line, plus a further 4/5ths of a pixel from the top to the bottom of the visible scan height... so I guess you may as well just set an arbitrary fractional multiplier and have done with it.)
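
And the PAL clock arithmetic, as a final sketch (same exact subcarrier value assumed as above):

```python
PAL_COLORBURST_MHZ = 4.43361875   # exact PAL subcarrier
LINE_US = 64.0                    # PAL line period

clocks_per_line = PAL_COLORBURST_MHZ * LINE_US
print(f"clocks per line: {clocks_per_line:.4f}")         # 283.7516 exactly

# Hypothetical pixel counts at a 2x-subcarrier pixel clock:
for window_us in (52.0, 52.5):
    pixels = window_us * 2 * PAL_COLORBURST_MHZ
    print(f"{window_us} us window -> {pixels:.0f} px")   # ~461..466
```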

OK, this is a bit sprawling; I hope you can forgive that. I kept thinking of new things / coming to realisations as I went along... there's relevant knowledge in there, if you can be bothered reading ;)