Electronic – use a 12MHz crystal for NTSC video generation

crystal, frequency, ntsc

I want to use a shift register to generate a video signal directly from SRAM. While researching, I found that the pixel rate of NTSC is 12.272MHz, and I can't find any crystal closer to that speed than 12MHz. A 12MHz clock would also make it easy to generate the memory addresses.

I want to know if modern TV sets can compensate for the offset. I won't be using the color subcarrier. Only grayscale.

Best Answer

Depends how clear you want the image to be on an LCD, and whether you're using that same crystal for sync timing or just for reading pixels out of a buffer.

The 12.272MHz you reference is an older standard for digitising NTSC video specifically, from before MPEG came along and everyone standardised on 13.5MHz instead (which works equally well for PAL, and meant that the same crystal and pixel widths could be used globally instead of having twin 12.27 and 14.75MHz crystals, and frames that had to be rescaled horizontally - a difficult job - as well as, more easily, vertically and temporally). It's equal to 640 pixels across most of the active width of the screen (52.15us to be exact, leaving about 7 pixel times unsampled at each edge of the full 53.33us window), and, as it's derived from the NTSC colour clock (24/7ths of 3.58MHz), it also gives exactly 780 clocks per line, so it's easy to use for timing purposes as well. This was even adopted by a small number of Japanese home computers in the late 80s/early 90s to give a TV-compatible high-rez overscan mode based off a 24.545MHz crystal (= twice 12.272) and counting 1560 clocks/line instead.
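If you want to check those figures yourself, the sums are trivial; here's a quick Python scratchpad (the constants are just the NTSC numbers quoted above, nothing from any particular datasheet):

```python
# Sanity check on the 12.272MHz figures: subcarrier, pixel clock, clocks/line.
subcarrier = 315e6 / 88              # NTSC colour subcarrier, ~3.579545MHz
pixel_clock = subcarrier * 24 / 7    # the old digitising rate, ~12.2727MHz
line_rate = 15750 / 1.001            # NTSC colour line rate, ~15.734kHz

print(f"pixel clock     = {pixel_clock / 1e6:.4f} MHz")
print(f"clocks per line = {pixel_clock / line_rate:.1f}")       # 780.0
print(f"640px duration  = {640 / pixel_clock * 1e6:.2f} us")    # ~52.15us
print(f"px in 53.33us   = {53.33e-6 * pixel_clock:.1f}")        # ~654, i.e. ~7 spare per edge
```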

However, that standard switch did take place an AWFUL long time ago - like, 25 years or more, with only very early video digitisation gear hanging on to the older frequency. I'm impressed you managed to somehow find that standard but not the more modern and far more widespread 13.5, in fact.

Another thing to bear in mind: the active period of a line is NOT what's typically displayed, and includes quite a bit of overscan. If you want a display area similar to that used by, e.g., the classic Amiga line of computers, you need to limit your output to no more than 45us per scanline, roughly centred within that window, which at 12.272MHz is only about 552 pixels instead of 640. If you push that out to an Apple II-like 560 pixels, that's 45.6us and will be almost but not quite kissing the bezel at the wider parts of a CRT TV, and will likely lose a little content right in the corners due to tube curvature.

If you want to definitely overscan, but by the minimal amount possible, then you need to ape a lot of 8-bit (and some early 16-bit) consoles and produce an image in the 47.5 to 48us range (for those, 256 pixels at 5.37MHz), or about 584 pixels. I'm not sure what the limit would be with an LCD, but typically they show a bit more of the overscan area than a CRT, especially the widescreen models; I think I remember pushing Amiga overscan out to a little over 700 pixels (14.32MHz clock), which would be about 600 in your case.

Essentially, if you wanted a background that overscanned in all cases but content that was viewable on all screens, you'd want to generate at least 608 pixels of BG but limit your actual meaningful data to the centre 544. Maybe 512 within 640 to be ultra safe.
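All of those widths are the same sum - active microseconds times pixel clock - so here's a small helper if you want to play with other windows (the window names and durations are my own shorthand for the cases described above, not standard terms):

```python
# Pixels that fit in a given active window at a given pixel clock.
def px(window_us, pixel_clock_hz):
    return window_us * 1e-6 * pixel_clock_hz

f = 12.272e6   # the old NTSC digitising clock; swap in 12e6 or 13.5e6 as needed
for name, us in [("digitising-standard active", 52.15),
                 ("8-bit console just-overscan", 47.5),
                 ("Amiga-style just-underscan",  45.0),
                 ("ST/C64 fat-border window",    40.0)]:
    print(f"{name:28s} {us:5.2f}us -> {px(us, f):5.1f}px")
```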

But, unless we're going to start ripping apart old machines, we don't have a 12.272MHz crystal available, do we (NB. have you tried looking for 24.545 and using a /2 clock divider chip?)... So, adjusting things for 12MHz...

First up, we might have a timing issue. 12MHz isn't an integer multiple of the 15.75kHz that's the actual line rate for monochrome, nor of the 15.734...kHz of colour. If you can clock it at 762 pixel times per line, you'll land somewhere in-between, at 15.748kHz, which I would expect most screens to still accept, including LCDs. If your timing options are rather more constrained (e.g. it has to be a whole number of character spaces, which works for 6-pixel-wide cells but not 7, 8, 9, or even 12) then your line rate is going to end up varying from the standard somewhat.

Additionally, are you producing progressive or interlaced output? CRTs generally cope with 15kHz progressive quite well but some LCDs supposedly go a bit funny (though I've not yet personally encountered a problem hooking up classic consoles - which are almost universally progressive - to them), and your framerate will vary slightly depending on which you choose. The standard is either 60.000Hz monochrome or 59.940Hz colour, but 15.748kHz progressive (262 or 263 lines) will be 60.107 or 59.878Hz, and interlaced (262.5 lines) 59.993Hz. Again, the variance is small enough that most sets should be absolutely fine, but there's no guarantee because it IS nonstandard.
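If it helps, here's a rough calculator for where a 12MHz dot clock lands (762 clocks/line is the compromise suggested above; the character-cell loop just shows how far off you drift if the line length has to be a multiple of the cell width):

```python
# Where a 12MHz dot clock lands for line rate and frame rate.
dot = 12e6
target = 15.734e3                # NTSC colour line rate we're aiming near

clocks = 762                     # the compromise line length suggested above
h = dot / clocks
print(f"{clocks} clocks/line -> {h:.1f}Hz line rate")
for lines, kind in ((262, "progressive"), (263, "progressive"), (262.5, "interlaced")):
    print(f"  {lines} lines {kind}: {h / lines:.3f}Hz frame rate")

# If the line length has to be a whole number of character cells:
for cell in (6, 7, 8, 9, 12):
    n = round(dot / target / cell) * cell      # nearest whole-cell line length
    print(f"{cell}px cells: {n} clocks/line -> {dot / n:.1f}Hz")
```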

After that, your pixel counts will be off. A full 53.333us window at 12MHz gives us that same 640 pixels again, so you will just avoid overrunning into the blanking and sync area if you use that width. However it might be safer to use a narrower line even so. The 12.272MHz standard equates to more like 624 pixels at 12MHz, the slightly overscanned width used by 8-bit consoles is about 576 pixels, and the Amiga active area is more like 540 pixels. Again, to be absolutely safe, I'd say it would be better to use the central 528 or 512 instead. (A more Atari ST or C64-like window would be around 480 pixels wide). So you can still get 80-column text in, so long as you use a 6-pixel font, or accept that the full width might only be visible on LCDs if you use a 7-pixel one. (8-pixel adds up to 640)

However this leads us to the bigger problem: pixel sampling. Any modern digital screen you connect it to will likely use a 13.5MHz sampling rate and therefore end up blurring occasional columns of pixels (basically, you're running at 8/9ths of what it expects, so every 8 pixels you produce will be sampled using 9, so out of every block of 8 the 4th and 5th ones will end up smeared together, and all but the 1st and 8th will show some blurring), which can harm text readability unless you have a fairly wide font; 6-pixel text will be murdered, 8-pixel will be reasonable, 12-pixel should be fine.
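You can see exactly which columns suffer by modelling the resample as a simple box filter - a toy model only, a real TV scaler will do something fancier, but the 8:9 pattern of which display samples straddle two of your pixels is the same:

```python
# Toy model of the 8:9 resample a 13.5MHz scaler applies to a 12MHz picture.
SRC, DST = 8, 9                   # 8 generated pixels become 9 display samples
for i in range(DST):
    left = i * SRC / DST          # span of this display sample in source pixels
    right = (i + 1) * SRC / DST
    first, last = int(left), int(right - 1e-9)
    if first == last:
        print(f"display sample {i}: pure copy of source pixel {first + 1}")
    else:
        share = (first + 1 - left) / (SRC / DST)   # fraction taken from the first pixel
        print(f"display sample {i}: {share:.0%} of px {first + 1} + {1 - share:.0%} of px {last + 1}")
```

Running that shows only the first and last samples of each block are clean copies, and the middle one is an even 50/50 blend of the 4th and 5th source pixels.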

The question begged by all this, however: Is there any reason you can't just use 13.5MHz instead? That's both the de facto and de jure standard for digital sampling of standard-def video in both NTSC and PAL domains, and if you base your timing on that there shouldn't be any compatibility issues, plus text rendering on flatscreen TVs should be pin-sharp (or at least, as sharp as their own internal rescaling of the sampled image to fit the panel dimensions will allow). It's even used as the basis for HD sampling, via suitable scale factors, so it will retain said compatibility on the SD inputs of HD screens, and you can easily modify your design to output HD if desired. The crystals required should be as easy to get hold of as the colourburst ones are (...or at least, were), because they're extremely common components used in almost every digital device with a TV-standard output which doesn't just use a PLL, and end up being co-opted to clock a variety of other circuits as a result, just like the colourburst clocks were. (NB. it may actually be easier to find 27MHz or even 54MHz, as the 13.5 clock is usually divided down from one or the other, because the higher frequency then allows a greater range of slower clocks to be produced; same as how CB crystals were more commonly 14.318 or 28.636MHz instead of 3.58 or even 7.16)

If your circuitry can't run much faster than 12MHz, and certainly can't stretch to 13.5, then you could subdivide that to 6.75 and still produce a reasonable display if your demands aren't too severe. Or divide 27MHz by 3 and produce a 9MHz signal, and accept that every third on-screen pixel will be formed of two generated pixels blended together (but it should remain readable, as all the original clear pixels will still appear in some way).

Timings for 13.5MHz run as follows:

Clocks per line = 858 pixel clocks (13.5MHz / 858 = 15.73427kHz, exactly 15.750kHz/1.001, so perfectly compatible with NTSC colour, although not specifically monochrome)

Active window = 720 pixel clocks (= 53.333us; since 13.5MHz is 9/8ths of 12MHz, this is 9/8ths of the 640 pixels above)

12.272MHz active period equivalent = 704 pixel clocks (and funnily enough, 720 and 704 are the two main choices for MPEG2 video width, particularly on DVD)

~700 pixels on an Amiga equivalent = about 660 pixels, IE this is about the limit of what will actually display on a widescreen LCD.

8-bit console just-overscan = 640 to 648 pixels. IE you should just about be able to see 640 across at the widest point of a typical CRT TV, and that width should render with just a small border on a widescreen LCD.

Amiga or equivalent computer just-underscan = 600 to 608 pixels. You have a fighting chance of this entire width being visible across most of the height of a normally-adjusted CRT (say the central 192 lines progressive/384 interlace), maybe just losing a tiny amount in the corners (around the extremes of a 200/400 line output).

ST/C64 or equivalent "fat border" computer underscan = About 544 pixels. You should therefore be able to absolutely rely on 560 or 576 being visible across the entire height of any given CRT (to at least 216/432 lines, or 224/448 if it displays that many; the commonly-given "240/480 lines" is the overscan height and you will not be able to see all of them even on an LCD), and 512 looking somewhat narrow.

Lines per frame = 262 or 263 progressive, 262.5 interlaced (60.054Hz, 59.826Hz and 59.940Hz respectively)
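Those figures all fall out of a handful of divisions, so here's a script to regenerate them if you want to tweak anything (only the 858-clock line, the 720-clock window and the 13.5MHz rate come from the sampling standard; the other window durations are the informal ones used above):

```python
# Regenerate the 13.5MHz NTSC figures listed above.
clk = 13.5e6
clocks_per_line = 858
h = clk / clocks_per_line
print(f"line rate: {h:.2f}Hz (15750/1.001 = {15750 / 1.001:.2f}Hz)")

for us, label in [(53.333, "full active window"),
                  (52.15,  "12.272MHz-standard equivalent"),
                  (48.0,   "8-bit console just-overscan"),
                  (45.0,   "Amiga-ish just-underscan"),
                  (40.3,   "ST/C64 fat-border window")]:
    print(f"{label:30s} ~{us * 1e-6 * clk:4.0f}px")

for lines in (262, 263, 262.5):
    print(f"{lines} lines/frame -> {h / lines:.3f}Hz")
```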

FWIW, 13.5MHz is not a particularly high memory speed for reading video data even by 1980s standards. You should be able to produce 640 pixels of 4-bit greyscale (=16 shades, which with dithering can produce a fair illusion of photographic quality on a TV, so long as you've turned the colourburst signal off) using nothing more than Amiga-1000-grade (so 1984/85) technology, so long as the video memory is dedicated to the task; or maybe Amiga 3000/1200 grade (early 1990s) if you want full 8-bit greyscale plus the ability for a CPU to manipulate the video memory on a 50/50 shared access basis.

Your system would actually only need to run at a nominal 6.75MHz, and take 2 cycles for a memory access over a 16-bit wide bus. You then feed that 3.375MHz, 16-bit data stream into a shift register with taps at every fourth bit and shift it out at 13.5MHz... the data has to be bitplane interleaved, preferably as entirely separate pages or at least line-by-line for simplicity's sake (and with an intermediate 64-bit wide register that allows you to buffer 4x 16-bit chunks then drop them into the shifter in a single clock tick), which makes writing it slightly more complicated, but that wasn't an insurmountable problem for the engineers of the 1980s so you shouldn't have much of a problem. And if you're only producing pure monochrome output, ie black or white with no greys, you can run even slower memory and a simpler shifter setup. 8-bit wide at a 3.375MHz clock (= a 1.6875MHz byte rate), or 6.75MHz if you want to maintain CPU access to memory during the active period, dumping its data directly into a basic parallel-to-serial converter would be sufficient.
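Here's one way to model that fetch-and-shift arrangement in a few lines of Python, purely to illustrate the data flow - I've treated the 64-bit latch as four per-plane 16-bit words, which is one reading of the scheme above; the function name and packing are mine, not anything standard:

```python
# One 64-bit latch load = four 16-bit words (one per bitplane) = 16 pixels,
# so at 13.5MHz pixels the fetch rate is 13.5/4 = 3.375MHz 16-bit words.
def shift_out(plane_words):
    """plane_words: four 16-bit ints, one per bitplane, covering the same 16 pixels."""
    pixels = []
    for bit in range(15, -1, -1):                  # MSB first, one pixel per 13.5MHz tick
        value = 0
        for plane, word in enumerate(plane_words):
            value |= ((word >> bit) & 1) << plane  # tap one bit from each plane
        pixels.append(value)                       # 0..15 grey level
    return pixels

# Example: plane 0 alternating, planes 1-3 empty -> pixels alternate 1,0,1,0...
print(shift_out([0b1010101010101010, 0, 0, 0]))
```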

Another thought: If you'd like to have colour output, why not take the CGA route and use a 14.318MHz crystal governing the monochrome (or greyscale?) output, which will be fast enough to show 640+ pixels across a typical CRT (and 700+ on a widescreen LCD), and, in concert with a colourburst sync signal (also generated from that same clock) can produce reliable NTSC artefact colour at an effective resolution of 160 pixels (with quarter-pixel placement resolution)? The available palette and the techniques for generating such are well known and can be researched without much difficulty, and writing the necessary data into memory is not really any different than storing the bit patterns for regular 4-bit chunky (or, with a bit of finagling at the output stage, 4-bit planar) direct colour output. Unless you disable (or skip) colourburst generation in your output stage, you'll likely be getting unwanted colour artefacts anyway with such high-resolution monochrome material...
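The artefact-colour resolution is nothing more than the ratio of dot clock to subcarrier, if you want it spelled out (assuming only the two frequencies quoted above):

```python
# Artefact colour resolution from the two clocks alone.
burst = 315e6 / 88                 # NTSC colour subcarrier, ~3.5795MHz
dot = 4 * burst                    # the 14.318MHz CGA-style dot clock
print(f"hi-res pixels per subcarrier cycle: {dot / burst:.0f}")           # 4
print(f"artefact colour cells across 640px: {640 / (dot / burst):.0f}")   # 160
# The phase of the bit pattern within each 4-pixel group is what selects the hue,
# hence the quarter-pixel placement resolution mentioned above.
```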

NB. All of this of course is only directly relevant to NTSC areas; PAL uses the same 13.5MHz clock for digital but with a slightly smaller active window (equal to 702 pixels, generally just rounded up to 704), though the visible width is essentially the same, and a longer overall scanline (864 clocks = 15.625kHz) as well as a higher line count per frame (312, 312.5 or 313 lines = 50.08, 50.00, 49.92Hz) making for a greater visible height (typically 256 on a normally adjusted CRT, out to maybe 272 on an LCD, with an overscan total of 288). And its colour standard is different, using a 4.43MHz colourburst instead (meaning a rather faster general-purpose 17.73MHz master clock) plus an inversion of the carrier phase on each line (cancelling out any "unwanted" colours produced by unmoving interference), which makes producing artefact colours a much harder game and one that's generally avoided on the software/pixel generation side (instead being left to a dedicated composite video encoder that acts on a normal distinct-colour input signal).
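And for completeness, the corresponding PAL sums (again, only the 864-clock line, 52us active window and 13.5MHz rate are from the standard):

```python
# The matching PAL divisions at the same 13.5MHz sampling clock.
clk = 13.5e6
h = clk / 864                                   # 15.625kHz PAL line rate
print(f"PAL line rate: {h:.1f}Hz")
for lines in (312, 312.5, 313):
    print(f"{lines} lines/frame -> {h / lines:.2f}Hz")
print(f"52us active window: {52e-6 * clk:.0f}px")   # 702, usually rounded to 704
```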