Cell-phone-size OLED displays are driven much the same way as cell-phone-size LCDs. Manufacturers try to keep the interfaces similar to reduce the engineering effort required to switch technologies; LCDs, in turn, are driven much like old CRT displays. These interfaces (for the smaller displays, anyway) are generally parallel interfaces with 3 clocks. The parallel data bus is as wide as the color depth of the display: a 24-bit display, for example, has a 24-bit-wide color bus, with 8 bits each for red, green, and blue. The RGB values represent the brightness of each color for an individual pixel, and just about any color can be formed by varying their intensities.

The 3 clocks are the pixel clock, the horizontal clock, and the vertical (or frame) clock. The pixel clock is the fastest; each tick of the pixel clock moves the selected pixel horizontally across the screen. The horizontal clock ticks for every new line, and the vertical clock ticks for every new frame. So for a 320 x 240 pixel screen, the horizontal clock ticks every 320 pixels, and the vertical clock ticks every 240 lines.

This ignores delays. In reality, there are a bunch of delays at the end of each line and at the end of each frame. In the CRT days, these delays allowed time for the electron beam to physically move back to the beginning of a new line or up to the corner for a new frame. They also provided time to "hide" digital information in the old analog days of cable TV (like subtitles and V-chip info). Today they are still handy because they give the display driver time to share memory bandwidth (as it pulls information from the frame buffer in chunks of time). You can basically think of the interface as painting one pixel at a time across the display, line by line, until the image is drawn, at which time it starts all over.
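As a rough sketch, the scanning just described boils down to three nested counters. This is a software model, not real driver code; the pin-toggling helpers (`set_rgb`, `pulse_pixel_clock`, etc.) are hypothetical stand-ins for whatever actually drives the bus:

```python
# Software model of a 3-clock parallel RGB scan for a 320x240 panel.
# The callback arguments are hypothetical stand-ins for pin toggling.

WIDTH, HEIGHT = 320, 240

def scan_frame(framebuffer, set_rgb, pulse_pixel_clock, pulse_hsync, pulse_vsync):
    for y in range(HEIGHT):
        for x in range(WIDTH):
            r, g, b = framebuffer[y][x]  # 8 bits each on a 24-bit bus
            set_rgb(r, g, b)             # put the color on the data bus
            pulse_pixel_clock()          # latch one pixel
        pulse_hsync()                    # end of line (blanking delays go here)
    pulse_vsync()                        # end of frame; then it starts all over
```

Per frame this pulses the pixel clock 320 x 240 times, the horizontal clock 240 times, and the vertical clock once, matching the ratios described above.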
The pixels are generally designed to hold their state long enough to last until the next full refresh cycle (which typically occurs at 60Hz or faster).
EDIT: Sorry, I thought you were looking for how the interface is driven. The pixels themselves are usually driven directly by a display driver or controller integrated with the display (so the end user usually doesn't need to worry about implementation details). I'm not an expert here, but a simplification is to represent each pixel as a diode in parallel with a capacitor. The capacitor is charged to a certain voltage, which dictates the amount of current, which dictates the brightness of the pixel. So an analog 'programming' voltage determines brightness, but this gets refreshed constantly.
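That simplification can be written out as a toy model. To be clear, the numbers below (voltage-to-current gain, light-conversion factor) are invented for illustration and have nothing to do with any real OLED's device physics:

```python
# Toy model of the diode-plus-capacitor pixel simplification above.
# g_m and efficiency are made-up constants, purely for illustration.

def pixel_brightness(v_program, g_m=1e-3, efficiency=0.2):
    """Stored capacitor voltage -> drive current -> light output.

    v_program : analog programming voltage held on the pixel capacitor (V)
    g_m       : hypothetical voltage-to-current gain of the drive device (A/V)
    efficiency: hypothetical current-to-light conversion factor
    """
    i_drive = g_m * v_program      # the capacitor voltage sets the current...
    return efficiency * i_drive    # ...and the current sets the brightness
```

In this linear toy model, doubling the programming voltage doubles the brightness; real pixels are refreshed constantly so the capacitor voltage doesn't droop.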
First of all, I do not know of any displays available that support what you want.
This article implies that OLED displays should be able to deliver refresh rates above 80Hz. It claims 1000 times that of current displays, which would put them in the 60kHz range. That may be true for an individual pixel, but I think they're confusing the response time of a pixel with the ability to give a pixel a new value. The former is how fast the pixel settles to a new value; the latter is the refresh rate (i.e. how often you can refresh the whole display).
It does seem that OLEDs could do what you want, but I doubt there is a display that will allow you to select an arbitrary refresh rate.
The reason I doubt it is the way video works: you have an image that is refreshed at a particular rate to produce the video. Electronically, this is done by sending pixels out to the display one at a time at a well-defined pixel clock.
Example:
Assuming a resolution of 1920x1080 and a refresh rate of 60Hz, your pixel clock would be a minimum of 1920*1080*60 = 124.416MHz. In practice, you clock out extra pixels at the end of each line (horizontal blanking) and after the last line (vertical blanking). This allows for a more reasonable clock (124.416MHz isn't an easy frequency to generate from standard oscillators). Blanking time is also used to send other data and/or to allow the receiving end (a display in your case) to process the received data. To continue the 1080p60 example, the pixel clock for that resolution and frame rate is defined as 148.5MHz. This is a "rounder" frequency and allows for 2200x1125 total pixels to be clocked out at 60Hz. When transmitted over HDMI, the blanking periods are used to send audio and control data.
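The arithmetic is worth spelling out. The 2200x1125 total comes from the standard 1080p60 timing (the CTA-861/SMPTE timing used over HDMI):

```python
# Pixel-clock arithmetic for the 1080p60 example above.

def pixel_clock(width, height, refresh_hz):
    """Pixel clock in Hz for the given per-frame dimensions and rate."""
    return width * height * refresh_hz

# Active video only (no blanking) -- the bare minimum:
print(pixel_clock(1920, 1080, 60))   # 124416000 Hz = 124.416 MHz

# Standard 1080p60 timing: 2200x1125 total pixels per frame, with blanking:
print(pixel_clock(2200, 1125, 60))   # 148500000 Hz = 148.5 MHz
```

The difference between the two numbers is exactly the bandwidth "wasted" on blanking, which HDMI reuses for audio and control data.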
So you see that the video source and the video receiver (in your case a graphics chip and a display) both have to know the exact format of the data being sent in order to work together. This is why I doubt there are displays like you desire. The graphics chip manufacturer would have to support a highly variable pixel clock, which basically means they'd have to put an FPGA on board to clock out the data in addition to their graphics chip. The display controller manufacturer would have to do the same on their end. Though depending on the display, the display controller manufacturer might be able to just use the input clock from the graphics card without knowing the exact timings (within the limits of the display and controller).
You could conceivably support what you want, but you'd basically be implementing a graphics card in an FPGA. You'd take whatever the original video was and then convert it to the frame rate you want inside the FPGA. You'd be limited by the maximum pixel clock the display controller can accept, and you'd have to select a few discrete frame rates (e.g. 80, 85, 90, 95Hz, etc.) to output, based on how much logic and how many PLLs your FPGA has. So you couldn't vary the frame rate in real time, but you could support more frame rates than a typical display driver supports.
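A quick way to see which discrete rates such a design could offer is to check each candidate against the pixel-clock ceiling. The frame dimensions and the 300MHz limit below are assumptions for the sake of the sketch, not any particular controller's spec:

```python
# Sketch: which discrete frame rates fit under a pixel-clock ceiling?
# TOTAL_W/TOTAL_H and MAX_PIXEL_CLOCK are assumed values for illustration.

TOTAL_W, TOTAL_H = 2200, 1125      # total pixels per frame, incl. blanking
MAX_PIXEL_CLOCK = 300_000_000      # assumed display-controller limit, Hz

def supported_rates(candidates):
    """Keep only the frame rates whose pixel clock fits under the ceiling."""
    return [fps for fps in candidates
            if TOTAL_W * TOTAL_H * fps <= MAX_PIXEL_CLOCK]

print(supported_rates([60, 80, 85, 90, 95, 100, 120, 144]))
# 144 Hz drops out: 2200 * 1125 * 144 = 356.4 MHz exceeds the ceiling.
```

In a real design, each surviving rate would additionally need a PLL configuration that can actually synthesize its pixel clock, which is what limits you to a handful of discrete choices.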
Best Answer
Almost none of the prefab display modules I've seen on the market give a "user" any significant control over when information is actually displayed; many don't even let the user know when information will be displayed. The normal design pattern is for the module to accept data from the user into a buffer at times of the user's choosing, and then display the contents of that buffer at a time of the module's choosing. In cases where both the module and the eyes of anyone looking at it are stationary, having the time between when data is fed to the module and when it appears on screen vary arbitrarily between 0 and 15ms won't be a problem. If the module or the user's eyes are moving, however, the scanning behavior may become quite relevant.
For a 64x64 module, I would expect that it should be possible to supply the module with a new frame's worth of data on every scan cycle; if you're interested in controlling, or at least knowing, exactly when things are displayed, however, you'll have to check the data sheets of any specific modules you're considering to see what sort of control you can get. I wouldn't say that modules "typically" give any such control, but some might let you program a frame rate or scanning pattern that would fit your needs, even if you have to do some trickery to achieve such behavior. For example, if the module scans one line every 100us, allows a programmable number of scan lines, and counts scan lines with a down-counter that's reloaded when it hits zero, and one wanted a frame rate of exactly 125Hz, one could program the scan-line count register with a value of 81 for 7.809ms and a value of 80 for 101us. The timing on the module would then drift until the start of frame coincided with the time when the screen was programmed for 80 lines.
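The general trick being described, dithering between two adjacent scan-line counts so the average frame period lands on the target, can be sketched as follows. This is a toy model with assumed numbers (a 101us line time and a 125Hz target), not any particular module's register interface:

```python
# Toy model: hit a frame period that isn't an integer number of line
# times by alternating between two adjacent scan-line counts.
# LINE_TIME_US and TARGET_PERIOD_US are assumptions for illustration.

LINE_TIME_US = 101        # suppose each scan line takes 101 us
TARGET_PERIOD_US = 8000   # 125 Hz -> 8000 us per frame

def plan_frames(n_frames):
    """Return per-frame line counts whose running total tracks the
    ideal timeline as closely as whole scan lines allow."""
    counts, elapsed = [], 0.0
    for frame in range(1, n_frames + 1):
        ideal = frame * TARGET_PERIOD_US              # where we *should* be
        # Pick the count that keeps us closest to the ideal timeline.
        lines = round((ideal - elapsed) / LINE_TIME_US)
        counts.append(lines)
        elapsed += lines * LINE_TIME_US
    return counts

counts = plan_frames(1000)
avg_period = sum(counts) * LINE_TIME_US / len(counts)
print(avg_period)   # close to 8000 us, i.e. ~125 Hz on average
```

With these numbers the planner alternates between 79 and 80 lines, and the accumulated error never exceeds half a line time, which is the same idea as reprogramming the down-counter on the fly as described above.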