The short answer is no. It's not linear.
Here's the long answer.
There are three kinds of heat transfer: conduction, convection, and radiation.
Conduction means heat transfer due to physical contact. In the case of a building, the conduction is linear-- the amount of energy that leaks into a building is linearly proportional to the difference between the temperature outside and the temperature inside. The constant of proportionality depends on the thermal conductivity of the material the heat is passing through, along with its thickness and area. For well-insulated buildings, a great deal of the heat comes in through the windows, as glass is a poor insulator.
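To make that concrete, here is a minimal sketch of the linear model. In building practice the conductivity, thickness, and area of each element are usually rolled up into a U-value; the U-values and areas below are illustrative assumptions, not measurements.

```python
# Minimal sketch of the linear conduction model described above.
# U-values and areas are illustrative assumptions, not measurements.

def conduction_watts(u_value, area_m2, t_out_c, t_in_c):
    """Steady-state conductive heat flow: Q = U * A * (T_out - T_in)."""
    return u_value * area_m2 * (t_out_c - t_in_c)

# Example: 10 m^2 of single-pane glass (U ~ 5 W/m^2.K) versus 100 m^2 of
# insulated wall (U ~ 0.3 W/m^2.K), with it 10 C hotter outside than inside.
print(conduction_watts(5.0, 10.0, 35.0, 25.0))   # ~500 W leaking in through the glass
print(conduction_watts(0.3, 100.0, 35.0, 25.0))  # ~300 W through ten times the wall area
```

Note how the much smaller glazed area still dominates, which is the point about windows above.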
Convection means heat transfer due to bulk fluid flow, like wind. Unfortunately, this is hard to model accurately for buildings. It's safe to say that in most places on Earth, for most buildings, convection moves less heat in or out than conduction does, but it still screws up your model. It's a particular problem with cold winter winds, where the temperature difference can be higher than in the summer.
For buildings, radiation means the sun. The effects of radiation are roughly proportional to the projected area of your windows over the course of the day. Unfortunately, the contribution to heating from radiation can be greater than that from conduction, especially for a well-insulated house with a lot of windows facing toward the equator. This is where the real nonlinearity comes in-- total irradiance is difficult to measure without considering the geometry of the windows and modeling cloud cover as a function of time.
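For the geometric part, here is a rough sketch for a single window under clear skies. The direct-beam irradiance and the solar heat gain coefficient (SHGC) are assumed round numbers, and cloud cover and diffuse light are ignored entirely-- which is exactly the hard part mentioned above.

```python
import math

# Rough sketch of solar gain through one window under clear skies.
# Irradiance and SHGC are assumed round numbers; clouds and diffuse
# light are ignored.

def solar_gain_watts(irradiance_wm2, area_m2, incidence_deg, shgc=0.7):
    """Gain = direct irradiance * projected window area * solar heat gain coefficient."""
    projected_m2 = area_m2 * max(0.0, math.cos(math.radians(incidence_deg)))
    return irradiance_wm2 * projected_m2 * shgc

# A 2 m^2 window with the sun 30 degrees off its normal, ~800 W/m^2 direct beam:
print(solar_gain_watts(800, 2.0, 30))  # roughly 970 W from a single window
```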
To add a few numbers-- on a hot night (say, 80 F), you can cool a two-car garage with a 1000 W air conditioner (which actually provides around 3000 W of cooling; see heat pumps). At noon, the sun delivers around 1000 W per square meter, so if your garage has some large windows pointed toward the sun (say, skylights on a roof angled toward the equator), your AC unit can easily be overwhelmed by radiation alone.
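As arithmetic, with the coefficient of performance of 3 and a few square meters of skylight taken as assumptions:

```python
# Back-of-the-envelope numbers from the paragraph above.  The COP of 3 and
# the 3 m^2 of skylight are assumptions for illustration.

ac_electrical_w = 1000
cop = 3                                   # typical heat-pump coefficient of performance
cooling_w = ac_electrical_w * cop         # ~3000 W of heat removed

irradiance_wm2 = 1000                     # rough midday solar irradiance
sunlit_skylight_m2 = 3                    # a couple of modest skylights facing the sun
solar_gain_w = irradiance_wm2 * sunlit_skylight_m2   # ~3000 W coming in

print(cooling_w, solar_gain_w)            # radiation alone can saturate the AC
```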
I expect the high-resolution display to use slightly more, but roughly the same amount of power as the lower-resolution display.
Most of the power consumed by the display in a tablet like this goes to two primary components: the backlight and the LCD.
Typically the backlight consumes very roughly 75% of the energy going to the screen.
Tablets like this one have either a CCFL tube backlight or a "white LED" backlight. It doesn't change the answer for this question -- with either kind, the backlight will consume exactly the same amount of power no matter what LCD is placed in front of it.
Turning the "brightness" down can save a significant amount of energy.
As you probably already know, liquid crystal displays (LCDs) such as the ones in the tablets you mention act as shutters -- they either let the light through, block it, or do something in between.
They typically consume the other 25% or so of the energy going to the screen.
Some of that energy goes to keeping the liquid crystals "open" (or "closed").
A cluster of 4 pixels requires exactly the same power to hold the liquid crystals "open" (or "closed") as a single pixel 4 times the size.
Some of that energy is lost to the parasitic capacitance of the ITO (indium tin oxide) transparent "wires" on the screen.
The total row capacitance and the total column capacitance is about the same for the two screens, so the amount of energy required to update a row (charging and discharging every column line across the entire screen) is the same. However, the higher-resolution screen has more rows to update, so assuming the same full-screen update rate, it requires more power.
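Here is a rough sketch of that scaling, assuming each row update charges and discharges the column lines through the full drive swing once; the capacitance, voltage, and refresh rate are made-up round numbers rather than datasheet values.

```python
# Rough sketch of the row-update argument above.  Total column capacitance,
# drive-voltage swing, and refresh rate are assumed round numbers.

def panel_drive_power_w(rows, total_column_capacitance_f, drive_v, refresh_hz):
    """Energy per row update ~ C * V^2; power scales with rows * refresh rate."""
    energy_per_row_j = total_column_capacitance_f * drive_v ** 2
    return energy_per_row_j * rows * refresh_hz

c_columns = 100e-9   # assumed total column-line capacitance (100 nF)
v_swing = 5.0        # assumed drive-voltage swing
print(panel_drive_power_w(768, c_columns, v_swing, 60))    # lower-resolution panel
print(panel_drive_power_w(1536, c_columns, v_swing, 60))   # twice the rows -> about twice the drive power
```

Either way, this drive power is small next to the backlight, which is why the two screens end up so close overall.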
As a side effect of the screen having a higher resolution, the CPU and the CPU-to-display bus will have to do a little more work dealing with more pixels.
So the things that use up the most power use exactly the same amount of power no matter what the resolution.
There are a few things that require more energy for the higher-resolution screen.
So I expect the high-resolution display to use slightly more, but roughly the same amount of power as the lower-resolution display.
A CMOS chip ideally acts as you suggest, with the dynamic power consumption proportional to the switching speed and the square of the power supply voltage (and, with the clock halted, zero static power consumption).
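That ideal relation is just P = alpha * C * Vdd^2 * f. Here is a minimal sketch; the activity factor and switched capacitance are assumptions, not figures for any particular chip.

```python
# Minimal sketch of the ideal CMOS dynamic-power relation P = alpha * C * Vdd^2 * f.
# Activity factor, switched capacitance, and operating points are assumptions.

def dynamic_power_w(activity, switched_capacitance_f, vdd_v, freq_hz):
    """Dynamic power of a CMOS chip: P = alpha * C * Vdd^2 * f."""
    return activity * switched_capacitance_f * vdd_v ** 2 * freq_hz

print(dynamic_power_w(0.1, 1e-9, 1.2, 1e9))    # ~0.144 W at 1.2 V, 1 GHz
print(dynamic_power_w(0.1, 1e-9, 0.6, 0.5e9))  # ~0.018 W: halving Vdd and f cuts power 8x
```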
However, as you try to lower the supply voltage (for reasons obvious from the above), you also have to lower the transistor threshold voltage to keep switching speeds up, and then the transistors no longer quite turn off all the way: an ever larger static consumption appears. This is called subthreshold leakage, and it increases with temperature. There is also gate oxide leakage.
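To give a feel for why it blows up, here is a simplified sketch of the off-state current; the prefactor, ideality factor, and threshold voltages are illustrative assumptions.

```python
import math

# Simplified sketch of subthreshold (off-state) leakage per transistor:
#   I_off ~ I0 * exp(-Vth / (n * kT/q))
# I0, n, and the Vth values are illustrative assumptions.

K_B_OVER_Q = 8.617e-5   # Boltzmann constant over electron charge, volts per kelvin

def off_current_a(i0_a, vth_v, temp_k, n=1.5):
    thermal_v = K_B_OVER_Q * temp_k
    return i0_a * math.exp(-vth_v / (n * thermal_v))

print(off_current_a(1e-6, 0.45, 300))  # higher threshold, room temperature
print(off_current_a(1e-6, 0.30, 300))  # lower threshold: orders of magnitude more leakage
print(off_current_a(1e-6, 0.30, 360))  # and running hot makes it worse still
```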
There are mitigation techniques, including circuit design and exotic materials such as high-k dielectrics, that can reduce the effect. At one time it was predicted that static power consumption could approach dynamic power consumption, but I don't think that has happened.