For a device you will often see a figure called \$\theta_{JA}\$. This is the junction-to-ambient thermal resistance.

This tells you that, in a typical ambient environment, the device will heat up *x*°C above ambient for every watt dissipated. You must include the ambient temperature in your calculation: in an open lab environment it might be 25°C, but inside the casing of some electronics it can be much hotter.

If you add a heatsink you need to know \$\theta_{JC}\$ (junction-to-case resistance), \$\theta_{CI}\$ (case-to-insulator resistance, if any), \$\theta_{IH}\$ (insulator-to-heatsink resistance, if any), and finally \$\theta_{HA}\$ (heatsink-to-ambient resistance). Like ordinary electrical resistances, you can add these together to get a final figure for how much your device will heat up when it dissipates *x* watts.
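As a minimal sketch of that series calculation (every numeric value below is an illustrative assumption, not a datasheet figure):

```python
# Estimate junction temperature with a heatsink by summing thermal
# resistances in series, just like electrical resistances.
# All numeric values are illustrative assumptions, not datasheet figures.

theta_jc = 1.5   # junction-to-case, degC/W (assumed)
theta_ci = 0.3   # case-to-insulator, degC/W (assumed)
theta_ih = 0.3   # insulator-to-heatsink, degC/W (assumed)
theta_ha = 4.0   # heatsink-to-ambient, degC/W (assumed)

power = 10.0      # watts dissipated (assumed)
t_ambient = 40.0  # degC inside the enclosure (assumed)

theta_total = theta_jc + theta_ci + theta_ih + theta_ha
t_junction = t_ambient + power * theta_total

print(f"total thermal resistance: {theta_total:.1f} degC/W")
print(f"estimated junction temperature: {t_junction:.1f} degC")
```

With these made-up numbers the junction sits about 61°C above ambient, which shows why the in-enclosure ambient matters so much.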

**In general:**

Push any physical system to an extreme, and the simple models developed by engineers will break down.

**Simple model for active power dissipation:**

The statement about an exponential increase in heat dissipation at extreme overclocking is not consistent with the following equation:

$$P_g \propto C_gV^2f$$

But how was the above equation derived?

Well, it is based on the following simplification:

*(Schematic omitted: a CMOS inverter modeled as two complementary switches driving a single output capacitor; originally created using CircuitLab.)*

This model assumes that:

- Transistors behave like ideal, mutually exclusive switches (no overlap in time when both switches are ON)
- All capacitances may be represented as a single equivalent capacitor at the output
- No leakage currents
- No inductances
- More assumptions

Under the above assumptions, you can think of the inverter's (or any other logic gate's) action as charging the output capacitor to \$V_{dd}\$ (which draws \$C_{tot}V_{dd}^2\$ joules from the power supply, half stored in the capacitor and half dissipated in the PMOS), and then discharging it to ground (which dissipates the stored half in the NMOS without drawing additional energy from the supply). The frequency factor \$f\$ represents the number of such cycles per second.
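As a quick numeric sanity check of this model (the capacitance, voltage, and frequency values below are assumptions chosen for illustration, not measurements of any real gate):

```python
# Dynamic (switching) power of a single gate: P = C_tot * Vdd^2 * f.
# All values are illustrative assumptions.
c_tot = 10e-15   # total switched capacitance: 10 fF (assumed)
v_dd = 1.0       # supply voltage: 1 V (assumed)
f = 2e9          # full charge/discharge cycles per second: 2 GHz (assumed)

energy_per_cycle = c_tot * v_dd**2   # joules drawn from the supply per cycle
p_dynamic = energy_per_cycle * f     # watts

print(f"energy per cycle: {energy_per_cycle * 1e15:.1f} fJ")
print(f"dynamic power:    {p_dynamic * 1e6:.1f} uW")
```

Note that the power is linear in \$f\$ here, which is exactly the behavior the simple model predicts and the simulation below disputes at extreme frequencies.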

In fact, it is surprising that the above equation can be an accurate estimate of dynamic power at all, given the large number of non-trivial assumptions made. And indeed, this result is suitable for first-order analysis only - any serious discussion of power dissipation in modern CPUs can't rely on such a simplified model.

**How the simple model breaks:**

All the assumptions made while developing the above simplified model break down at some point. However, the most fragile assumption, which cannot hold at extreme frequencies, is that of two mutually exclusive ideal switches.

The real inverter has a non-ideal Voltage Transfer Curve (VTC) - the relation between the inverter's input and output voltages.

Marking the operating regions of both the NMOS and the PMOS on this VTC shows that during switching there is an interval when both the NMOS and the PMOS conduct at the same time. This means that not all the current drawn from the power supply flows to the "output capacitor" - part of it flows directly to ground, thus increasing the power consumption.

**What this has to do with frequency:**

When the frequency is relatively low, the switching time of the inverter comprises a negligible part of the total operating time.

However, when the frequency is pushed to the limit, the inverter "switches continuously" - it is almost always in the middle of a switching transition, thus dissipating a lot of power due to the direct path from supply to ground (note the changed time scale).

It may be possible to model this analytically and check whether the result is exponential, but I prefer to use simulations (keeping in mind that the simulation accounts for all the non-idealities, not just this one).

**Simulation results:**

In simulation I measured the total energy (integral of power) drawn from an ideal power supply by an inverter in the following configuration:

The first and the last inverters are there just to model realistic driving and loading conditions.

The dissipated energy as a function of frequency:

We can see an approximately linear dependence for periods longer than 1 ns, and a clearly exponential dependence for shorter periods.

**Notes:**

- For the simulation I used antique 0.25 µm transistor models. Current state-of-the-art transistors are more than 10x shorter - I guess the divergence from the linear model is even stronger in newer technologies.
- Whether a particular CPU/GPU can be overclocked into this exponential frequency-dependence region while remaining stable and functional is device specific. In fact, this is exactly what overclockers try to determine empirically: to what frequency a given device can be pushed without malfunctioning.
- All the above results and discussions do not consider changing voltage levels. I guess there is no way to analytically predict the outcome of changing both frequency and voltage simultaneously - the only way to find out is to perform an experiment.

**From a single inverter to CPU:**

CPUs mainly consist of logic gates, which are conceptually similar to an inverter. However, each modern CPU has sophisticated mechanisms for controlling its operating frequency and voltage, and can turn off its submodules at runtime. This means that the heat-dissipation trend of the whole processor may differ somewhat from that of a single inverter. I guess the statement about an exponential increase in heat dissipation during extreme overclocking is a bit of an exaggeration, but we are not mathematicians: whether it is exponential or \$\propto f^{3+}\$, it is all equally "bad".

## Best Answer

The power delivered to a resistor, all of which it converts to heat, is the voltage across it times the current through it:

P = IV

Where P is power, I is current, and V is voltage. The current through a resistor is related to the voltage across it and the resistance:

I = V/R

where R is the resistance. With this additional relation, you can rearrange the above equations to make power as a direct function of voltage or current:

P = V²/R

P = I²R

It so happens that if you stick to units of Volts, Amps, Watts, and Ohms, no additional conversion constants are required.

In your case you have 20 V across a 1 kΩ resistor:

(20 V)²/(1 kΩ) = 400 mW

That's how much power the resistor will be dissipating.

The first step in dealing with this is to make sure the resistor is rated for that much power in the first place. Obviously, a "¼ Watt" resistor won't do. The next common size is "½ Watt", which can take that power in theory, with all appropriate conditions met. Read the datasheet carefully to see under what conditions your ½ Watt resistor can actually dissipate a ½ Watt. It might specify that ambient has to be 20 °C or less with a certain amount of ventilation. If this resistor is on a board that is in a box with something else that dissipates power, like a power supply, the ambient temperature could be significantly more than 20 °C. In that case, the "½ Watt" resistor can't really handle ½ Watt, unless perhaps there is air from a fan actively blowing across its top.

To know how much the resistor's temperature will rise above ambient you will need one more figure, which is the thermal resistance of the resistor to ambient. This will be roughly the same for the same package types, but the true answer is available only from the resistor datasheet.

Let's say just to pick a number (out of thin air, I didn't look anything up, example only) that the resistor with suitable copper pads has a thermal resistance of 200 °C/W. The resistor is dissipating 400 mW, so its temperature rise will be about (400 mW)(200 °C/W) = 80 °C. If it's on an open board on your desk, you can probably figure 25 °C maximum ambient, so the resistor could get to 105 °C. Note that's hot enough to boil water, but most resistors will be fine at this temperature. Just keep your finger away. If this is on a board in a box with a power supply that raises the temperature in the box 30 °C from ambient, then the resistor temp could reach (25 °C) + (30 °C) + (80 °C) = 135 °C. Is that OK? Don't ask me, check the datasheet.
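The temperature arithmetic from this paragraph, as a script (remember the 200 °C/W figure is a number picked out of thin air for the example, not a real datasheet value):

```python
# Temperature rise = power * thermal resistance.
p = 0.400       # watts dissipated by the resistor
theta = 200.0   # thermal resistance to ambient, degC/W (made-up example value)

t_rise = p * theta                  # rise above local ambient
t_open_board = 25.0 + t_rise        # open board on a desk, 25 degC ambient
t_in_box = 25.0 + 30.0 + t_rise     # box running 30 degC above room ambient

print(f"rise: {t_rise:.0f} degC")
print(f"open board: {t_open_board:.0f} degC")
print(f"in warm box: {t_in_box:.0f} degC")
```

Swapping in the real thermal resistance and ambient figures from your resistor's datasheet turns this into an actual go/no-go check.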