Why do some Integrated Circuits get so hot?

architecture, efficiency, heat, microchip

I was recently reading an article about the Radeon 6990; it recommended a 1200 W power supply and suggested keeping the entire machine submerged in mineral oil, because with air cooling alone the card would die within a few months. My question is: why do chips convert so much electricity into heat? What does physics say about this, and with the very advanced technologies we have today, is it not possible to make our chips a bit more efficient?

Edit 1: I am hoping for a technical answer. Apologies if that was not evident in the original question.

Best Answer

Why does any machine get hot when heavily used? Because nothing is 100% efficient. Wires have resistance. Transistors have gate capacitance, and every time a gate switches, the energy used to charge that capacitance is dumped as heat. Energy can't be destroyed, so where does this lost electrical energy go? Heat. When you further try to cram as much functionality and speed as possible into the smallest space (GPUs being at the leading edge of this endeavor), you get a lot of heat in a small space.
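To put a rough number on that, here is a minimal back-of-envelope sketch in Python of the classic dynamic-power relation for CMOS logic, P ≈ α·C·V²·f. Every value below is an illustrative assumption, not a measurement of any real GPU:

```python
# Back-of-envelope dynamic (switching) power for a CMOS chip:
#   P ≈ alpha * C * V^2 * f
# Each clock cycle, the fraction of the chip's capacitance that switches is
# charged from the supply and then discharged to ground; that energy ends up
# as heat. All numbers here are illustrative assumptions.

alpha = 0.2   # activity factor: fraction of gates switching each cycle (assumed)
c_eff = 1e-6  # total effective switched capacitance, farads (assumed, ~1 uF)
v = 1.0       # supply voltage, volts (assumed)
f = 1e9       # clock frequency, hertz (assumed, 1 GHz)

p_dynamic = alpha * c_eff * v**2 * f
print(f"Estimated dynamic power: {p_dynamic:.0f} W")  # prints 200 W
```

With GPU-scale numbers, that simple relation already lands in the hundreds of watts, which is why the cooling requirements are so extreme.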

Believe me, engineers are using the most advanced technologies they can to reduce the heat generated by GPUs. The main thing preventing them from making GPUs faster is that the chips would burst into flames if they did. Every day, they find ways to make the processors more efficient, so they can cram in more functionality and more speed, taking them right back to the threshold of self-destruction by heat.
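The V² term in the relation above is the biggest lever engineers have: a modest drop in supply voltage buys a disproportionate drop in heat, and that reclaimed headroom is what gets spent on more transistors and higher clocks. A quick sketch, reusing the same assumed numbers as before:

```python
def dynamic_power(alpha: float, c_eff: float, v: float, f: float) -> float:
    """Dynamic switching power in watts: alpha * C * V^2 * f."""
    return alpha * c_eff * v**2 * f

# The same illustrative chip as above, run at two supply voltages.
hot = dynamic_power(alpha=0.2, c_eff=1e-6, v=1.2, f=1e9)
cool = dynamic_power(alpha=0.2, c_eff=1e-6, v=1.0, f=1e9)
print(f"at 1.2 V: {hot:.0f} W")   # 288 W
print(f"at 1.0 V: {cool:.0f} W")  # 200 W -- ~30% less heat at the same clock
```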

See also Why is the CPU hot when it's turned on?