Could modern electronic chips (e.g. ARM and Intel processors) operate for millennia at sufficiently low temperatures and clock speeds?

Tags: lifetime, semiconductors, temperature

More broadly, what is the practical maximum lifetime a modern silicon chip could be expected to have (assuming it operates at some temperature above 77 K and has stable power)?

For example, I expect diffusion of dopant materials to degrade its properties over time. Diffusion rates follow an Arrhenius law, \$D \propto e^{-E_a/(k_B T)}\$, so cooling from 273 K to 77 K should slow diffusion by a factor of \$e^{(E_a/k_B)\left(\frac{1}{77} - \frac{1}{273}\right)}\$, which is astronomically large for typical activation energies of a few eV.
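As a rough sanity check of that scaling, here is a minimal sketch; the Arrhenius form and the ~3.5 eV activation energy are assumptions, chosen only as representative of dopant diffusion in silicon:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_slowdown_exponent(t_cold_k, t_warm_k, e_a_ev):
    """Natural log of the slowdown of an Arrhenius-activated process
    D ~ exp(-E_a / (k_B * T)) when cooled from t_warm_k to t_cold_k."""
    return (e_a_ev / K_B) * (1.0 / t_cold_k - 1.0 / t_warm_k)

# E_a ~ 3.5 eV is an assumed, representative value for dopant diffusion in silicon
x = arrhenius_slowdown_exponent(77.0, 273.0, 3.5)
print(f"Diffusion at 77 K is slower by a factor of e^{x:.0f} ~ 10^{x / math.log(10):.0f}")
```

For a 3.5 eV barrier this gives roughly \$10^{164}\$, i.e. diffusion effectively stops at 77 K.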


To clarify the applications I have in mind, take the Clock of the Long Now, which is being designed to last over 10,000 years but is made of massive mechanical components. Another application would be sending interstellar probes at velocities practically achievable today: it would take about 80,000 years for the Voyager probe to reach Alpha Centauri. Radiation shielding isn't much of a theoretical problem, since it's a matter of scaling up the shielding mass surrounding the probe (if you can launch and accelerate 1 probe, you can launch and accelerate 1,000 shielding probe-masses at 1,000x the cost).
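For reference, that 80,000-year figure checks out roughly; the distance and speed below are round approximations:

```python
LY_KM = 9.461e12          # kilometres in one light-year
DISTANCE_LY = 4.37        # Alpha Centauri distance, light-years
SPEED_KM_S = 17.0         # roughly Voyager 1's speed relative to the Sun

years = DISTANCE_LY * LY_KM / SPEED_KM_S / (365.25 * 24 * 3600)
print(f"Travel time at Voyager speed: about {years:,.0f} years")
```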

Best Answer

I think the broader problem is that the chip is useless in isolation as it requires supporting components in order to make a useful circuit. Many of these components have a failure rate that exceeds that of silicon chips by many orders of magnitude.

In reliability engineering we know that the failure rate of a series system is at least as high as that of its worst component (with constant failure rates, the system rate is the sum of the component rates). So focusing attention on what is arguably the most reliable component in the circuit may be interesting in its own right, but it is not germane to building hyper-long-life, functioning circuits for the applications proposed in the question.
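A toy illustration of that point, with made-up failure rates (only the structure of the calculation matters): even if the chip itself were nearly immortal, the series sum is dominated by the weakest supporting component.

```python
def series_failure_rate(rates_per_hour):
    """A series (non-redundant) system with constant failure rates fails at
    the sum of the component rates, so never less than the worst component."""
    return sum(rates_per_hour)

# Hypothetical failure rates (failures per hour) -- the numbers are made up.
# The chip is by far the most reliable part, yet the system lifetime is set
# by the weakest supporting component (here, the electrolytic capacitor).
rates = {
    "silicon chip":           1e-9,
    "electrolytic capacitor": 1e-6,
    "solder joints":          1e-7,
    "connector":              5e-8,
}
lam = series_failure_rate(rates.values())
print(f"System MTBF ~ {1 / lam:,.0f} hours (~{1 / lam / 8766:,.0f} years)")
```

With these numbers the system MTBF comes out to about a century, despite the chip's own failure rate being a thousand times lower than the capacitor's.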