What happens when a program needs to use more than the allotted 16 registers?


I'm just learning about how programs work under the hood of a computer. We discussed how registers are a limited resource to the CPU. So what happens when a program is massive and needs more than 16 registers? I've tried looking this up and couldn't find any answers regarding it.

Best Answer

Programs aren’t giant, monolithic things. They’re broken up into smaller routines and tasks. As a result, there isn't a need to have a large number of variables in registers at the CPU's beck and call. It's just not efficient.

Instead, some tricks are used to make better use of the registers. One of the most important is the use of a special region set aside in system memory, called the stack.

How does the stack work?

When a program branches off to perform a subtask, some of the CPU registers are saved to memory to free them up for the subtask, then restored when the subtask is done. This is most often done with stack memory, using special CPU stack opcodes that are designed to be fast and efficient.

How do stack opcodes work?

The stack is accessed in a last-in, first-out fashion by special stack opcodes that ‘push’ or ‘pop’ groups of registers, called stack frames, onto or off of the stack RAM.

A register, called a stack pointer, is used to keep track of the top of the stack. On some CPUs it’s a special register; on others, any register can be used as a stack pointer.

The stack frame also includes other calling context, such as the program counter, status flags, and other vital information needed to restore the context when the subtask is done.
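To make that concrete, here's a minimal sketch in C that models what the push and pop opcodes do, with a small array standing in for stack RAM and an index standing in for the stack pointer (the names `stack_ram`, `sp`, `push`, and `pop` are just illustrative, not any particular CPU's instruction set):

```c
#include <stdio.h>
#include <stdint.h>

#define STACK_SIZE 16

static uint32_t stack_ram[STACK_SIZE];   /* a tiny stand-in for stack RAM        */
static int sp = STACK_SIZE;              /* stack pointer: index of the current top */

static void push(uint32_t reg_value)     /* like a PUSH opcode: decrement SP, store */
{
    stack_ram[--sp] = reg_value;
}

static uint32_t pop(void)                /* like a POP opcode: load, increment SP   */
{
    return stack_ram[sp++];
}

int main(void)
{
    push(0xAAAA);                        /* save two "register" values              */
    push(0xBBBB);

    /* ... a subtask is now free to reuse those registers ... */

    printf("%x\n", (unsigned)pop());     /* prints bbbb: last in, first out         */
    printf("%x\n", (unsigned)pop());     /* prints aaaa                             */
    return 0;
}
```

Notice that the last value pushed is the first one popped back out, which is exactly the order needed to unwind nested subtasks.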

What implements the stack?

The details of managing registers and manipulating the stack via stack push/pop opcodes are dealt with by the compiler. In broad terms, the compiler looks at the subtask code, determines how many local variables the task needs, then inserts code to save (push) registers to stack memory, freeing them for the task to use. Likewise, when the task is done, another block of inserted code restores (pops) the registers from the stack.
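Roughly speaking, for a routine like the one below the compiler might emit a prologue that pushes the registers it is about to reuse and an epilogue that pops them back. The comments sketch the idea; the register names (r4, r5) are made up, and the actual instructions depend entirely on the architecture and calling convention:

```c
/* A C routine with a few local variables.  The comments sketch the kind of
 * prologue/epilogue a compiler might insert around it. */
long sum_of_squares(const long *data, long count)
{
    /* prologue (inserted by the compiler, roughly):
     *     push  r4        ; save registers we are about to reuse
     *     push  r5
     */
    long total = 0;                      /* might live in r4 for the whole loop */
    for (long i = 0; i < count; i++) {   /* i might live in r5                  */
        total += data[i] * data[i];
    }
    /* epilogue (also inserted by the compiler, roughly):
     *     pop   r5        ; restore the caller's values
     *     pop   r4
     *     return
     */
    return total;
}
```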

Even within a task, the compiler will examine how variables are used, and determine when a variable is no longer needed and thus free up its register to be reused.
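For example, in a made-up routine like the following, the two locals are never live at the same time, so an optimizing compiler is free to keep both in the same register rather than tying up two of them:

```c
int adjust(int x)
{
    int scaled = x * 3;        /* live only until the next line            */
    int y = scaled + 1;        /* last use of `scaled`                     */

    int offset = y - 7;        /* a new value can now reuse that register  */
    return offset * offset;
}
```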

If, on the other hand, the compiler can’t find any free registers, it will decide to punt some variables to memory (usually by pushing them onto the stack), pulling them back into registers only as they’re needed.
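As a rough illustration, a routine that keeps many values live at once, like the sketch below, can exceed the register supply, forcing the compiler to spill some of those values to stack slots and reload them later (whether this exact code actually spills depends on the target and optimization level):

```c
long combine(const long v[32])
{
    long a = v[0]  + v[1],  b = v[2]  + v[3],  c = v[4]  + v[5],  d = v[6]  + v[7];
    long e = v[8]  + v[9],  f = v[10] + v[11], g = v[12] + v[13], h = v[14] + v[15];
    long i = v[16] + v[17], j = v[18] + v[19], k = v[20] + v[21], l = v[22] + v[23];
    long m = v[24] + v[25], n = v[26] + v[27], o = v[28] + v[29], p = v[30] + v[31];

    /* all sixteen partial sums are still live here, competing for registers */
    return (a * b + c * d) + (e * f + g * h) + (i * j + k * l) + (m * n + o * p);
}
```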

Here's a discussion that explains stacks pretty well. https://stackoverflow.com/questions/10057443/explain-the-concept-of-a-stack-frame-in-a-nutshell

As a programmer, being aware of how the compiler uses the register, stack, and main memory resources can help you create better, faster code, even if you never look at a line of assembly.

What's in a name, anyway?

One such register-saving operation is called the stack exchange, where a register is swapped with the top of the stack. And, yes, that's where this website got its name.

What happens when you run out of stack RAM? Why, you get a stack overflow, which is, not at all coincidentally, the name of SE's programming-oriented sister site.
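The classic way to produce one is recursion with no base case: every call adds a stack frame and none are removed, so the stack region eventually runs out. Here's a minimal sketch in C; it will typically die with a segmentation fault or an explicit stack-overflow error, depending on the operating system:

```c
#include <stdio.h>

/* Each call allocates a new stack frame (return address, saved registers,
 * plus the local buffer below), and nothing is popped because the recursion
 * never unwinds.  Eventually the stack region is exhausted and the program
 * crashes with a stack overflow. */
static long recurse_forever(long depth)
{
    char frame_padding[1024];            /* make each frame noticeably large */
    frame_padding[0] = (char)depth;
    /* the addition after the call keeps the compiler from turning this into
     * a simple loop (tail-call optimization), so frames really do pile up */
    return recurse_forever(depth + 1) + frame_padding[0];
}

int main(void)
{
    printf("About to overflow the stack...\n");
    return (int)recurse_forever(0);      /* never returns normally */
}
```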