How does a non-FPGA (i.e. a PC with a CPU, RAM, hard drive) mimic logic gates?


I know that an FPGA uses look-up tables (LUTs) to synthesize logic gates. A LUT is a block of RAM that is indexed by a number of inputs. The output is the value stored at that memory address. The magic is that the LUT can be programmed to output whatever you want for a particular input. So, a LUT can be programmed with the truth table of any logic gate in order to mimic it! That's how an FPGA synthesizes the logic gates that you specify in your HDL code.

I was thinking the other day, how does a normal computer mimic logic gates? As far as I know (which is not far), if I write a program in C++, first it must be compiled to machine code so that the CPU can read it. Then when I press "run", the machine code goes to memory to await processing by the CPU. I'm not really clear on what happens next, but at some stage the CPU must execute the logical operations that my program contains, right? Unlike an FPGA, the CPU can't just synthesize whatever resources it needs. So how does it execute the program?

My Guesses:

  1. The CPU has a number of pre-built logic gates. When it encounters an
    AND statement in the C++ code it's executing, it uses one of its AND
    gates. If it sees an OR statement, it uses one of its OR gates; if it
    sees an IF statement, it uses one of its IF gates; etc.

  2. Or, logic is implemented in memory in some way similar to a LUT. This makes more sense to me since it doesn't rely on a limited
    number of gate resources. If my program requires tons of OR logic
    for instance, the CPU won't get bottlenecked by a lack of OR gates.

So, how far off am I?

Edit: Thanks for the answers everyone, I learned quite a bit about CPUs and ALUs. Also, the "IF gate" in my first guess is a typo, that should be "OR gate" (although it's just an example, any logic gate would do). Sorry about that confusion.

Best Answer

Actually, your first guess is not as far off as some are claiming.

A CPU is built around something called an "Arithmetic Logic Unit" (ALU) and a simplistic implementation of that is to have the logic gates implementing all basic operations wired up to the inputs in parallel. All of the possible elementary computations are thus performed in parallel, with the output of the actually desired one selected by a multiplexor.

In an extremely simple (chalk-board-model) CPU, a few bits of the currently executing instruction's opcode are wired to that multiplexor to tell it which logic function result to use. (The other, undesired results are simply discarded.)

The actual technology used to implement the computations in the ALU varies - it could be "real" logic gates, or it could be LUTs if the CPU is implemented inside a LUT-based FPGA. (One very good way to understand the essentials of stored-program computing is to design a simple processor, build it in a logic simulator, and perhaps then implement it on an FPGA.)