Electronic – Sequential Logic – Primarily For Signal Storage

digital-logic, memory

New here 😀

I've been reading David M. & Sarah Harris's book "Digital Design and Computer Architecture" and came to wonder about the role of sequential logic. To me, it seems that the significance of sequential logic is primarily in its ability to store signals (HIGH/LOW, ON/OFF, 0/1, etc.). However, I've also noticed that the characteristic property of sequential logic is its dependence on past inputs in generating outputs.

Now, as a rookie EE student, this confuses me. As I understand it, memory (in the electronics sense) is just the ability to store and retrieve past signals. With this in mind, signal storage doesn't seem like a hard task at all; ROM, for example, is (from what I've heard) purely combinational and stores signals – i.e., memory. If we can make memory with combinational circuits, why are sequential elements so highly regarded as fundamental to memory? There must be a reason for this that I haven't read about yet (probably something to do with computer organization, I'm assuming?).

I'm sorry if this post seems extremely naive and/or presumptuous :[ I haven't been formally taught sequential logic yet :/

Best Answer

Your post is naive, but that is not necessarily a bad thing, for two reasons: 1. There are others in your position who can benefit from your question. And 2. Sometimes people with decades of experience need to revisit these sorts of subjects from time to time to refresh their memory of what is important in circuits.

There are some terms that EEs learn in school and in textbooks but rarely use professionally. "Sequential logic" is one of them. The professional term is "state machine". A state machine is essentially the guts of what you think of as sequential logic.

A "state" is simply the current condition of something. The state of a counter is the count value itself. A state of a stoplight is Red, Yellow, or Green.

When you say "memory is the ability to store and retrieve past signals", you are correct-- but nobody talks like that. We say that the state is stored. It is a minor point, but an important one. Storing a past signal implies that you are storing a signal that changes over time. Storing a state is storing the instantaneous value of the state. Take that little bit of knowledge and tuck it away in your brain for later, when it will make sense to you.

For us, there are two basic types of logic circuits: combinatorial and memory. Combinatorial logic is just logic where the outputs depend only on the current inputs. It is a cluster of gates with no feedback paths (gates downstream do not feed the inputs of upstream gates). Memory is the opposite of combinatorial logic, in that it stores a value or state for later use. The basic building blocks for memory are flip-flops and latches. Actual RAM can also be used to store state values, but that is a more advanced use.
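To make the distinction concrete, here is a minimal sketch (the function and class names are my own, purely for illustration): combinatorial logic behaves like a pure function of its inputs, while a flip-flop holds a value between clock edges.

```python
def majority(a: bool, b: bool, c: bool) -> bool:
    """Combinatorial: the output depends only on the current inputs."""
    return (a and b) or (b and c) or (a and c)


class DFlipFlop:
    """Memory: q is whatever d was at the most recent clock edge."""

    def __init__(self) -> None:
        self.q = False

    def clock(self, d: bool) -> None:
        # Capture d on the (simulated) rising edge; q holds it afterwards.
        self.q = d


ff = DFlipFlop()
ff.clock(True)
print(ff.q)  # True -- the stored value persists until the next clock edge
```

Note that `majority` has no state at all: call it twice with the same inputs and you always get the same answer. The flip-flop's answer depends on what happened in the past.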

The core of a state machine (or what you are calling sequential logic) consists of a single block of combinatorial logic, and a chunk of memory to store the output of the combinatorial logic. The output of the memory is fed back into the combinatorial logic. If you are designing a counter, then the combinatorial logic might take the input and add 1 to it. The memory will save that +1 value for the next clock.

Usually connected to state machines is another chunk of combinatorial logic and possibly some more memory to handle the outputs of the state machine (different from the state value itself). An example of this would be an extra signal from our counter that goes high every time the counter is equal to 4.
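Putting those two paragraphs together, here is a hedged sketch of the counter described above (a hypothetical 3-bit counter; the class and method names are my own): one block of combinatorial next-state logic, a chunk of memory that captures it on each clock, and a separate combinatorial output decode that goes high when the count equals 4.

```python
class Counter:
    """A counter modeled as a state machine: comb. logic + stored state."""

    def __init__(self, width: int = 3) -> None:
        self.state = 0                  # the memory (the flip-flops)
        self.mask = (1 << width) - 1    # wrap at the word width

    def next_state(self) -> int:
        """Combinatorial next-state logic: current state + 1."""
        return (self.state + 1) & self.mask

    def output(self) -> bool:
        """Combinatorial output decode: high when the count equals 4."""
        return self.state == 4

    def clock(self) -> None:
        """The memory captures the combinatorial result on the clock edge."""
        self.state = self.next_state()


c = Counter()
for _ in range(4):
    c.clock()
print(c.state, c.output())  # 4 True
```

The feedback loop the answer describes is visible in `clock()`: the stored state feeds `next_state()`, whose result is written back into the state on the next edge.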

Where this extra combinatorial logic (and maybe more memory) sits in relation to the core combinatorial/memory logic determines whether this is a Mealy or Moore state machine. I bring up the Mealy and Moore terms only because this is another example of something that is taught in school but almost never used professionally.
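If the distinction helps, it can be sketched in two lines (the function names are my own, using a counter state as in the example above): a Moore output is a function of the stored state only, while a Mealy output is a function of the state and the current inputs.

```python
def moore_output(state: int) -> bool:
    # Moore: depends only on the stored state.
    return state == 4


def mealy_output(state: int, enable: bool) -> bool:
    # Mealy: depends on the stored state AND a current input, so it can
    # change in the middle of a clock cycle if the input changes.
    return state == 4 and enable
```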

But with all this talk about "memory", we have a problem. The way this term is used in this discussion is different from how it is normally used. When you say "memory" to most people, they think of RAM and ROM. But memory in this context normally means flip-flops and latches – usually D flip-flops. The DFFs in a counter will hold one word, and only one word. RAM, on the other hand, will store many words at a time. It is hard to tell from your question, but I think that you are confusing RAM with flip-flops and latches.

Now on to your question: If we can make memory with combinational circuits, why are sequential elements so highly regarded as fundamental to memory?

You can make memory with gates, and you can make combinatorial logic with gates. But combinatorial logic is not memory. In fact, the definition of combinatorial logic is "logic without memory". But almost every useful circuit is made from both memory and combinatorial logic.

What I do not understand from your question is what kind of memory you are referring to. But ultimately it doesn't matter, because sequential elements are not fundamental to either kind of memory. It is the opposite, in fact: memory is fundamental to sequential logic (a.k.a. state machines).

When looking at state machines, sequential logic, synchronous logic, and the like, it can be useful to break the logic up into combinatorial logic and flip-flops. Don't break it up in the actual design, but break up how you think of the circuit. This will help you identify the parts that matter. It will also help you later on, when you have to start thinking about signal timing, clocks, and all of that stuff.

I also advise that you ignore RAM/ROM for now until you understand the rest of this. There is no sense in complicating things at this stage.