Given the goals of the class, I think the TTL approach is fine, and I say this as an "FPGA guy". FPGAs are a sea of logic and you can do all sorts of fun stuff with them, but there's only so much that's humanly possible to do in a semester.
Looking at your syllabus, your class is a mix of the logic design and "machine structures" courses I took in undergrad. (Plus, it's for CS majors. I'm all for CS majors having to face real hardware--letting them get away with writing code seems like a step back.) At this introductory level, where you're going over how assembly instructions are broken down, I see no real benefit to having students do things in code versus by hand. Doing HDL means learning the HDL, learning how to write synthesizable HDL, and learning the IDE. This is a lot more conceptual complexity and re-abstraction. Plus you have to deal with software issues.
Generally the point of a course that uses FPGAs is to practice creating logic that is useful--useful for talking to peripherals, serial comms, RAM, video generators, etc. This is valuable knowledge to have, but it seems very much out of the scope of your course. More advanced classes in computer architecture have students implement sophisticated CPUs in FPGAs, but again, this seems out of the scope of your course.
I would at the very least devote a lecture to FPGAs. Run through a few demos with a dev board and show them the workflow. Since you're at Mills, perhaps you could contact the folks at Berkeley who run CS150/152 and go see how they do things.
The "trick" with JK flip-flops
Since we don't have a previous answer about designing state machines with JK flip-flops...
Designing with JK flip-flops is pretty much the same as designing with D flip-flops (or T flip-flops, or any other kind of flip-flop). But with the JK flip-flop, the excitation table is such that there is a trick that makes the state transition logic slightly simpler to deal with.
For any state machine/counter design you need the excitation table for your particular flip-flop.
For JK flip-flops the excitation table looks like this:
Previous State -> Present State   J   K
      0        ->       0         0   X
      0        ->       1         1   X
      1        ->       0         X   1
      1        ->       1         X   0
The special property we are going to leverage is that when the previous state is 0, the K input is a don't care, and when the previous state is 1, the J input is a don't care.
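As a sanity check, the excitation table can be derived by brute force from the JK flip-flop's behavior. A small Python sketch (names are illustrative):

```python
# Brute-force derivation of the JK excitation table from the JK truth table:
# J K = 00 hold, 01 reset, 10 set, 11 toggle.
def jk_next(q, j, k):
    """Next state of a JK flip-flop given current state q and inputs J, K."""
    if (j, k) == (0, 0):
        return q          # hold
    if (j, k) == (0, 1):
        return 0          # reset
    if (j, k) == (1, 0):
        return 1          # set
    return 1 - q          # toggle

# For each transition q -> q_next, list every (J, K) pair that produces it.
# If both 0 and 1 appear in a column for an input, that input is a don't care.
for q in (0, 1):
    for q_next in (0, 1):
        pairs = [(j, k) for j in (0, 1) for k in (0, 1)
                 if jk_next(q, j, k) == q_next]
        print(f"{q} -> {q_next}: {pairs}")
```

Running this reproduces the four rows above: for 0 -> 0 the working pairs are (0,0) and (0,1), i.e. J=0 with K a don't care, and so on for the other transitions.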
First you have to complete your table. Not only do you need the inputs and the Qs, you also need all the J and K entries:
x|y|Q1|Q2|Q3||Q1+|Q2+|Q3+||J1|K1|J2|K2|J3|K3
---------------------------------------------
0|0| 0| 0| 0|| 1 | 1 | 0 || 1| X|...
0|0| 0| 0| 1|| 0 | 0 | 0 || 0| X|
0|0| 0| 1| 0|| 0 | 0 | 1 || 0| X|
0|0| 0| 1| 1|| 0 | 1 | 0 || 0| X|
...
Now we need to draw the 6 K-maps, for J1, K1, J2, K2, J3, and K3. If we weren't using our "trick", we would need these to be 5-variable maps (x, y, Q1, Q2, and Q3).
But because J1 is a don't care whenever Q1 is 1, we only need to do the map for J1 for the case where Q1 is 0. And of course, in that case we know that Q1 is 0, so we can just do a 4-variable map (with input variables x, y, Q2, and Q3).
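To see the trick mechanically, here is a sketch that builds the J1/K1 excitation columns and confirms J1 is a don't care exactly on the rows where Q1 = 1. It uses a plain 3-bit up-counter rather than the x/y machine above, purely for illustration:

```python
def excitation(q, q_next):
    """(J, K) needed to take one JK flip-flop from q to q_next; 'X' = don't care."""
    if q == 0:
        return (q_next, 'X')   # 0->0 needs J=0, 0->1 needs J=1; K is X either way
    return ('X', 1 - q_next)   # 1->0 needs K=1, 1->1 needs K=0; J is X either way

# Illustrative next-state function: a 3-bit up-counter, with Q1 as the MSB.
for state in range(8):
    q1 = (state >> 2) & 1
    q1_next = (((state + 1) % 8) >> 2) & 1
    j1, k1 = excitation(q1, q1_next)
    assert (j1 == 'X') == (q1 == 1)   # the "trick": J1 is a don't care iff Q1 = 1
    print(f"Q1={q1} -> Q1+={q1_next}: J1={j1} K1={k1}")
```

Half the rows of the J1 column are guaranteed don't cares, which is exactly why the K-map drops from 5 variables to 4.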
Best Answer
Your post is naive, but that is not necessarily a bad thing, for two reasons: 1. There are others in your position who can benefit from your question. And 2. Sometimes people with decades of experience need to revisit these sorts of subjects to refresh their memory of what is important in circuits.
There are some terms that EEs learn in school and in textbooks but rarely use professionally. "Sequential logic" is one of them. The professional term is "state machine". A state machine is essentially the guts of what you think of as sequential logic.
A "state" is simply the current condition of something. The state of a counter is the count value itself. A state of a stoplight is Red, Yellow, or Green.
When you say "memory is the ability to store and retrieve past signals", you are correct--but nobody talks like that. We say that the state is stored. It is a minor point, but an important one. Storing a past signal implies that you are storing a signal that changes over time. Storing a state is storing the instantaneous value of the state. Take that little bit of knowledge and tuck it away in your brain for later, when it will make sense to you.
For us, there are two basic types of logic circuits: combinatorial and memory. Combinatorial logic is just logic where the outputs depend only on the inputs. It is a cluster of gates with no feedback paths (where gates downstream do not feed inputs of upstream gates). Memory is the opposite of combinatorial logic, in that it stores a value or state for use later. The basic building blocks for memory are flip-flops and latches. Actual RAM can also be used to store state values, but that is a more advanced use.
The core of a state machine (or what you are calling sequential logic) consists of a single block of combinatorial logic, and a chunk of memory to store the output of the combinatorial logic. The output of the memory is fed back into the combinatorial logic. If you are designing a counter, then the combinatorial logic might take the input and add 1 to it. The memory will save that +1 value for the next clock.
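The structure just described can be sketched in a few lines of Python (a toy model, not real hardware; the 3-bit counter and the names are illustrative):

```python
# A state machine as described above: one combinatorial block computing the
# next state, plus memory (a register) that holds the state between clocks.

def next_state(state):
    """Combinatorial logic: the output depends only on the present input."""
    return (state + 1) % 8    # a 3-bit counter: add 1, wrap at 8

state = 0                     # the memory: holds the current state
for _ in range(10):           # each loop iteration models one clock edge
    state = next_state(state) # the register captures the combinatorial output
print(state)                  # 10 counts from 0, modulo 8
```

The loop body is the whole pattern: combinatorial output computed from the stored state, then written back into the memory for the next clock.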
Usually connected to state machines is another chunk of combinatorial logic and possibly some more memory to handle the outputs of the state machine (different from the state value itself). An example of this would be an extra signal from our counter that goes high every time the counter is equal to 4.
Where this extra combinatorial logic (and maybe more memory) sits in relation to the core combinatorial/memory logic is what determines whether this is a Mealy or Moore state machine. I bring up the Mealy and Moore terms only because this is another example of something that is taught in schools but almost never used professionally.
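Extending the toy counter model with that output logic looks like this (a Moore-style sketch, since the output depends on the state alone; names are illustrative):

```python
def next_state(state):
    return (state + 1) % 8    # core combinatorial logic (3-bit counter)

def output(state):
    return state == 4         # extra combinatorial logic on the state only
                              # (depends on state alone => Moore-style)

state = 0
flags = []
for _ in range(8):            # one full count cycle
    state = next_state(state)
    flags.append(output(state))
print(flags)                  # True exactly once, when the count hits 4
```

Feeding the machine's inputs into `output` as well, instead of the state alone, would make it the Mealy arrangement.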
But with all this talk about "memory", we have a problem. The way this term is used in this discussion is different than how it is normally used. When you say "memory" to most people they think of RAM and ROM. But memory in this context is normally flip-flops and latches. Usually D-Flip-Flops. The DFF's in a counter will hold one word, and only one word. RAM, on the other hand, will store many words at a time. It is hard to tell from your question, but I think that you are confusing RAM with Flip-Flops and Latches.
Now on to your question: If we can make memory with combinational circuits, why are sequential elements so highly regarded as fundamental to memory?
You can make memory with gates, and you can make combinatorial logic with gates. But combinatorial logic is not memory. In fact, the definition of combinatorial logic is "logic without memory". But almost every useful circuit is made from both memory and combinatorial logic.
What I do not understand from your question is what kind of memory you are referring to. But ultimately it doesn't matter, because sequential elements are not fundamental to either kind of memory. It is the opposite, in fact: memory is fundamental to sequential logic (a.k.a. state machines).
When looking at state machines, sequential logic, synchronous logic, and the like it can be useful to break up the logic into combinatorial logic and flip-flops. Don't break it up in the actual design, but break up how you think of the circuit. This will help you in identifying the parts that matter. It will also help you later on when you have to start thinking about signal timing, clocks, and all of that stuff.
I also advise that you ignore RAM/ROM for now until you understand the rest of this. There is no sense in complicating things at this stage.