I teach the one and only computer architecture course at a liberal arts college. The course is required for the computer science major and minor. We do not have computer engineering, electrical engineering, or any other hardware courses. My primary goal in the course is for students to understand, all the way down to the gate level, how computers work, which I believe they learn best through a hardware lab and not just through a textbook (Computer Organization and Design by Patterson and Hennessy). My secondary goal is to excite them about computer architecture and increase their excitement about computer science. Preparing them directly for industry is not a goal, although motivating them to study more computer architecture is. The students have generally not had any experience building anything or taking a college-level lab course. Typically, 10-15 students take the course per semester.
I have been teaching the course since 1998 in a manner similar to how I was taught computer architecture and digital electronics back in the late 1980s at MIT: using DIP TTL chips on powered breadboards. On the first hardware lab assignment, students build a full adder. About halfway through the semester, they start building a simple computer with an 8-bit instruction set. To reduce wiring, I provide them with a PCB containing some of the electronics (two D flip-flops, two 4-bit LS181 ALUs wired together to act as an 8-bit ALU, and a tri-state buffer). In the first of these labs, they derive the (very simple) control signals for the two instruction formats and build the circuit, entering instructions on switches and reading results from lights. In the second lab, they add a program counter (two LS163s) and an EPROM (which my original question was about, before it turned into a question about how I should teach intro architecture). In the final lab, they add a conditional branch instruction. While the students spend a fair amount of time wiring and debugging, I feel that's where much of the learning takes place, and students leave with a real sense of accomplishment.
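For readers who haven't seen the course materials, the logic of that first full-adder lab, and the way carries ripple through the cascaded 4-bit ALUs, can be sketched as a behavioral model in software. This is only an illustration of the gate-level logic the students wire up, not the actual TTL circuit or its control signals:

```python
def full_adder(a, b, cin):
    """One-bit full adder from gate-level operations (bits 0/1):
    sum = a XOR b XOR cin; carry-out = majority(a, b, cin)."""
    s = a ^ b ^ cin
    cout = (a & b) | ((a ^ b) & cin)
    return s, cout

def ripple_add8(x, y, cin=0):
    """Add two 8-bit values by rippling the carry through
    eight full adders, analogous to chaining the carry-out of
    the low 4-bit ALU into the carry-in of the high one."""
    result = 0
    carry = cin
    for i in range(8):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result & 0xFF, carry  # 8-bit sum and carry-out

print(ripple_add8(0x2A, 0x17))  # (0x41, 0)
```

Students of course build this out of physical XOR, AND, and OR gates rather than code, but the model captures the dataflow they have to get right on the breadboard.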
People on this forum have been telling me, though, that I should switch to FPGAs, which I haven't worked with before. I'm a software engineer, not a computer engineer, and have now been out of school for a while, but I am capable of learning. I wouldn't be able to get much money (maybe a few thousand dollars) for replacing our existing digital trainers. We do have a single logic analyzer.
Given my goals and constraints, would you EEs recommend that I stick with my current approach or switch to one based on FPGAs? If the latter, can you give me any pointers to materials with which to educate myself?
As requested, here is a link to the syllabus and lab assignments.
Addition: Yes, it is a digital logic course too. When I arrived at my college, students were required to take one semester each of computer architecture and digital logic, and I combined them into a single semester. Of course, that's a statement about the past, not the future.