Sounds like a cool project.
Many circuits can be designed entirely out of pure digital logic (things that can be implemented out of ideal NAND gates),
such as most of the components of a CPU design.
Some of my favorite digital chips are the 4:1 mux 74AC153, the 2:1 mux 74HC157 (multiplexers are the tactical nuke of logic design), and the 74HC595 shift register.
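As a sanity check that "pure digital logic" really does reduce to NAND gates, here is a minimal sketch (function names are mine, not from any particular tool) of one channel of a 74HC157-style 2:1 mux built from four NANDs:

```python
def nand(a: int, b: int) -> int:
    """Ideal NAND gate: output is 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def mux2(sel: int, a: int, b: int) -> int:
    """2:1 mux from 4 NANDs: returns a when sel == 0, b when sel == 1."""
    not_sel = nand(sel, sel)          # NAND as inverter
    return nand(nand(a, not_sel), nand(b, sel))
```

The same decompose-into-NANDs exercise works for the 4:1 mux, the shift register, and (tediously) most of a CPU.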
If you want people to be able to simulate complete CPUs with your software,
there are only a few remaining parts of a CPU that cannot be built out of ideal NAND gates:
- three-state logic for interfacing to the bidirectional "data" lines of a typical RAM chip. Perhaps one tri-state buffer and one tri-state latch, modeled on chips such as the 74HCT244, 74HCT245, 74AC253, 74HCT374, 74HC541, 74HC573 (an improved 74HCT373), or 74HC574.
- open-collector logic for interfacing to the bidirectional "data" lines of I2C Flash EEPROM chips
- an oscillator
- some sort of input (pushbuttons and switches; perhaps something that can simulate a hexadecimal keypad)
- some sort of output (LEDs, perhaps pre-arranged in 7-segment displays; perhaps a simulated HD44780 LCD controller; perhaps a simulated bitmap display as used in Nisan and Schocken's (N&S) Building a Modern Computer from First Principles).
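The three-state case is the one most simulators get wrong. One common approach (a sketch, not any particular tool's API) is to add a third signal value 'Z' for high impedance and resolve each bus line from all of its drivers:

```python
Z = "Z"  # high-impedance: the driver is not driving the line

def tristate_buffer(data: int, enable: int):
    """Like one channel of a 74HC541: drive data when enabled, else float."""
    return data if enable else Z

def resolve_bus(drivers):
    """Combine the outputs of every driver sharing one bus line."""
    active = [d for d in drivers if d != Z]
    if not active:
        return Z  # nobody driving: the line floats (often worth a warning)
    if all(v == active[0] for v in active):
        return active[0]
    raise ValueError("bus contention: conflicting drivers on one line")
```

Open-collector lines can reuse the same resolver if each driver outputs only 0 or Z, with a pull-up supplying the 1 when every driver floats.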
"Hierarchical design" is nice. It would be nice if your users could place a box labeled "32-bit adder" on one schematic as if it were a single chip, then open up that adder to build its implementation as several "74181" ALU chips, then open up that ALU to build its implementation as several simpler chips.
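A minimal sketch of how that hierarchy could be represented (class and method names are hypothetical, chosen only for illustration): each box on a schematic is either a primitive or opens into a sub-schematic of more boxes.

```python
class Box:
    """A schematic symbol: either a primitive chip or a sub-schematic."""
    def __init__(self, name, implementation=None):
        self.name = name
        self.implementation = implementation or []  # inner Boxes, if any

    def flatten(self):
        """Expand the hierarchy down to its primitive (leaf) boxes."""
        if not self.implementation:
            return [self.name]
        leaves = []
        for inner in self.implementation:
            leaves.extend(inner.flatten())
        return leaves

# The "32-bit adder" box opens into eight 4-bit "74181" ALU slices;
# each of those could in turn open into NAND gates.
adder32 = Box("32-bit adder", [Box("74181") for _ in range(8)])
```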
Perhaps you could take the extreme N&S approach and build everything out of ideal NAND gates (except for the above-mentioned things that cannot be built out of ideal NAND gates).
If your users want some other 74xxx digital chip, they could build it themselves out of NAND gates and three-state buffers and open-collector buffers.
It would be nice if your simulator could warn people about circuits that might produce glitches or unexpected state transitions.
In other words, instead of simply assuming one particular input-to-output delay,
I wish the simulator checked that the design is robust enough to handle the full range of possible input-to-output gate delays and output-to-input transmission-line delays (from zero delay to maximum delay) -- even if today we use a fast chip here and a slow chip there, and tomorrow we use a slow chip here and a fast chip there.
That seems far more useful to me than precisely simulating the exact delays of one particular chip at one particular temperature.
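One way such a check could work (a sketch under my own assumptions, not a known tool): give every gate a (min, max) delay interval instead of a single number, propagate arrival-time intervals along each path, and warn when two reconverging paths have overlapping arrival windows -- the classic setup for a glitch.

```python
def arrival_interval(path_delays):
    """Total (min, max) arrival time along one path of (min, max) gate delays."""
    lo = sum(d[0] for d in path_delays)
    hi = sum(d[1] for d in path_delays)
    return (lo, hi)

def may_glitch(path_a, path_b):
    """Two reconverging paths can glitch if their arrival windows overlap,
    i.e. either path could finish first depending on actual chip speeds."""
    a_lo, a_hi = arrival_interval(path_a)
    b_lo, b_hi = arrival_interval(path_b)
    return a_lo <= b_hi and b_lo <= a_hi
```

With intervals like (0, max) per gate, this flags any circuit whose correctness depends on one path reliably beating another, regardless of which chip happens to be fast today.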
I think sticking with ideal digital logic (with the above exceptions), and temporarily ignoring power dissipation, PCB design, etc., will already be plenty useful -- enough to do high-level design of entire CPUs.
Here is a better way: Don't!
The best way to document the design is with the original code (VHDL or Verilog). It is the most accurate description, and it will always be up to date. For simple designs, schematics might be more readable, but that is not true for medium-to-large FPGA designs. At that size, schematics become large, cumbersome, hard to follow, difficult to modify or update, and impossible to debug. So just don't use them.
When I have to document an FPGA design with schematics, I do it with Visio and make it more of a high-level block diagram than a schematic. Doing anything with more detail is a frustrating and fruitless task.
I would also argue that VHDL/Verilog is not difficult to follow; if someone can't follow it, the problem is the person, not the code. I have some FPGA designs that, printed out, would take up about 800 pages of standard paper. The same design, shown as schematics, would require 2,000+ large pages. It is easier to get proficient at reading VHDL/Verilog than to transcribe 800 pages of VHDL code into schematics (and then keep them up to date and accurate).
This kind of analyzer is 40-year-old technology; see HP's publications from around 1977. It was suitable for relatively primitive digital logic. Today it has been replaced by JTAG boundary scan, by BIST (built-in self-test, not "Behavior Intervention Support Team"), then by ICE (in-circuit emulators), and finally by real-time, full-blown on-chip logic analyzers and VISA debug interfaces -- or a combination of all of the above. There are some more advanced debug tools in the works.