Calculating FSM’s maximum clock frequency

Tags: clock, frequency, state-machines

Let's assume we have the truth table for our finite state machine.

How can we determine the maximum clock frequency for the system, under the assumptions that the wire delay is 0.3 ns, the flip-flop setup time is 0.2 ns, and the gate delays are 0.4 ns for 2-input gates and 0.5 ns for gates with more than 2 inputs?

First, I guess I should find the Boolean function for each output: both the outputs to the outside world and the next-state outputs fed back to the flip-flops.
Then what?

Thanks in advance. Btw, this is my first question in electronics 🙂

Best Answer

Timing is fully dependent on the implementation. You will need to fully specify all of the state equations, transition equations, and output equations. Then you go through a minimization process to reduce them to a minimal set. After that you can map your chosen standard gates and flip-flops onto those logic equations. Once you have identified the worst-case (longest) register-to-register path, the maximum clock frequency follows directly: the clock period must be at least that path delay plus the flip-flop setup time. It sounds like you know where to start, but there is some work to do before you can start throwing around timing numbers.
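For illustration only, here is a minimal sketch (plain Python arithmetic) of that last step once a critical path is known. The path assumed below — one gate with more than 2 inputs followed by one 2-input gate, with a wire segment after each gate, and clock-to-Q ignored since the problem does not give it — is purely hypothetical; your real path comes from your minimized equations.

```python
# Given timing parameters from the problem statement (all in ns)
T_WIRE  = 0.3   # wire delay
T_SETUP = 0.2   # flip-flop setup time
T_GATE2 = 0.4   # delay of a 2-input gate
T_GATE3 = 0.5   # delay of a gate with more than 2 inputs

# Hypothetical critical path: FF output -> 3-input gate -> wire
# -> 2-input gate -> wire -> FF input (setup). Clock-to-Q is
# omitted because the problem does not specify it.
critical_path_ns = T_GATE3 + T_WIRE + T_GATE2 + T_WIRE + T_SETUP

f_max_ghz = 1.0 / critical_path_ns   # 1 / ns = GHz
print(f"Critical path: {critical_path_ns:.1f} ns "
      f"-> f_max ~ {f_max_ghz * 1000:.0f} MHz")
```

With these assumed numbers the path is 1.7 ns, giving roughly 588 MHz; a deeper logic cone or extra wire hops in your actual implementation would lower that figure accordingly.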