PC serial communication with FPGA


I am designing a simple 16-bit adder in a Xilinx Spartan-6 FPGA on a Digilent board. The Verilog design accepts two 16-bit inputs, A and B, and returns the 16-bit sum C = A + B. I am ignoring carry-in and carry-out.

I want to send A and B from the PC to the FPGA over the serial port, and send the sum C back from the FPGA to the PC the same way. I am not sure how to do this; I have googled but could not find anything simple. Can I send A and B as decimals, like A = 5 and B = 3, or do I have to send them as ASCII? How do I distinguish A from B if, say, A = 255 and B = 512 (i.e., both A and B have multiple digits)? And how does the FPGA deal with ASCII?

Some pointers or explanations would be really appreciated.

Best Answer

I would suggest either writing a UART module from scratch or finding one online. Then all you would need to do is write a wrapper that interfaces the UART with your registers. Here is one possible open-source Verilog UART module that I wrote a while ago:


This particular module uses the AXI stream interface, so it should not be very difficult to interface with your design.

The AXI stream interface has three signals: tdata, tvalid, and tready. tvalid indicates that tdata contains valid data, and tready indicates that the sink is ready to receive the data. Data bytes are transferred only when tvalid and tready are both high.
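To make the handshake concrete, here is a small Python sketch that models one cycle of the valid/ready rule described above. The signal names mirror tdata, tvalid, and tready; the function itself is purely illustrative and not part of the actual Verilog module.

```python
def stream_bytes(data, ready_pattern):
    """Toy model of an AXI-stream source driving a sink.

    data:          bytes the source wants to send
    ready_pattern: the sink's tready value on each clock cycle

    The source asserts tvalid whenever it has a byte pending and must
    hold tdata stable until the sink raises tready; a byte is transferred
    only on cycles where tvalid and tready are both high.
    """
    transferred = []
    i = 0
    for tready in ready_pattern:
        if i >= len(data):
            break                       # nothing left: tvalid would be low
        tvalid = True                   # source has a byte pending
        if tvalid and tready:           # transfer completes this cycle
            transferred.append(data[i])
            i += 1
        # else: sink is stalling; source holds tdata and tvalid for next cycle
    return transferred
```

For example, with three bytes and a sink that stalls on the second cycle, all three bytes still arrive in order, just one cycle later, which is the backpressure behavior the handshake gives you for free.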

What I would recommend is defining a simple serial protocol with some basic framing. Say, to send A and B to the FPGA, you would send some sort of start indication (e.g. 0, or perhaps ASCII S or W), then the MSB of A, then the LSB of A, then the MSB of B, then the LSB of B. Then you can write a state machine that looks for the start indication and loads the next four bytes into the appropriate registers. Once the operation is complete, the state machine can send the result back to the computer.

If you want to use ASCII instead of binary, you can do that too, but it is a bit more complicated to convert everything. If you want something human-readable, I would recommend hex rather than decimal: it is much easier to divide by 16 than by 10 (a bit shift/bit slice instead of an actual division operation).
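On the PC side, the framing above can be sketched in a few lines of Python. This assumes the ASCII 'S' start byte and MSB-first byte order suggested above; the function names are my own, and in practice you would write the resulting frame to the serial port with something like pyserial's `ser.write()` and then read back the two bytes of C.

```python
import struct

START = ord('S')  # start-of-frame byte (one of the options suggested above)

def build_request(a, b):
    """Frame a request: start byte, then A and B each as a
    big-endian (MSB-first) 16-bit value -- 5 bytes total."""
    return struct.pack('>BHH', a & 0xFFFF and START or START, a, b) \
        if False else struct.pack('>BHH', START, a, b)

def parse_request(frame):
    """Inverse of build_request: recover (A, B) from a 5-byte frame,
    checking the start byte the FPGA state machine would look for."""
    start, a, b = struct.unpack('>BHH', frame)
    if start != START:
        raise ValueError("missing start-of-frame byte")
    return a, b
```

The `>BHH` format string is what enforces the protocol: `>` selects big-endian (MSB first), `B` is the single start byte, and each `H` is an unsigned 16-bit value, so A = 255 and B = 512 are unambiguous even though they have different numbers of decimal digits.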
