Importance of baud rate in USB CDC

microcontroller, usb

As I understand it, in PC-to-MCU communication over USB CDC, specifying a baud rate is only necessary when the MCU forwards its received data to a UART. The bridging device needs the baud rate to know what signalling rate to use on the TX/RX lines.

For example, the ATmega16U2 paired with the ATmega328P on the Arduino Uno is responsible for receiving USB packets from the PC and translating them into UART signals for the ATmega328P. So I can see why it is important that the ATmega16U2 and ATmega328P agree on a baud rate.
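Concretely, the ATmega328P's half of that agreement is set in the sketch, while the ATmega16U2's half follows whatever rate the PC terminal selects; a minimal example (9600 is just an illustrative rate):

    void setup() {
        // Sets the ATmega328P's hardware UART rate. The ATmega16U2's UART
        // follows the rate the PC terminal selects, so the two must match.
        Serial.begin(9600);
    }

    void loop() {
        Serial.println("hello"); // readable only if the terminal is also at 9600
        delay(1000);
    }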

I've read somewhere that the PC ignores any baud rate setting used with USB CDC. And yet, when the wrong baud rate is set in the PC serial terminal, gibberish is received, even though the data arrives over USB without a trace of UART stench on it. The only case where I think the baud rate should matter on the PC's end is when the PC is transmitting data to the bridge: I imagine the baud rate is sent along with the data so that the bridge knows at what speed to signal the receiving UART.
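For context, on a POSIX host the serial terminal typically applies the chosen rate with a termios call along these lines; the device path and rate here are just examples. The CDC ACM driver passes the setting on to the device rather than using it to pace the USB transfers:

    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>

    int main(void) {
        // Hypothetical path of a CDC ACM port on Linux.
        int fd = open("/dev/ttyACM0", O_RDWR | O_NOCTTY);
        if (fd < 0) return 1;

        struct termios tio;
        tcgetattr(fd, &tio);
        cfsetispeed(&tio, B115200);   // requested baud rate
        cfsetospeed(&tio, B115200);
        tcsetattr(fd, TCSANOW, &tio); // the driver forwards this to the device
        close(fd);
        return 0;
    }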

What is wrong with my assumptions? When and why is baud rate relevant at the PC's end in USB CDC communications?

Best Answer

The ATmega16U2 in the Arduino Uno acts as a USB-to-serial converter, connected to the ATmega328P that you're programming. Setting the baud rate on the USB CDC serial device causes the ATmega16U2 to reconfigure its UART to that rate, so an incorrect baud rate prevents proper communication between the two chips.
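The mechanism is the CDC ACM SET_LINE_CODING control request: the host driver sends the baud rate out-of-band as a 7-byte structure, and the bridge firmware applies it to its hardware UART. A rough sketch of the 16U2 side, with a hypothetical on_set_line_coding() handler standing in for the firmware's real callback (F_CPU is assumed to come from the build flags):

    #include <avr/io.h>
    #include <stdint.h>

    // Payload of the CDC ACM SET_LINE_CODING request, per the USB CDC PSTN spec.
    typedef struct {
        uint32_t dwDTERate;   // baud rate, little-endian
        uint8_t  bCharFormat; // 0 = 1 stop bit, 1 = 1.5, 2 = 2
        uint8_t  bParityType; // 0 = none, 1 = odd, 2 = even, 3 = mark, 4 = space
        uint8_t  bDataBits;   // 5, 6, 7, 8 or 16
    } cdc_line_coding_t;

    // Hypothetical callback: apply the host's requested rate to the UART.
    void on_set_line_coding(const cdc_line_coding_t *lc) {
        // Double-speed async mode: UBRR = F_CPU / (8 * baud) - 1
        uint16_t ubrr = (uint16_t)(F_CPU / (8UL * lc->dwDTERate) - 1);
        UCSR1A |= (1 << U2X1);         // the ATmega16U2's UART is USART1
        UBRR1H = (uint8_t)(ubrr >> 8);
        UBRR1L = (uint8_t)ubrr;
    }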

If you were using an Arduino based on an ATmega32U4 -- such as the Arduino Leonardo or Arduino Micro -- the USB CDC baud rate would not matter, as the AVR interprets "serial" data directly from the USB stream. (Indeed, you can send data in either direction however fast you want; the baud rate has no effect on transfer rates.)
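To illustrate, a sketch along these lines on a Leonardo or Micro echoes data at whatever rate USB delivers it, regardless of the terminal's baud setting; only Serial1, the ATmega32U4's hardware UART, actually signals at the requested rate (115200 is just an example):

    void setup() {
        // Serial is the native USB CDC port on a 32U4 board: the argument
        // is accepted but does not limit the data rate.
        Serial.begin(115200);

        // Serial1 is the hardware UART on pins 0/1; here the baud rate
        // genuinely sets the signalling rate on the TX/RX lines.
        Serial1.begin(115200);
    }

    void loop() {
        // Echo USB data back to the host; throughput is bounded by USB,
        // not by the number passed to Serial.begin().
        if (Serial.available()) {
            Serial.write(Serial.read());
        }
    }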