Electronics – USB: what are the advantages (or disadvantages) of using HID over serial-over-USB

hid, usb, usb-device

Disclaimer: I'm a novice at electronics and even more so at working with USB, so please bear with me if I misunderstand some of the essentials of how USB works. Any corrections welcome!

The scenario: with a group of friends we are building a robotic sailing boat. We manage all the on-board sensors through a µC (AVR), but the sailing AI runs on an embedded Linux system. The electronics are connected to the AI hardware via USB, and I am looking for advice on which protocol to use for the transmission.

We have already tried both serial-over-USB and HID, and both worked well enough™ in our short, near-coast tests, but before settling on one or the other I'd like to know if there are some non-trivial but nevertheless important differences between the two that I have failed to consider. For our project the most important characteristics are:

  • Reliability: our robot is due to sail autonomously for a few days. Is
    either of the two protocols inherently "safer", i.e. with better error
    handling or self-recovery functionality?
  • Throughput: although in normal operating conditions our throughput may
    well stay under 1 kbit/s, under certain conditions we will need to
    harvest data in near real-time. Serial is limited to 115 kbit/s; does
    HID have a speed limit other than the 1.5 Mbit/s of low-speed USB 1.0?

…but as I said: I'm pretty new to electronics/USB, so if you feel I am missing a key parameter, I'll be glad to hear about it.

Thanks in advance for your time and expertise!

Best Answer

To determine which solution will be best for throughput, you need to find the bottleneck and improve it until it is no longer the bottleneck, then repeat until there is nothing left to improve.

  1. Your AVR has a fixed maximum clock frequency; aim to make its (optimized) processing routines the limiting factor. If your communications protocol can handle that speed, there's nothing more you can do.
  2. USB at 1.5 Mbps represents moving 1 byte every 160 clocks. That's quite a lot of time to do processing, and if you're triple-buffering your data or sending out full blocks you'll want this to be even faster. At 12 Mbps you get about 13 clock cycles per byte, which is much more difficult to achieve and would be a reasonable goal.
  3. The interface to USB (and USB itself) carries some overhead. Sending ASCII data over an asynchronous serial interface adds a lot of it. A parallel FIFO USB interface chip will be much faster, and a micro with a built-in USB peripheral better yet (see the sketch after this list).
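
To make point 3 concrete, here is a minimal sketch (the frame layout and field names are my own illustration, not something from the question) comparing an ASCII telemetry line with a packed binary frame. The binary version carries the same data in roughly a third of the bytes and skips the printf-style formatting on the AVR entirely:

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical telemetry sample -- field names are illustrative only. */
    struct sample {
        uint32_t timestamp_ms;   /* milliseconds since boot        */
        int16_t  wind_dir_cdeg;  /* wind direction, centi-degrees  */
        int16_t  heel_cdeg;      /* heel angle, centi-degrees      */
        uint16_t rudder_raw;     /* raw ADC reading for the rudder */
    };

    /* ASCII encoding: human-readable, but ~3x the bytes plus a printf call. */
    static size_t encode_ascii(const struct sample *s, char *out, size_t len)
    {
        return (size_t)snprintf(out, len, "T=%lu,W=%d,H=%d,R=%u\n",
                                (unsigned long)s->timestamp_ms,
                                s->wind_dir_cdeg, s->heel_cdeg,
                                (unsigned)s->rudder_raw);
    }

    /* Binary encoding: fixed 10-byte payload, just copies, no formatting. */
    static size_t encode_binary(const struct sample *s, uint8_t *out)
    {
        memcpy(out + 0, &s->timestamp_ms,  4);
        memcpy(out + 4, &s->wind_dir_cdeg, 2);
        memcpy(out + 6, &s->heel_cdeg,     2);
        memcpy(out + 8, &s->rudder_raw,    2);
        return 10;
    }

    int main(void)
    {
        struct sample s = { 123456, 2705, -312, 512 };
        char    ascii[64];
        uint8_t bin[10];

        size_t a = encode_ascii(&s, ascii, sizeof ascii);
        size_t b = encode_binary(&s, bin);
        printf("ASCII frame: %zu bytes, binary frame: %zu bytes\n", a, b);
        return 0;
    }

On a real link you would also fix the byte order and add framing, but the size comparison is the point here: fewer bytes per sample means fewer bus transactions and fewer cycles spent formatting.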

To get more reliability, you need to stick with what works and what you know best. Relax the specs. Here, a tried-and-true serial interface is a fine choice.
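
Neither raw serial-over-USB nor HID gives you application-level error recovery for free, so whichever transport you pick, a thin framing layer with a checksum and a resync rule buys most of the "self-recovery" the question asks about. A minimal sketch follows; the sync byte, frame layout, and CRC choice are assumptions of mine, not anything mandated by either protocol:

    #include <stdint.h>
    #include <stddef.h>

    #define SYNC_BYTE   0x7E   /* assumed start-of-frame marker */
    #define MAX_PAYLOAD 32

    /* Dallas/Maxim CRC-8, bitwise -- small and adequate for short frames. */
    static uint8_t crc8(const uint8_t *data, size_t len)
    {
        uint8_t crc = 0;
        while (len--) {
            crc ^= *data++;
            for (int i = 0; i < 8; i++)
                crc = (crc & 1) ? (crc >> 1) ^ 0x8C : crc >> 1;
        }
        return crc;
    }

    /* Feed received bytes one at a time; returns the payload length when a
     * valid frame [SYNC][LEN][PAYLOAD...][CRC] completes, 0 otherwise.
     * A bad CRC or bad length simply drops the frame and re-arms on the
     * next SYNC, so a corrupted byte costs one frame, not the stream. */
    static int frame_feed(uint8_t byte, uint8_t *payload)
    {
        static enum { WAIT_SYNC, WAIT_LEN, WAIT_DATA, WAIT_CRC } state = WAIT_SYNC;
        static uint8_t buf[MAX_PAYLOAD];
        static uint8_t len, pos;

        switch (state) {
        case WAIT_SYNC:
            if (byte == SYNC_BYTE) state = WAIT_LEN;
            break;
        case WAIT_LEN:
            if (byte == 0 || byte > MAX_PAYLOAD) { state = WAIT_SYNC; break; }
            len = byte; pos = 0; state = WAIT_DATA;
            break;
        case WAIT_DATA:
            buf[pos++] = byte;
            if (pos == len) state = WAIT_CRC;
            break;
        case WAIT_CRC:
            state = WAIT_SYNC;
            if (byte == crc8(buf, len)) {
                for (uint8_t i = 0; i < len; i++) payload[i] = buf[i];
                return len;
            }
            break;
        }
        return 0;
    }

The same parser can run on the Linux side over bytes read from the USB device; on the AVR side the sender just prepends SYNC and LEN and appends the CRC to each payload.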


How does "serial over USB" work in your application?

If it's a software driver on the embedded Linux device talking to a USB <-> parallel IC (or to a virtual serial port implemented on a USB-capable AVR, which sounds likely since you've tried HID mode), then you'll have no problems with higher bit rates.
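
For completeness, a minimal sketch of the Linux-side setup for that case. The /dev/ttyACM0 path is an assumption: a CDC-ACM port implemented on the micro itself usually enumerates under that name, while an external USB-serial adapter typically appears as /dev/ttyUSB0. For a native CDC device the baud-rate setting is essentially a parameter handed to the firmware; the bytes still move at USB speed:

    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <termios.h>
    #include <unistd.h>

    int main(void)
    {
        /* Device node is an assumption -- see the note above. */
        int fd = open("/dev/ttyACM0", O_RDWR | O_NOCTTY);
        if (fd < 0) { perror("open"); return 1; }

        struct termios tio;
        if (tcgetattr(fd, &tio) < 0) { perror("tcgetattr"); return 1; }

        cfmakeraw(&tio);              /* binary-safe: no echo, no CR/LF mangling */
        cfsetispeed(&tio, B115200);   /* for a native CDC device this number is  */
        cfsetospeed(&tio, B115200);   /* advisory; the USB link sets the pace    */
        tio.c_cc[VMIN]  = 1;          /* block until at least one byte arrives   */
        tio.c_cc[VTIME] = 0;

        if (tcsetattr(fd, TCSANOW, &tio) < 0) { perror("tcsetattr"); return 1; }

        uint8_t buf[64];
        ssize_t n = read(fd, buf, sizeof buf);   /* hand these bytes to the frame parser */
        printf("read %zd bytes\n", n);

        close(fd);
        return 0;
    }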

However, true USB <-> serial ICs do exist and are popular (the FTDI FT232R on older Arduinos comes to mind...); they turn USB into logic-level asynchronous RS-232-style serial communication. You don't need or want this layer for your application: it will reduce your throughput and add an extra asynchronous delay stage into your data path.