USB has been around for quite a long time now, and is used to some extent even in the automotive, marine and space industries. However, some communication buses are more reliable than others in terms of:
Possibility of hot redundancy
Protection of the connected equipment against failures in other connections (surges, short circuits, misbehaviour in data, voltage levels or timing…) or in the hub/switch itself
Long mean time between failures (whether a "freeze" or a hardware failure)
The raw error rate usually matters a bit less, since detection and correction (which may require retransmission) are handled by the protocol library/stack.
I would like to know how reliable a system based on a USB hub is (say, 16 ports, daisy-chained or not). What commonly goes wrong, and what could be done (protections, topologies…) to keep a single-point failure from spreading? Ideally I would just swap the faulty equipment and it would be ready to go.
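To make that swapping practical, the PC would need to notice a dropped device quickly. Something along these lines is what I have in mind, as a minimal sketch assuming Linux (sysfs) and that every device reports a unique serial number; the inventory below is a placeholder:

```python
#!/usr/bin/env python3
"""Poll the USB bus and flag devices that have dropped off.

Minimal sketch: assumes Linux sysfs and that each device reports a
unique serial number. EXPECTED is a placeholder inventory.
"""
import time
from pathlib import Path

# Placeholder serial numbers for the devices I expect on the bus.
EXPECTED = {"PID-CTRL-01", "PID-CTRL-02", "THERMO-01", "UPS-01"}

SYSFS = Path("/sys/bus/usb/devices")

def present_serials():
    """Serial numbers of every device currently enumerated."""
    serials = set()
    for dev in SYSFS.iterdir():
        serial_file = dev / "serial"
        if serial_file.is_file():
            serials.add(serial_file.read_text().strip())
    return serials

while True:
    missing = EXPECTED - present_serials()
    if missing:
        # A dead device, cable or hub shows up within one polling
        # interval; several serials vanishing at once points at the hub.
        print(f"ALERT: missing from the bus: {sorted(missing)}")
    time.sleep(5)
```

Polling sysfs avoids extra dependencies; a udev rule could do the same thing event-driven.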
For a specific piece of equipment at work I need to interface 16 devices to both a primary PC and a redundant PC (which takes over if the primary crashes). 75% of the devices have both Ethernet and USB, and 25% are USB-only; I'm wondering whether I should simplify everything (which may also increase reliability) by using USB alone, or keep both to maximise reliability. For USB, switching between the two computers would be done with a manual switch.
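For detecting the crash itself, I was picturing a plain heartbeat between the two PCs; a minimal sketch below (the address, port and timeout are placeholder values, and the USB changeover would still be the manual switch mentioned above):

```python
#!/usr/bin/env python3
"""Redundant-PC side: raise an alarm when the primary's heartbeat stops.

Minimal sketch; address, port and timeout are placeholders. The primary
side would send a small datagram about once per second, e.g.:
    sock.sendto(b"alive", ("redundant-pc", 9000))
"""
import socket

TIMEOUT_S = 3.0  # declare the primary dead after this long without a beat

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9000))  # placeholder port
sock.settimeout(TIMEOUT_S)

while True:
    try:
        sock.recv(64)  # any datagram counts as a heartbeat
    except socket.timeout:
        # Keeps alerting every TIMEOUT_S until the operator fails over.
        print("ALERT: primary heartbeat lost; flip the USB switch")
```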
To illustrate, the hub I'm considering is this one: it handles surges below 350 W but doesn't seem to be isolated. I have a hunch they're overusing the term "industrial", so I may swap it for several daisy-chained 7-port hubs. The connected equipment is a set of industrial PID temperature controllers, a UPS and a precision thermometer. That said, this question is meant as a general one.