I'm not asking about manufacturing. I'm asking about designing electronics to survive normal use in the field. I want to figure out just how necessary it is to include TVS diodes in my design.
As I mentioned in my previous question, in the 80s and 90s hardly anyone bothered to include ESD protection on I/O lines, and those devices seem to have survived OK.
I imagine it will depend on what type of ICs the I/O lines are connected to. In the 80s and 90s, those were generally NMOS VLSI chips, early CMOS VLSI chips, and discrete CMOS and TTL logic gates.
Are modern 5V MCUs more vulnerable than 74HC gates, warranting the inclusion of TVS diodes on the I/O pins?
Does the type of connector dictate the degree of ESD protection required? I can see a female D-sub connector being reasonably safe without any ESD protection – unless the cable itself is charged.
If I do need TVS diodes, do I also need series resistors? I looked at the datasheet for a suitable 5V TVS; it specifies a maximum clamping voltage of 24V when shunting a 20 amp ESD spike. If I connect the TVS directly to the I/O pin, the ESD diode inside the IC will still conduct: 24V is far above the internal diode's clamping level of roughly VDD + 0.3V.
I could put a 33 ohm series resistor between the TVS and the I/O pin. That limits the current through the internal ESD diode to well under an amp, which it can probably withstand. But is it really necessary? I have a lot of I/O pins and I'd rather avoid the resistors. Can I rely on the internal ESD diode having a sufficiently high dynamic resistance that the TVS takes the bulk of the discharge?
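To sanity-check the resistor sizing, here's the arithmetic I'm working from (the 5.3V internal clamp is my assumption of VDD plus one diode drop on a 5V rail; the 24V-at-20A figure is from the TVS datasheet):

```python
# Rough current split during an ESD event, assuming the TVS clamps the
# line at 24 V while the IC's internal ESD diode clamps its pin at
# VDD + 0.3 V = 5.3 V (assumed; check the MCU's absolute-maximum ratings).
v_tvs_clamp = 24.0       # V, TVS clamping voltage at a 20 A spike (datasheet)
v_int_clamp = 5.0 + 0.3  # V, assumed internal ESD diode clamp on a 5 V rail
r_series = 33.0          # ohm, proposed series resistor

# The series resistor sees the difference between the two clamp voltages,
# so the current forced into the I/O pin is:
i_pin = (v_tvs_clamp - v_int_clamp) / r_series
print(f"Current into the I/O pin: {i_pin:.2f} A")  # about 0.57 A
```

So with the resistor the internal diode only has to sink roughly half an amp for the duration of the spike, rather than sharing 20 A with the TVS.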