Will a device capable of receiving a 200 Vrms input signal happily accept a 200 VDC signal

Tags: ac, analog, dc, rms, voltage

To the best of my knowledge, Vrms expresses an AC signal as the equivalent DC voltage that would dissipate the same average power in a resistor R; historically this was measured thermally, e.g. with a thermistor sensing the heat given off by the resistor.

Now, if an analog input device specifies that it can accept a 200 Vrms signal, wouldn't it follow that it can happily accept 200 VDC as well? For any waveform the peak voltage is at least as large as the RMS voltage, so a device rated for 200 Vrms must already tolerate instantaneous voltages of at least 200 V.

If the above is correct, then the converse does not hold: a device with an input spec of 200 VDC may not be happy with a 200 Vrms signal, since that AC signal can have peaks above 200 V (about ±283 V for a sine wave), which could damage the input circuitry.
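The peak-versus-RMS relationship above can be checked numerically. This is an illustrative sketch (the waveform amplitudes are chosen for the example, not taken from any device spec): a sine wave needs a peak of about 283 V to reach 200 Vrms, while a square wave and DC reach 200 Vrms with a peak of exactly 200 V.

```python
import math

def rms(samples):
    """Root-mean-square of a list of voltage samples."""
    return math.sqrt(sum(v * v for v in samples) / len(samples))

N = 100_000
t = [i / N for i in range(N)]  # one period, normalized to [0, 1)

# Three waveforms that all come out near 200 Vrms:
sine   = [283.0 * math.sin(2 * math.pi * x) for x in t]  # peak 283 V
square = [200.0 if x < 0.5 else -200.0 for x in t]       # peak 200 V
dc     = [200.0] * N                                     # constant 200 V

for name, wave in [("sine", sine), ("square", square), ("dc", dc)]:
    peak = max(abs(v) for v in wave)
    print(f"{name}: Vrms = {rms(wave):.1f} V, Vpk = {peak:.1f} V")
```

For every waveform Vpk ≥ Vrms, with equality only in the square-wave and DC cases, which is the basis of the argument above.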

Somebody please edify me.

Edit

As requested, here is the device in question: a National Instruments 9255 C-Series module. It actually supports up to 300 Vrms, but the same question applies. Also, here is the datasheet.

Best Answer

The first part of your logic is often correct, but there are instances where it is false. For example, if the input is a transformer, it may be quite happy with a 200 V AC input but would burn out if you apply 200 V DC, because its impedance at DC (i.e., its winding resistance) is very low. This certainly applies to power transformers, for example those in non-switching (linear) power supplies. While probably not many devices have transformer signal-input stages, some certainly do.
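The transformer point can be made concrete with a rough calculation. The component values here are hypothetical, chosen only to show the order-of-magnitude difference between the AC impedance (dominated by the magnetizing inductance) and the DC resistance of the winding:

```python
import math

# Hypothetical mains transformer primary -- assumed values for illustration:
R = 10.0   # winding (DC) resistance, ohms
L = 2.0    # magnetizing inductance, henries
f = 50.0   # line frequency, Hz
V = 200.0  # applied voltage (Vrms for AC, volts for DC)

# AC: impedance magnitude of R in series with the magnetizing inductance.
Z_ac = math.sqrt(R**2 + (2 * math.pi * f * L)**2)
I_ac = V / Z_ac   # a fraction of an amp: the transformer idles happily

# DC: the inductance drops out; current is limited only by winding resistance.
I_dc = V / R      # 20 A
P_dc = V * I_dc   # 4 kW dissipated in the winding: it burns out

print(f"AC: Z = {Z_ac:.0f} ohm, I = {I_ac:.2f} A")
print(f"DC: I = {I_dc:.0f} A, P = {P_dc:.0f} W")
```

With these numbers the AC magnetizing current is around 0.3 A, while the same voltage at DC drives 20 A through 10 Ω, dissipating kilowatts in the winding, hence the burnout.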

The second part of your logic ("the opposite is NOT true") is sometimes false and sometimes true; i.e., your conclusion doesn't follow. Some devices happy with a 200 VDC input may also be happy with a 200 VAC input, and some may not.