What's the easiest way to measure how long a 5V DC signal stays low when it drops from high to low and back to high?
I'm trying to debug a power-loss problem. One of the solutions I'm considering is decoupling the power input by adding a capacitor across the VCC and Gnd pins, but to choose an appropriate capacitance I need to know how long the power drop lasts.
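(For sizing, I'm assuming the usual hold-up relation $C \approx I \,\Delta t / \Delta V$, where $I$ is the load current, $\Delta t$ the dropout duration, and $\Delta V$ the droop the circuit can tolerate. With made-up numbers: a 100 mA load, a 1 ms dropout, and 0.5 V of allowable droop would call for $C \approx 0.1 \times 0.001 / 0.5 = 200\,\mu\text{F}$. The number I'm missing is $\Delta t$.)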
I have a Fluke multimeter and a DS202 DSO, but I'm not familiar with using either for this type of measurement, nor can I find anything in their manuals that explicitly tells me how to make it.
Best Answer
Normally you'd use an oscilloscope with a suitable bandwidth. For example, an inexpensive 50 MHz digital scope would reproduce a 1 µs drop fairly well. If your DS202 has a 1 MHz bandwidth, you should be fine as long as the drop lasts more than about 50 µs.
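The rule of thumb behind those numbers: a scope's 10–90% rise time is roughly $t_r \approx 0.35 / f_{BW}$, so a 1 MHz scope smears edges over about 350 ns. A dropout many times longer than that (hence the ~50 µs figure) will have its width reproduced accurately even though its edges are rounded.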
Most digital scopes can show what happened just before the trigger, so check your manual for how to enable pre-trigger capture. That lets you trigger a single sweep on the drop in voltage and see the entire event displayed.
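As a rough starting point (menu names vary between scopes, so treat these as assumptions to verify against the DS202's manual rather than its exact settings): trigger source = the channel probing VCC, slope = falling, trigger level somewhat below 5 V (say 4 V), mode = single/one-shot, pre-trigger around 50%. Arm the scope, wait for a dropout to fire the trigger, then use the cursors to measure how long the signal stayed low.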