Electronic – Oscilloscope time base


I am working on a personal project to make a remote oscilloscope for Android phones using a PIC32 with a Wi-Fi module.

So far I have done quite a bit, like connecting the Wi-Fi module of the PIC32 to Android and developing the base software on the Android side.

I am now at the step of getting ADC values from the PIC32 and displaying them on the Android device. I am not sure how to make this work like a real scope.

For example, in the software I have set up a timer that, depending on the user's choice, reads the ADC value from the PIC at intervals of, say, 1, 0.5, 0.1 or 0.01 seconds. So let's say the user has selected a time base of 1 second; the program then requests the ADC value at 1-second intervals and connects these data points on the graph to form a waveform. Is this the same as what happens in real scopes?
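In rough pseudocode, the scheme I have in mind looks like this (the two helper methods are placeholders standing in for my actual Wi-Fi and plotting code, not the real implementation):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Sketch of the polling scheme described above: a timer fires at the
// user-selected interval, requests one ADC reading from the PIC32 over
// Wi-Fi, and appends it to the graph.
public class ScopePoller {

    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    /** Start polling at the chosen interval (e.g. 1000, 500, 100 or 10 ms). */
    public void start(long intervalMs) {
        scheduler.scheduleAtFixedRate(() -> {
            int raw = readAdcOverWifi();   // placeholder: one request/response to the PIC32
            addPointToGraph(raw);          // placeholder: append the sample to the plot
        }, 0, intervalMs, TimeUnit.MILLISECONDS);
    }

    public void stop() {
        scheduler.shutdownNow();
    }

    // These stand in for the existing Wi-Fi and plotting code.
    private int readAdcOverWifi() { return 0; }
    private void addPointToGraph(int raw) { }
}
```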

I don't have a function generator or a real scope on hand to check this out!

This is the user interface of the software I wrote for android:

So basically my question is: how should I handle reading values out of the PIC32? Should I just stick to the timer in Java, or should I read as many values as possible and draw them on the screen? How should I handle this?

Best Answer

An oscilloscope plots voltage as a function of time, so your display is reasonable as you show it. However, the term "time base" is meaningless as a label for the X-axis scale. What you want is "s/div" (or ms/div or µs/div). This is independent of the sample rate, although there is little point in using more than a few pixels per sample.
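To make that relationship concrete, here is a rough sketch of the arithmetic (all numbers are made-up assumptions, not values from your project):

```java
// Illustrative arithmetic only: relate the s/div setting, the display
// width, and a fixed A/D sample rate.
public class TimebaseMath {
    public static void main(String[] args) {
        double secondsPerDiv = 1e-3;   // assumed user setting: 1 ms/div
        int pixelsPerDiv     = 50;     // assumed horizontal pixels per division
        double sampleRateHz  = 100e3;  // assumed fixed A/D sample rate

        double secondsPerPixel = secondsPerDiv / pixelsPerDiv;
        double samplesPerPixel = secondsPerPixel * sampleRateHz;

        // samplesPerPixel well above 1: combine samples per pixel column
        // samplesPerPixel well below 1: extra pixels per sample buy nothing
        System.out.printf("%.1f samples land on each pixel column%n", samplesPerPixel);
    }
}
```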

The sample times you mention are very slow for ordinary oscilloscopes. Some signals will be reasonably visible at those rates, but most things you encounter will not be.

I would probably figure out what the fastest sample rate is that you can support, then always sample at that rate. If the application indicates it does not need samples that fast, then you can merge multiple samples into one before sending over the network.

In that case you don't want to do traditional decimation, which seeks to eliminate frequencies that alias. Instead, for each data point send the min and max A/D samples covered by that data point. Each data point should then be shown to vertically cover that min/max range. If the user selects a slow sample rate and a faster signal is being sampled, but it is still within the capability of the A/D and the underlying fast sample rate, then the display will be a horizontal bar whose vertical width shows the signal peaks. That is a much better display than something that aliases.
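A minimal sketch of that min/max reduction is below. In practice this would run in C on the PIC32 before the data goes over the network; it is shown in Java only to keep the examples in one language, and the names are assumptions rather than part of your protocol:

```java
import java.util.ArrayList;
import java.util.List;

// Each transmitted data point carries the minimum and maximum of the raw
// samples it covers, so fast activity shows up as vertical extent on the
// display instead of aliasing.
public final class MinMaxBucket {

    /** One output point: the min and max of all raw samples in its bucket. */
    public static final class Point {
        public final int min, max;
        Point(int min, int max) { this.min = min; this.max = max; }
    }

    /** Reduce raw A/D samples to one Point per 'samplesPerPoint' samples. */
    public static List<Point> reduce(int[] raw, int samplesPerPoint) {
        List<Point> out = new ArrayList<>();
        for (int i = 0; i < raw.length; i += samplesPerPoint) {
            int min = Integer.MAX_VALUE;
            int max = Integer.MIN_VALUE;
            int end = Math.min(i + samplesPerPoint, raw.length);
            for (int j = i; j < end; j++) {
                min = Math.min(min, raw[j]);
                max = Math.max(max, raw[j]);
            }
            out.add(new Point(min, max));
        }
        return out;
    }
}
```

On the display side, each point would then be drawn as a vertical line from its min to its max within its pixel column.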