I'm trying to design a precision spectrophotometer that uses only three wavelengths of light. The high-level design uses three laser diodes, one per wavelength. The light passes through the sample and is detected by three photodiodes.
I am using the AD9833 DDS chip to generate sine or square waves as needed. These waveforms are then fed to current-drive circuitry that forces a proportional current through each laser diode.
I'm stuck on which method to use for the detection circuitry (photodiodes), since the signal I'm trying to resolve is at the ppm/sub-ppm level. I have thought of two schemes:
1) Generate a sinusoidal light pattern at some frequency (say 1 kHz). If I use an integrated package like the TSL257 (light-to-voltage converter), I should be able to read out the voltage with a precision ADC and then demodulate the received signal at the transmit frequency. This synchronous modulation/demodulation should give me good accuracy/resolution.
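To make scheme 1 concrete, here is a minimal sketch of the digital lock-in (synchronous demodulation) I have in mind. All the numbers are assumptions for illustration: a 100 kHz ADC sample rate, 1 kHz modulation, 1 s of data, and a 10 ppm signal buried in broadband noise ten times larger.

```python
import numpy as np

def lock_in(signal, fs, f_ref):
    """Digital lock-in: multiply by quadrature references and average.

    Works best when the record spans an integer number of reference
    periods, so the references are orthogonal to DC and to each other.
    """
    n = np.arange(len(signal))
    ref_i = np.sin(2 * np.pi * f_ref * n / fs)
    ref_q = np.cos(2 * np.pi * f_ref * n / fs)
    i = np.mean(signal * ref_i)
    q = np.mean(signal * ref_q)
    # Factor of 2 recovers the original peak amplitude from the averages.
    return 2 * np.sqrt(i**2 + q**2)

# Hypothetical parameters (not from any datasheet):
fs, f_ref, n_samples = 100_000, 1_000, 100_000   # 1 s of data
amp = 10e-6                                       # 10 ppm of full scale
rng = np.random.default_rng(0)
t = np.arange(n_samples) / fs
sig = amp * np.sin(2 * np.pi * f_ref * t) + 1e-4 * rng.standard_normal(n_samples)

print(lock_in(sig, fs, f_ref))   # close to 1e-5 despite the larger noise
```

The point of the sketch is the averaging gain: with N samples, the noise on the demodulated output shrinks roughly as 1/sqrt(N), which is why the 10 ppm tone is recoverable here even though the broadband noise is 10x bigger.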
2) Generate a square-wave light pattern (0 to high) and use a package like the TSL237 (light-to-frequency converter), then count the output frequency to get the accuracy/resolution. With a base frequency of 1 MHz, a 1 ppm change in signal would cause a frequency shift of 1 Hz, while the noise floor is around 0.1 Hz. This made me think this method could also work.
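For scheme 2, a quick sanity check on the gate time, assuming a simple gated counter (a reciprocal counter would do better at short gates). The 1 MHz base frequency is the figure from my description above, not a guaranteed TSL237 operating point.

```python
def ppm_shift(f0_hz: float, ppm: float) -> float:
    """Frequency shift produced by a `ppm` fractional change of f0."""
    return f0_hz * ppm * 1e-6

def gate_time_for_resolution(delta_f_hz: float) -> float:
    """Gate time (s) needed so the +/-1-count quantization error of a
    simple gated counter is below delta_f (resolution = 1/T)."""
    return 1.0 / delta_f_hz

f0 = 1e6  # assumed base output frequency at the operating light level

print(ppm_shift(f0, 1.0))               # 1 ppm -> 1.0 Hz shift
print(gate_time_for_resolution(1.0))    # 1 s gate to resolve 1 Hz
print(gate_time_for_resolution(0.1))    # 10 s gate to reach the 0.1 Hz floor
```

The takeaway is that a plain counter trades resolution for measurement time: resolving the 0.1 Hz floor directly costs a 10 s gate per reading, which is part of what I'm weighing against scheme 1.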
Which one is a better way to sense ppm-level signals?