Bit error rate calculations for a photodiode receiver driven from a laser

Tags: communication, data, photodiode

I'm having problems understanding how the following specification for a photo-diode (with built-in amplifier) rationalizes the bit error rate: –

[datasheet specification table (image)]

Circled in red are two important figures. One is NEP (noise equivalent power) and the other is minimum receivable sensitivity. The way I figure it works is like this but please correct me if I'm wrong: –

NEP is typically 310 nW, and note 3 says this is a single-ended measurement, so I assume that for the differential outputs of the device the noise power doubles to 620 nW. From this I conclude that the output noise voltage is sqrt(50 Ω × power) = 5.6 mV rms.
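That arithmetic, spelled out (assuming, as I have above, that NEP can be treated as electrical power dissipated in a 50 Ω load), looks like this:

```python
import math

nep_single = 310e-9        # W, typical NEP from the datasheet (note 3: single-ended)
nep_diff = 2 * nep_single  # my assumption: noise power doubles for the diff outputs

# Treating NEP as electrical power into a 50 ohm load: V = sqrt(P * R)
v_noise_rms = math.sqrt(50 * nep_diff)
print(f"{v_noise_rms * 1e3:.2f} mV rms")  # -> 5.57 mV rms
```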

Next is the minimum receivable sensitivity of -25.5dBm. Laser jargon talks about a 10dB extinction ratio which means that the logic levels transferred via light are 10dB apart so I naturally assume that -25.5dBm is for logic 0 (dimmer) and -15.5dBm is for logic 1 (brighter). These, in mW terms are simply 0.0028mW and 0.028mW.
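The dBm-to-mW conversions behind those two figures are just:

```python
def dbm_to_mw(dbm):
    """Convert a power level in dBm to milliwatts."""
    return 10 ** (dbm / 10)

p_logic0 = dbm_to_mw(-25.5)  # dimmer level (minimum receivable sensitivity)
p_logic1 = dbm_to_mw(-15.5)  # brighter level, 10 dB extinction ratio above it
print(f"{p_logic0:.4f} mW, {p_logic1:.4f} mW")  # -> 0.0028 mW, 0.0282 mW
```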

Now, if I look at photo sensitivity, this tells me how to convert these powers to voltages. The figure quoted is single-ended, so the two light powers convert to 4.2 mV and 42 mV respectively. Because the output is differential, these double up to give a peak-to-peak output of 76 mVp-p.
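Putting numbers on that step (the 1.5 V/mW sensitivity is the value implied by my 4.2 mV and 42 mV results; treat it as my reading of the datasheet rather than a quoted figure):

```python
sensitivity = 1.5                   # V/mW, single-ended photo sensitivity (my reading)
p_logic0, p_logic1 = 0.0028, 0.028  # mW, from the dBm conversions

v0 = sensitivity * p_logic0 * 1e3   # 4.2 mV single-ended
v1 = sensitivity * p_logic1 * 1e3   # 42 mV single-ended
vpp_diff = 2 * (v1 - v0)            # differential output doubles the swing
print(f"{vpp_diff:.1f} mV p-p")     # -> 75.6 mV p-p
```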

So now, I have rms noise (5.6mV) and p-p signal amplitude (76mV)

Here is where I may also be going wrong. I can convert the noise to a p-p value by assuming it is Gaussian: if, for instance, I use 6.6 sigma (99.9% "coverage"), the 5.6 mV rms becomes 37 mVp-p. This is approximately 50% of the signal p-p amplitude and, as far as I can tell, will be high enough in amplitude to produce bit errors.

Any smaller than 50% p-p and it isn't quite enough to do bit-damage. I used 6.6 sigma, which means the noise is below 37 mVp-p for 99.9% of the time. On the occasions it is greater than 37 mVp-p, 50% of the time the amplitude of the symbol is enhanced and 50% of the time the symbol is destroyed. I therefore conclude that the BER is 1 error in every 2000 bits.
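My error-rate reasoning, written out as a calculation (this just reproduces the rough estimate above, not a proper Gaussian tail integral):

```python
coverage = 0.999             # noise stays inside +/-3.3 sigma (6.6 sigma p-p) 99.9% of the time
p_exceed = 1 - coverage      # fraction of bits where noise exceeds half the signal swing
ber_estimate = p_exceed / 2  # half of those excursions reinforce the symbol, half destroy it
print(f"{ber_estimate:.4f}") # -> 0.0005, i.e. 1 error in every 2000 bits
```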

However the spec (in the picture) says BER = 10^-10 and this is totally at odds with my calculation.

My calculations suggest a bit error rate of 1 in every 2000 bits, while the spec suggests 1 in every ten billion.

Where am I going wrong?

Best Answer

The one thing that jumps out as odd is that photo sensitivity is specified in V/mW, which tells me that you can't just use P = V²/R to get from noise power to noise voltage.

Since it also talks about "noise equivalent power", that tells me they are referring the output noise voltage back to an equivalent optical power, presumably to cater for different load impedances, or maybe because that's simply the most convenient form of specification for comparing different optical sensors. Either way, it suggests you should apply the "photo sensitivity" figure to the noise power to generate your noise voltage.

Which makes the rms noise 1.5 V/mW × 310 nW ≈ 465 µV.

Does the BER make more sense if you do that?
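A quick sanity check along those lines (a sketch only, assuming the question's 76 mVp-p differential swing, a decision threshold midway between the two levels, equal Gaussian noise on both levels, and, conservatively, doubling the 465 µV single-ended noise voltage for the differential output):

```python
v_noise_rms = 2 * 465e-6  # V rms: 465 uV single-ended, doubled for the diff output
vpp_signal = 76e-3        # V p-p differential swing from the question

# Q-factor: distance from either logic level to the mid-rail threshold, in noise sigmas.
q = (vpp_signal / 2) / v_noise_rms
# BER = 0.5 * erfc(Q / sqrt(2)); a Q of about 6.4 already corresponds to 1e-10.
print(f"Q = {q:.1f}")     # -> Q = 40.9, i.e. a BER far below the 1e-10 spec
```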