Please correct me where I am wrong; I would really appreciate an intuitive explanation for the following fundamental questions. They may appear trivial, but as I am not from an electrical engineering background, these concepts are quite baffling to me.
- It is preferred to have a low BER. Does the theoretical BER serve as a lower or an upper bound? If the simulated/empirical bit error rate for an uncoded AWGN channel (no pulse shaping, no spreading) after blind equalization is lower than the theoretical BER, what can be inferred? Basically, I am comparing equalization algorithms in the literature and have encountered several variants of the Constant Modulus Algorithm. Each paper claims to perform better than the others, even though the improvement is negligible compared with the complexity of the modifications made to the algorithm. So this question stems from a doubt: when can we say that an equalizer's performance is good? When the BER reaches the analytical BER, or only if it is even lower than the analytical BER?
- What should be the ideal trend of BER at low and high signal-to-noise ratio (SNR)? I have found that the performance of equalizers is good at low SNR and worsens at higher SNR. In general, we seek to develop techniques that can do channel estimation or system identification at low SNR. But such a technique must also perform well at high SNR, where the effect of measurement noise is negligible. What is the trade-off?
Thank you, and please let me know if any modifications or improvements are needed to make my question clearer.
Best Answer
A lower BER would seem to be better; after all, who wants corrupt data?
However, it's like the bankers' (apparent) mantra that 'if you are not cheating, you're not trying hard enough'. If, in a working comms system, your BER is much lower than your error correction can handle, you are operating the system with too much power or too few users per cell, in other words with an inefficient use of resources. The power control algorithm of most modern comms systems reduces the output power (in the mobile to save battery life, in the base station to reduce interference to other users) when the receiver reports back an unnecessarily good BER.
Error correction/detection systems allow you to correct small numbers of errors, and to detect and resend blocks with large numbers of errors. This lets the system achieve effectively zero error when required (file transfer) and a tolerable error rate when zero is not needed (speech, video).
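To make the 'correct a few errors, detect bigger problems' idea concrete, here is a minimal sketch (not tied to any particular standard; the message bits are arbitrary) of a Hamming(7,4) code correcting a single flipped bit per block:

```python
import numpy as np

# Hamming(7,4) in systematic form: corrects any single bit error per 7-bit block.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])   # generator matrix [I | P]
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])   # parity-check matrix [P^T | I]

msg = np.array([1, 0, 1, 1])            # arbitrary 4-bit message
codeword = msg @ G % 2

received = codeword.copy()
received[2] ^= 1                        # simulate one channel bit error

syndrome = H @ received % 2
if syndrome.any():
    # A nonzero syndrome matches the column of H at the error position.
    err_pos = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
    received[err_pos] ^= 1              # correct the single error

assert np.array_equal(received, codeword)
```

Two simultaneous errors would produce a syndrome pointing at the wrong bit, which is exactly why larger error bursts are handled by detection and retransmission rather than correction.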
Now to your actual question.
What if your measured BER is better than your theoretical BER? If 'theoretical' means 'best possible at this SNR', then someone is not comparing like with like: they have modelled the best case wrongly, or tested a subtly different system. If it means 'target level for economic operation of this system', then the measured BER could easily be better.
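A quick way to sanity-check this is to simulate the uncoded AWGN case and compare against the analytical curve. Here is a minimal sketch (assuming BPSK with unit symbol energy, since the question does not specify a modulation): the simulated BER should converge to the theoretical Pb = Q(sqrt(2*Eb/N0)) from above, never fall systematically below it.

```python
import numpy as np
from scipy.special import erfc

# Monte Carlo BER of uncoded BPSK over AWGN vs. the analytical curve.
rng = np.random.default_rng(0)
n_bits = 1_000_000

for ebn0_db in range(0, 11, 2):
    ebn0 = 10 ** (ebn0_db / 10)
    bits = rng.integers(0, 2, n_bits)
    symbols = 2 * bits - 1                        # BPSK mapping: 0 -> -1, 1 -> +1
    noise = rng.normal(0.0, np.sqrt(1 / (2 * ebn0)), n_bits)
    decisions = (symbols + noise) > 0             # hard decision at the receiver
    ber_sim = np.mean(decisions != bits)
    ber_theory = 0.5 * erfc(np.sqrt(ebn0))        # Q(sqrt(2*Eb/N0))
    print(f"Eb/N0 = {ebn0_db:2d} dB: simulated {ber_sim:.2e}, theory {ber_theory:.2e}")
```

If a paper's equalized BER lands below this curve at the same Eb/N0, the likeliest explanations are a mismatched noise calibration, a different effective SNR definition, or too few bits simulated at the high-SNR points.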
Low SNR is where systems have a BER problem, and where the equalizer earns its keep. As the system is not striving for zero BER, a worse BER due to the equalizer may be acceptable at high SNR, as long as it still meets the system's acceptable BER level.
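For reference, since the question mentions the Constant Modulus Algorithm: the baseline CMA is only a few lines, which is why the published variants should be judged on their BER curves rather than on the intricacy of their modifications. Here is a minimal sketch (assuming the standard Godard form; tap count and step size are arbitrary illustrative values, and R2 = 1 assumes a unit-modulus constellation such as QPSK):

```python
import numpy as np

def cma_equalize(x, n_taps=11, mu=1e-3, r2=1.0):
    """Blind CMA equalization of received complex samples x (1-D array)."""
    w = np.zeros(n_taps, dtype=complex)
    w[n_taps // 2] = 1.0                      # centre-spike initialization
    y = np.zeros_like(x)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]     # regressor, most recent sample first
        y[n] = w.conj() @ u                   # equalizer output y = w^H u
        e = y[n] * (np.abs(y[n]) ** 2 - r2)   # gradient of the CM cost (|y|^2 - R2)^2
        w -= mu * np.conj(e) * u              # stochastic gradient step
    return y, w
```

Fed with a QPSK stream passed through a mild multipath channel, the output constellation's modulus should tighten around 1 as the taps converge; at low SNR the noise dominates the constant-modulus error term, which is one intuitive reason blind equalizers behave differently across the SNR range.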