There isn't a specific answer to your question, because even among cell phones and Wi-Fi devices, there are different notions of "signal strength". However, I'll describe two methods that are broadly used:
The first is to measure the power of the signal received by the antenna. These numbers are usually reported in dBm. Since this is coming from a controlled-impedance antenna and transmission line, it's sufficient to know just the RMS voltage or current. For example, let's say we measure the RMS voltage to be 2 mV, and we have a 50 Ω antenna system. Then:
$$ \begin{align}
P &= E_\mathrm{RMS}^2/R \\
&= (0.002\mathrm V)^2 / 50 \Omega \\
&= 8 \cdot 10^{-8} \mathrm W
\end{align}$$
Typically this is converted to decibels relative to 1mW, dBm:
$$ \begin{align}
L_\mathrm{dBm} &= 10 \log_{10} \bigg(\frac{P}{0.001\mathrm{W}}\bigg) \\
&= 10 \log_{10} \bigg(\frac{8 \cdot 10^{-8} \mathrm W}{0.001\mathrm{W}}\bigg) \\
&= 10 \log_{10} (8 \cdot 10^{-5}) \\
&= -40.97
\end{align} $$
Therefore, our signal strength is about −41 dBm.
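The conversion above (RMS voltage across a known impedance to dBm) can be sketched in a few lines; the function name is just for illustration:

```python
import math

def power_dbm(v_rms, impedance=50.0):
    """Convert an RMS voltage across a known impedance to power in dBm."""
    p_watts = v_rms ** 2 / impedance         # P = E_rms^2 / R
    return 10 * math.log10(p_watts / 0.001)  # decibels relative to 1 mW

print(round(power_dbm(0.002), 2))  # 2 mV into 50 ohms -> -40.97
```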
The trouble with this method is that it's actually not measuring the signal, but all electromagnetic energy received by the antenna. That is, it also includes noise. The signal might be "strong", but the noise might also be strong, so the signal quality is poor.
Another way to gauge "signal strength", which works only for digital modes, is the bit error rate, or BER: the fraction of bits that were incorrectly received.
It can be calculated a few ways. One is to have the transmitter send a test pattern of bits that the receiver already knows. The receiver then compares what it received against the test pattern and counts the bits that were incorrect. This number, divided by the total number of bits in the test pattern, yields the fraction of bit errors. Multiply by 100 to make a percentage, if desired.
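The test-pattern comparison is simple to sketch; the bit strings here are made up for illustration:

```python
def bit_error_rate(sent, received):
    """Fraction of bits that differ between the known test pattern and what was received."""
    assert len(sent) == len(received)
    errors = sum(a != b for a, b in zip(sent, received))
    return errors / len(sent)

sent     = "10110010"
received = "10100011"  # 2 of 8 bits flipped in transit
print(bit_error_rate(sent, received))  # -> 0.25, i.e. 25%
```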
It's also possible to calculate BER if the protocol uses forward error correction. The exact algorithm will of course depend on the FEC being used, but usually it's possible to know how many bits are being corrected by the FEC, and thus calculate the BER.
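As a concrete (toy) instance of the FEC approach, here is a sketch using a simple 3× repetition code: the decoder majority-votes each group, and the minority bits it had to flip are counted as channel errors. Real FEC schemes are far more sophisticated, but the counting idea is the same. Note this only sees errors the code could correct:

```python
from collections import Counter

def ber_from_repetition_fec(received_bits, n=3):
    """Estimate BER from an n-times repetition code: each group of n bits is
    majority-decoded, and the minority bits are counted as channel errors."""
    corrected = 0
    total = 0
    for i in range(0, len(received_bits), n):
        group = received_bits[i:i + n]
        majority, count = Counter(group).most_common(1)[0]
        corrected += len(group) - count  # bits the decoder had to flip
        total += len(group)
    return corrected / total

# Three codewords "111", "000", "101"; the third has one flipped bit.
print(ber_from_repetition_fec("111000101"))  # -> 1/9, about 0.111
```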
The advantage of BER is that it gives an indication of how well the communication is actually working, taking into account noise, synchronization errors, etc. It is usually the case that increasing transmit power will increase the signal-to-noise ratio and reduce the BER.
Best Answer
No, your understanding is not correct.
First, just "signal strength" by itself is meaningless. Signal strength where? If you mean at some distant receiver, then yes, frequency is one factor in how strongly a station is received for a given distance and transmitter power. However, there are many such factors, and the relationship with frequency is not monotonic. The difference between 93 MHz and 94 MHz will be irrelevant in a practical sense.
Long wavelengths, like those used by commercial AM (around 1 MHz), are long enough that they refract around the earth to some extent. This doesn't really happen at commercial FM frequencies (around 100 MHz). Different wavelengths also get absorbed, passed, or bounced off of layers in the atmosphere. There is much more to this than lower frequencies magically having more "signal strength", whatever that actually means.