It is commonly said that a higher channel frequency implies a higher data rate. For example, in https://www.howtogeek.com/222249/whats-the-difference-between-2.4-ghz-and-5-ghz-wi-fi-and-which-should-you-use/ they say that a 5 GHz Wi-Fi connection is simply faster than a 2.4 GHz one. However, the Shannon–Hartley theorem states that the maximum channel capacity in bits/s depends only on the channel bandwidth and the SNR. Therefore, the data rate shouldn't really depend on the center frequency of the channel, only on the available bandwidth. Is it a misconception that higher frequency channels support a higher bit rate, or is it true for some reason?
Electrical – Does data rate increase with channel frequency
communication, data, frequency, RF, wifi
Related Solutions
I doubt if I can cover all your questions, but I'll give it a try:
Well, what if I'm using a fixed-frequency signal? Fupper and Flower would be the same value, right? So does that mean B=0? So a fixed frequency signal can't carry any data? So what am I missing?
A single-frequency signal would be a continuous tone. Its amplitude would never change; it would just continue on repetitively forever. As such, it would not convey any information.
When you start modulating your carrier, the spectrum of your signal is no longer a single frequency. According to the amplitude modulation formula, the spectrum of the modulated signal is the convolution of the carrier (a single frequency) and the modulating signal (typically, containing energy in some band about 0 Hz).
Therefore the modulated output signal contains energy in a band around the carrier, not just at the single (carrier) frequency.
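A quick numerical sketch of this (the 100 Hz carrier and 5 Hz message are hypothetical values chosen only for illustration): the FFT of an AM waveform shows energy at the carrier and at two sidebands, not at a single frequency.

```python
import numpy as np

# Sketch: spectrum of an AM signal (illustrative 100 Hz carrier, 5 Hz message).
fs = 1000                        # sample rate, Hz
t = np.arange(0, 2, 1 / fs)     # 2 s of signal -> 0.5 Hz bin spacing
carrier = np.cos(2 * np.pi * 100 * t)
message = np.cos(2 * np.pi * 5 * t)
am = (1 + 0.5 * message) * carrier   # classic AM, modulation index 0.5

spectrum = np.abs(np.fft.rfft(am))
freqs = np.fft.rfftfreq(len(am), 1 / fs)
peaks = freqs[spectrum > 0.1 * spectrum.max()]
print(peaks)   # energy at 95, 100 and 105 Hz: carrier plus the two sidebands
```

The convolution described above shows up directly: the 5 Hz message is mirrored on each side of the 100 Hz carrier, so the modulated signal occupies a 10 Hz band, not a single frequency.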
We know that's not true, AM radio does it.
Each AM station delivers energy not just at the carrier frequency, but in a band around that frequency. An AM radio broadcast is not an example of a single-frequency signal.
It's plainly obvious that I can cram way more bits into 2.4*10^9 cycles/second than I can with just 1/sec.
Certainly you could. However, if you simply modulated your 2.4 GHz carrier with an information signal spanning 2.4 GHz, the bandwidth of the resulting signal would be nearly 2.4 GHz. The energy in the signal would be spread from 1.2 to 3.6 GHz.
There is a way to get around this though...
What about fractional differences? Waveforms are analog in nature, so we could have a 1Hz signal and a 1.5Hz signal. Likewise at the high frequency range. Say 2.4GHz minus 0.5Hz. There is an infinite amount of space between 1 and 1.5. Could not 1Hz and 1.001Hz serve as two separate channels?
They can, but only by trading off the SNR term in the Shannon–Hartley formula for the bandwidth term. That is, the formula shows there are two ways to increase the capacity of the channel: increase the bandwidth or increase the signal-to-noise ratio.
So if you had an infinitely high signal to noise ratio, you could use 0.001 Hz of bandwidth to carry as much information as you like.
But in practice, the log function around the SNR means that there are diminishing returns for increasing SNR. Beyond a certain point, large increases in SNR provide little improvement in channel capacity.
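A minimal sketch of that diminishing-returns behavior, using an illustrative 1 MHz channel:

```python
import math

# Sketch: Shannon-Hartley capacity C = B * log2(1 + SNR) for a fixed
# 1 MHz channel (illustrative), with SNR as a linear ratio, not dB.
B = 1e6  # bandwidth in Hz
for snr in (1, 10, 100, 1000, 10000):
    c = B * math.log2(1 + snr)
    print(f"SNR {snr:>6}: capacity {c / 1e6:.2f} Mbit/s")
```

Each tenfold increase in SNR adds only about B·log2(10) ≈ 3.3 Mbit/s here; capacity grows linearly with bandwidth but only logarithmically with SNR.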
Two typical ways this is used:
In multilevel AM coding, instead of just sending the carrier or not sending it in a bit interval, you might have 4 different amplitude levels that can be sent. This allows two bits of information to be encoded in each bit interval and increases the bits per Hz by a factor of two. But it requires a higher SNR to be able to consistently distinguish between the different levels.
In FM radio broadcasting, the broadcast signal bandwidth is broader than the audio signal being carried. This allows the signal to be received accurately even in low SNR conditions.
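The multilevel-AM point can be sketched like this (a hypothetical 4-level scheme with evenly spaced levels and Gaussian noise; all numbers are illustrative):

```python
import numpy as np

# Sketch: 4-level amplitude coding (2 bits per symbol) and why it needs
# more SNR: the decision thresholds sit closer together than in the
# 2-level case, so the same noise causes more symbol errors.
rng = np.random.default_rng(0)
levels = np.array([0.0, 1.0, 2.0, 3.0])   # hypothetical amplitude levels
symbols = rng.integers(0, 4, 10000)       # each symbol carries 2 bits
tx = levels[symbols]

noise_sigma = 0.2                          # illustrative noise level
rx = tx + rng.normal(0, noise_sigma, tx.size)
decoded = np.clip(np.round(rx), 0, 3).astype(int)
error_rate = float(np.mean(decoded != symbols))
print(f"4-level symbol error rate: {error_rate:.4f}")
```

With the decision thresholds only 0.5 apart, increasing `noise_sigma` quickly raises the error rate; doubling to 8 levels would carry 3 bits per symbol but halve the threshold spacing again, demanding still higher SNR.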
Could not 1Hz and 1.001Hz serve as two separate channels? In terms of practicality I realize this would be difficult, nearly impossible to measure this difference with modern electronics
In fact it's quite easy to distinguish 1 Hz from 1.001 Hz with modern electronics. You simply need to measure the signal for a few thousand seconds and count the number of cycles.
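A rough simulation of that cycle-counting idea (coarse 10 Hz sampling and a 2000 s window, purely for illustration):

```python
import numpy as np

# Sketch: telling 1 Hz from 1.001 Hz by counting completed cycles over
# a few thousand seconds, via positive-going zero crossings.
fs = 10.0                       # samples per second (coarse, illustrative)
t = np.arange(0, 2000, 1 / fs)  # 2000 s observation window
counts = {}
for f in (1.0, 1.001):
    sig = np.sin(2 * np.pi * f * t)
    # a positive-going zero crossing marks one completed cycle
    counts[f] = int(np.sum((sig[:-1] < 0) & (sig[1:] >= 0)))
    print(f"{f} Hz: {counts[f]} cycles counted in 2000 s")
```

Over 2000 s the 1.001 Hz tone completes two extra cycles, so the two frequencies are easily distinguished; the price is the long observation time, which is exactly the bandwidth-versus-time trade-off.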
So in that sense, shouldn't there be an infinite amount of bandwidth between two frequencies?
No. Between 1.00 Hz and 1.01 Hz there is exactly 0.01 Hz of bandwidth. It doesn't need to be counted in whole numbers of Hertz, but there's only as much bandwidth between two frequencies as the difference between those frequencies.
Edit
From what you're saying, the B in the Shannon equation has nothing to do with carrier frequency? This is modulation bandwidth only?
Essentially yes. B is the bandwidth, or the range of frequencies over which the signal spectrum has energy.
You could use a 1 MHz band around 10 MHz, or a 1 MHz band around 30 GHz, and the channel capacity would be the same (given the same SNR).
However, in the simplest cases, like double-sideband AM, the carrier sits in the middle of the signal band. So if you have a 1 kHz carrier with double-sideband AM, you can only hope to use the band from 0 to 2 kHz.
Single-sideband obviously doesn't follow this rule.
An information signal spanning 2.4GHz, what does this mean?
I mean that the spectrum contains energy over a 2.4 GHz band.
If you had a narrow band filter and an RF power detector, you could detect energy in the signal at any frequency within the band.
are you taking about the carrier wave now?
No. The carrier is a single frequency. The complete signal contains energy over a band of frequencies around the carrier. (Again, single-sideband pushes all the signal to one side of the carrier; also, suppressed-carrier AM eliminates most of the energy at the carrier frequency)
As N->0, C will approach infinity. So in theory an infinite amount of data can be encoded into a single wave?
In principle, yes, by (for example) varying the amplitude in infinitely small steps and infinitely slowly.
In practice, the SNR term has that log function around it, so there are diminishing returns for increasing SNR, and also there are fundamental physical reasons that the noise never goes to 0.
You can google that exact question to find several variations of this answer:
A noiseless channel can carry an arbitrarily large amount of information, no matter how often it is sampled.
Just send a lot of data per sample.
For a 4 kHz channel, take 1000 samples/sec. If each sample is 16 bits, the channel can send 16 kbps.
If each sample is 1024 bits, the channel can send 1000 samples/sec × 1024 bits ≈ 1.02 Mbps.
The key word here is “noiseless”. With a normal 4 kHz channel, the Shannon limit would not allow this.
For the 4 kHz channel we can take 8000 samples/sec (the Nyquist rate). In that case, if each sample is 1024 bits, the channel can send about 8.2 Mbps.
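The arithmetic above is just samples per second times bits per sample; a trivial sketch:

```python
# Sketch of the noiseless-channel arithmetic: rate = samples/s * bits/sample.
def throughput_bps(samples_per_sec: int, bits_per_sample: int) -> int:
    return samples_per_sec * bits_per_sample

print(throughput_bps(1000, 16))     # 16 000 bit/s  = 16 kbps
print(throughput_bps(1000, 1024))   # 1 024 000 bit/s ≈ 1.02 Mbps
print(throughput_bps(8000, 1024))   # 8 192 000 bit/s ≈ 8.2 Mbps
```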
Best Answer
Is it a misconception that higher frequency channels support a higher bit rate or is it true for some reason?
It is not the (carrier) frequency itself that determines the supported bit rate but the available bandwidth of the channel.
Suppose I have 10 MHz of bandwidth available at 100 MHz, say from 100 MHz to 110 MHz,
or
I have 10 MHz of bandwidth available at 1000 MHz, say from 1000 MHz to 1010 MHz.
Then the highest bit rate I can achieve will be the same, as there is 10 MHz available in both cases.
However, note how at 100 MHz that channel bandwidth is 10% of the carrier frequency, but at 1000 MHz it is only 1%.
At 1000 MHz I could fit 10 of those 10 MHz channels to come to the same 10% (or use a single wider channel of 100 MHz).
If we want to divide a certain frequency band between service providers, that's much easier to do at higher frequencies. It can also be done at lower frequencies, but that would result in narrow channels (small bandwidth) and therefore lower bit rates.
To put it more simply: there's more "space" at higher frequencies so it "costs less space" to implement higher bit rates.
Suppose I go to 10 GHz, that will give me another factor 10 more space.
The 2.4 GHz Wi-Fi vs 5 GHz Wi-Fi comparison isn't completely fair, because the 2.4 GHz band is only about 100 MHz wide (that's all channels together) while the 5 GHz band has roughly 900 MHz available (depending on your country it might not be one continuous 900 MHz range though). So there's simply far more space (bandwidth) assigned to the 5 GHz standard.
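Putting rough numbers on that (approximate band widths from above; exact allocations vary by country and regulator):

```python
# Sketch: rough spectrum available to Wi-Fi in each band and how many
# classic 20 MHz channels fit. Figures are approximate, for illustration.
band_2g4_mhz = 100   # ~100 MHz of spectrum around 2.4 GHz
band_5g_mhz = 900    # ~900 MHz of spectrum around 5 GHz
channel_mhz = 20     # classic 20 MHz Wi-Fi channel width

for name, total in (("2.4 GHz", band_2g4_mhz), ("5 GHz", band_5g_mhz)):
    print(f"{name}: {total} MHz total -> {total // channel_mhz} "
          f"non-overlapping {channel_mhz} MHz channels")
```

The carrier frequency itself doesn't appear anywhere in the calculation; only the width of the allocated band matters, which is the whole point of the answer above.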