I was reading some articles about digital and analog modulation processes. One of them was this:
At a certain point it says:
The bandwidth produced is a function of the highest modulating frequency including harmonics and the modulation index, which is:
m = Δf × T
Δf is the frequency deviation or shift between the mark and space frequencies, or:
Δf = fs – fm
T is the bit time interval of the data, i.e. the reciprocal of the data rate (T = 1 / data rate in bit/s).
I'm a bit confused by these terms. How exactly does the bandwidth relate to the data rate? Let's take this image as an example:
From what I've understood, if I want to modulate a digital signal with FSK, I just choose two different frequencies to represent 0 and 1. I should also choose two frequencies that are harmonics of a common fundamental so I get "smooth" transitions (zero crossings) at the bit boundaries, is that correct?
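To check my understanding of that zero-crossing point, I wrote a small sketch (the tone frequencies and bit rate are my own made-up numbers, not from the article): if both tones are integer multiples of the bit rate 1/T, each bit's tone completes a whole number of cycles, so concatenating the tones gives no phase jump at the bit boundaries.

```python
# Hypothetical BFSK sketch; all numbers are my own choices for illustration.
import numpy as np

bit_rate = 1000                 # bits per second
T = 1 / bit_rate                # bit time interval
f_mark, f_space = 2000, 4000    # both integer multiples of the bit rate
fs_audio = 100_000              # sample rate for the sketch

bits = [1, 0, 1, 1, 0]
t = np.arange(0, T, 1 / fs_audio)   # one bit interval of time samples
signal = np.concatenate(
    [np.sin(2 * np.pi * (f_space if b else f_mark) * t) for b in bits]
)

# Because f_mark*T and f_space*T are integers, every per-bit tone starts
# and ends at a zero crossing, so the concatenated waveform is continuous.
print(np.sin(2 * np.pi * f_mark * T))  # essentially 0: the tone ends at a zero crossing
```

Is that the right way to think about why harmonically related frequencies give smooth transitions?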
Besides that, how is my data rate related to this bandwidth? Isn't bandwidth just the width of a channel?
Let's say I'm transmitting at 2.4 GHz with a channel that goes from 2.4 GHz to 2.45 GHz; then I have a 50 MHz channel, correct?
Isn't my bandwidth in this case 50 MHz? If so, shouldn't my bandwidth be expressed simply by Δf? In other words, since I only need two different frequencies, my channel width would just be the difference between those frequencies. With that said, I can't see where the data rate comes in. I'd think the only thing influencing that rate would be how long each bit is held.
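To make the article's formula concrete for myself, I tried plugging in some example numbers (I used the classic Bell 202 modem tones; the choice is mine, not the article's). If I apply the quoted definitions plus Carson's rule of thumb (BW ≈ 2 × (peak deviation + highest modulating frequency), where the fastest data pattern 1010... has a fundamental of bit_rate/2), the data rate shows up directly in the bandwidth:

```python
# Illustrative numbers: Bell 202 tones, my own choice for this example.
f_space, f_mark = 2200, 1200   # Hz
bit_rate = 1200                # bits per second
T = 1 / bit_rate               # bit time interval, as in the article

delta_f = f_space - f_mark     # Δf = fs - fm  -> 1000 Hz
m = delta_f * T                # modulation index from the article: m = Δf × T

# Carson's rule of thumb: BW ≈ 2 * (peak deviation + highest modulating freq).
# Peak deviation is Δf/2; the 1010... pattern's fundamental is bit_rate/2:
bw = 2 * (delta_f / 2 + bit_rate / 2)   # ≈ Δf + bit_rate

print(m)    # modulation index ≈ 0.83
print(bw)   # estimated occupied bandwidth in Hz
```

So if I've done this right, the occupied bandwidth is roughly Δf plus the bit rate, not Δf alone. Is that the correct way to read the formula?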
As I said, I'm confused by these concepts, so I'd appreciate it if someone could clarify them for me.