Electronics – Why is "throughput" called "bandwidth"?

Tags: bandwidth, communication, digital-communications

Back in school, I learned that the "bandwidth" of a communication channel is the frequency of the highest-frequency signal it can carry minus the frequency of the lowest-frequency signal it can carry.

Yet in practice, I see people use the word "bandwidth" when they are referring to the throughput of some digital communication link.

For example, SpeedTest.net boasts that it can test your "internet connection bandwidth", and NVIDIA graphics cards report "memory bandwidth" in GB/s:

I always found it confusing how the term "bandwidth" is used for something that is measured in bytes per second.
Are there any differences between throughput and bandwidth? Is this a correct use of the term, or is it a misuse that has gotten "stuck" over the years? Why/why not?

Best Answer

Bandwidth is, as you say, the difference between the upper and lower frequencies of a spectrum, usually measured at the 3 dB points, where the curve on the graph is 3 dB below its maximum value.
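As a quick illustration of that definition, here is a minimal sketch that finds the 3 dB bandwidth numerically. The band-pass response is made up for the example (a simple resonance; the centre frequency and Q are arbitrary):

```python
import numpy as np

# Hypothetical band-pass magnitude response: a resonance centred at f0
# with quality factor Q. Peak magnitude is 1 at f = f0.
f = np.linspace(1.0, 200.0, 20000)          # frequency axis, Hz
f0, Q = 100.0, 5.0
mag = 1.0 / np.sqrt(1.0 + Q**2 * (f / f0 - f0 / f)**2)

# The -3 dB points are where the magnitude falls to 1/sqrt(2) of the peak.
above = f[mag >= 1.0 / np.sqrt(2.0)]
f_low, f_high = above[0], above[-1]
bandwidth = f_high - f_low                   # ≈ f0 / Q = 20 Hz here
```

For this textbook resonance the half-power bandwidth works out to f0/Q, so the numerical answer lands on about 20 Hz.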

In digital communications, although the data itself is digital (comprising 1s and 0s), it is often not transmitted directly; instead, a carrier signal is modulated in response to the 1s and 0s. The signal actually transmitted is analogue, not digital, with its phase, amplitude, or frequency modulated according to the value of each data bit.

Because an analogue signal is being transmitted, the bandwidth concept applies: take the analogue time-based waveform and represent it in the frequency domain as a spectrum. The question then becomes "how much bandwidth does the digital communications channel occupy?"
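As a sketch of that idea, the toy example below (all parameters invented for illustration) modulates a stream of ±1 "data" symbols onto a 1 kHz carrier and uses an FFT to see where the resulting energy sits in the spectrum:

```python
import numpy as np

fs = 8_000                                   # sample rate, Hz (arbitrary)
t = np.arange(0, 1.0, 1 / fs)                # one second of signal
bits = np.resize([1, -1], 100)               # alternating "data" symbols
baseband = np.repeat(bits, fs // 100)        # 100 symbols per second
signal = baseband * np.cos(2 * np.pi * 1_000 * t)   # modulated waveform

# Frequency-domain view: the energy clusters around the 1 kHz carrier,
# spread out to either side by the modulating data.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, 1 / fs)
peak_freq = freqs[np.argmax(spectrum)]
```

The strongest spectral lines sit near the carrier frequency, not at the symbol rate — which is exactly why an occupied band of spectrum, with a lower and upper edge, is the natural description.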

Take the FM broadcast radio band, for example, spanning 88 to 108 MHz. Within that spectrum, multiple radio stations each operate on a different carrier frequency, a technique known as frequency-division multiplexing. If you transmit a modulated digital data stream, that stream occupies a band of space on the frequency spectrum, with a lower frequency and an upper frequency. You can transmit multiple digital channels of data, each centred on a different frequency, but you don't want the bands overlapping: that would cause interference and corrupt the data whenever both channels transmit at the same time. So when designing and implementing digital communication systems, you often need to know the bandwidth of each channel in order to prevent their spectra from overlapping.
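That non-overlap requirement can be sketched as a simple check. The carrier frequencies and channel width below are invented for illustration (loosely echoing FM broadcast's 200 kHz channel spacing):

```python
def channels_overlap(carriers_hz, channel_bw_hz):
    """True if any two channels' occupied bands overlap."""
    edges = sorted((fc - channel_bw_hz / 2, fc + channel_bw_hz / 2)
                   for fc in carriers_hz)
    # Compare each channel's lower edge with the previous channel's upper edge.
    return any(lo < prev_hi
               for (_, prev_hi), (lo, _) in zip(edges, edges[1:]))

# 200 kHz spacing with 180 kHz-wide channels: the bands stay separate.
ok = channels_overlap([88.1e6, 88.3e6, 88.5e6], 180e3)      # False
# Halve the spacing and the channels collide.
clash = channels_overlap([88.1e6, 88.2e6], 180e3)           # True
```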

Generally speaking, the higher the data rate, the wider the band of frequency space the communication uses: higher-speed data means greater bandwidth when modulated onto the transmission medium. The two are closely linked, and evidently some vendors say "bandwidth" when it's the data rate, or speed, they're actually quoting.

There is no simple universal relationship between data rate, expressed in bits (or kilobits, or megabits) per second, and the amount of spectrum used, expressed in hertz; it depends very much on the modulation scheme used.
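One common textbook rule of thumb makes the point concrete: for linearly modulated signals with raised-cosine pulse shaping, occupied bandwidth ≈ symbol rate × (1 + roll-off), where the symbol rate is the bit rate divided by the bits carried per symbol. The scheme names and roll-off factor below are illustrative assumptions, not figures from any particular standard:

```python
def occupied_bandwidth_hz(bit_rate_bps: float, bits_per_symbol: int,
                          rolloff: float = 0.35) -> float:
    """Rough occupied bandwidth for a linearly modulated, pulse-shaped
    signal: symbol_rate * (1 + rolloff)."""
    symbol_rate = bit_rate_bps / bits_per_symbol
    return symbol_rate * (1.0 + rolloff)

# The same 1 Mbit/s data rate occupies very different amounts of spectrum
# depending on how many bits each transmitted symbol carries:
bpsk  = occupied_bandwidth_hz(1e6, 1)   # 1 bit/symbol  -> ~1.35 MHz
qpsk  = occupied_bandwidth_hz(1e6, 2)   # 2 bits/symbol -> ~675 kHz
qam16 = occupied_bandwidth_hz(1e6, 4)   # 4 bits/symbol -> ~337.5 kHz
```

Denser constellations squeeze the same bit rate into less spectrum, at the cost of needing a better signal-to-noise ratio — which is why no single bits-per-hertz conversion factor exists.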

For data transmitted between components in a computer system, most of the time there is no modulation of a carrier by the data (Ethernet, Bluetooth, and Wi-Fi being exceptions). The transfer of data between a graphics card and a motherboard is a simple digital transmission: binary 1s and 0s sent down tracks on a circuit board (a data bus). There is no modulation of carriers as there is in radio communication, because none is needed, so "bandwidth" isn't strictly the right word to use here; "throughput" is.
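The GB/s figure a graphics card quotes is plain throughput arithmetic — bus width times transfer rate — with no hertz-style bandwidth involved. The numbers below are made up for illustration, not any specific card's specification:

```python
def memory_throughput_gbps(bus_width_bits: int,
                           transfers_per_second: float) -> float:
    """Peak memory throughput in gigabytes per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * transfers_per_second / 1e9

# e.g. a hypothetical 256-bit bus at an effective 14 GT/s:
peak = memory_throughput_gbps(256, 14e9)    # 448.0 GB/s
```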

The words are often used interchangeably these days.