A friend and I were discussing the possibility of wireless video cables, and we quickly realized that the bottleneck is the current data transfer rate of wireless technology. The average wireless router seems to output roughly 300 Mbps, while a standard HDMI cable carries roughly 10 Gbps. Do radio frequency interference, distance, and material obstruction (such as walls) really account for this huge difference in maximum data transfer rate? Or are there other hardware limiting factors?
Electronic – What limits wireless data transfer rates (compared to wired)
communication, data, wireless
Related Solutions
You've just described two separate and entirely valid technologies used in communication theory today: software-defined radio and (for lack of a good general term that I can remember) multi-symbol/level communication.
If we modulate the amplitude of a wave (I think by providing the oscillator different levels of current), can we not sample this wave with some sort of analog to digital converter and process it on the CPU?
Yes - to a degree. You've just described software-defined radio. The basic idea is what you said: dispense with the majority of the radio-frequency equipment and create the modulated sine wave directly from the output of a D/A converter; on the return path, use a similarly fast A/D converter, with plenty of DSP processing on both sides. The current problem is that although processor speeds are measured in gigahertz nowadays, the interface with the analog world hasn't yet reached those speeds. This means that direct waveform creation is limited to low frequencies (which, for communications, are still fearfully high compared to the frequencies 'normal' analog designers worry about). However, if I read my articles correctly, this still allows removal of some of the intermediate-frequency hardware present in most radios. In the future it may be possible to dispense with more of the hardware.
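As a rough sketch of the direct-synthesis idea (the carrier frequency, sample rate, and bit pattern below are made-up numbers for illustration, not a description of any real radio), the transmit side boils down to computing the sample stream that gets handed to the fast D/A converter:

    # Sketch: direct digital synthesis of an on-off-keyed carrier for a fast DAC.
    # All numbers here are illustrative assumptions.
    import numpy as np

    fs = 10e6                      # DAC sample rate: 10 MSa/s (assumed)
    fc = 1e6                       # carrier frequency: 1 MHz (assumed)
    bits = [1, 0, 1, 1, 0]         # toy bitstream
    samples_per_bit = 2000

    t = np.arange(len(bits) * samples_per_bit) / fs
    envelope = np.repeat(bits, samples_per_bit).astype(float)   # on-off keying envelope
    waveform = envelope * np.sin(2 * np.pi * fc * t)
    # 'waveform' is the sample stream a fast D/A would convert straight to RF;
    # the receive path digitizes with an A/D and undoes this in DSP.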
If this is possible, why stick to base 2? If we can have a unique value for each measurable amplitude, data transfer rates would skyrocket. Imagine transferring data with base 1024, or even higher. If we could accurately sample the wave (each oscillation), I don't see why the rate of transfer couldn't be equal to the frequency of the wave times the base divided by 2, in bits per second (this is probably not correct math).
You're right that it's not perfect, but you definitely have the basic idea down. To give an example, we'll stick with Amplitude Modulation. When you're trying to transmit a 0 or 1 using AM, it's called On-Off Keying (the link goes to a site with nice pictures and a description). This works by modulating with a purely digital signal - 5 V is '1', 0 V is '0'. You're also right that if you have a number of voltage levels you can send more data at once - this is called Amplitude Shift Keying (another nice description with a picture). As you can see, there are multiple voltage levels for the various combinations of bits - 2 bits give four different voltage levels, 3 bits give eight, etc.
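To make the bits-per-symbol bookkeeping concrete, here is a toy M-ary ASK mapper (the 0-5 V range and the even level spacing are illustrative assumptions, not any standard's values):

    # Toy M-ary amplitude-shift-keying mapper: k bits per symbol -> 2**k levels.
    def ask_levels(bits, bits_per_symbol=2, v_max=5.0):
        m = 2 ** bits_per_symbol                     # number of amplitude levels
        symbols = []
        for i in range(0, len(bits), bits_per_symbol):
            value = int("".join(str(b) for b in bits[i:i + bits_per_symbol]), 2)
            symbols.append(value * v_max / (m - 1))  # spread levels over 0..v_max
        return symbols

    # 2 bits/symbol -> 4 levels; 3 bits/symbol -> 8 levels, and so on.
    print(ask_levels([0, 1, 1, 1, 1, 0], bits_per_symbol=2))  # roughly [1.67, 5.0, 3.33] volts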
The problem with this and other similar schemes is not theoretical but practical - in a communication channel with noise, it's very likely you'll have trouble figuring out exactly what was sent. It's just like with analog signals: if my only valid voltage levels are 0 and 5 V, then if I read 4.3 V I can be reasonably sure it should be 5 V. If I have 1024 valid voltage levels packed into the same range, it gets a lot harder to decide.
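A quick numerical sketch of that effect (the 0-5 V range and the noise level are arbitrary assumptions): snap each noisy received voltage to the nearest valid level and count how often the decision is wrong as the number of levels grows.

    # More amplitude levels in the same 0-5 V range -> more decision errors
    # for the same noise. Noise amplitude is an arbitrary assumption.
    import random

    def error_rate(num_levels, noise_rms=0.4, trials=100_000):
        step = 5.0 / (num_levels - 1)
        errors = 0
        for _ in range(trials):
            sent = random.randrange(num_levels)
            received = sent * step + random.gauss(0, noise_rms)
            decided = min(max(round(received / step), 0), num_levels - 1)
            errors += (decided != sent)
        return errors / trials

    for m in (2, 4, 16, 64):
        print(m, "levels ->", error_rate(m))   # error rate climbs as levels get closer together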
Also note that you're not limited to Amplitude Modulation - the same techniques can be applied to Phase Modulated signals (similar to FM), or you can step into the realm of Frequency Shift Keying, where distinct frequencies represent bits (e.g., transmitting '3' in binary might mean sending a 3 kHz sine wave and a 6 kHz sine wave together and separating them at the receiving end, while sending '1' might just be the 3 kHz sine wave).
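Here is a toy version of that multi-tone idea, using the 3 kHz and 6 kHz tones from the example above (the sample rate and symbol length are arbitrary assumptions):

    # Toy multi-tone FSK-style sketch: bit 0 -> 3 kHz tone, bit 1 -> 6 kHz tone.
    import numpy as np

    fs = 48_000                          # sample rate (assumed)
    t = np.arange(int(0.01 * fs)) / fs   # one 10 ms symbol
    tones = [3_000, 6_000]               # one tone per bit position

    def encode(value):
        signal = np.zeros_like(t)
        for bit, freq in enumerate(tones):
            if (value >> bit) & 1:
                signal += np.sin(2 * np.pi * freq * t)
        return signal

    sig_for_3 = encode(3)   # both tones present ('11')
    sig_for_1 = encode(1)   # only the 3 kHz tone ('01')
    # A receiver would separate the tones (e.g. with an FFT) to recover the bits.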
And these techniques are already in wide use - GSM cell phones use a form of Frequency Shift Keying called Gaussian Minimum Shift Keying. Although I do want to correct one incorrect idea you may have: modulation is still used in all of these schemes. The opposite of a modulated signal is a baseband signal (like a bitstream from a serial port). To communicate at any distance over the air you need modulation, period. It's not going away, but how we generate the modulated waveform will change.
I suggest you take a class in Communication Theory if you can - it sounds like you've got the knack for it.
The difference between the two formulas arises from the fact that the Nyquist formula uses the number of encoding levels that was explicitly given (16 levels implies 4 bits/baud), while the Shannon formula is the theoretical maximum based on the SNR of the channel (40 dB implies about 6.64 bits/baud).
3000 Hz × 2 baud/cycle × 4 bits/baud = 24000 bits/sec
3000 Hz × 2 baud/cycle × 6.64 bits/baud = 39840 bits/sec
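Those two figures can be reproduced in a few lines from the numbers given above (3000 Hz channel, 16 levels, 40 dB SNR); the small difference from 39840 bits/sec is just the rounding of the Shannon figure to 6.64 bits/baud:

    # Nyquist rate with 16 levels vs. Shannon limit for a 3000 Hz, 40 dB SNR channel.
    import math

    bandwidth = 3000            # Hz
    levels = 16                 # given encoding levels
    snr = 10 ** (40 / 10)       # 40 dB -> 10 000

    nyquist = 2 * bandwidth * math.log2(levels)   # 2 * 3000 * 4 = 24000 bits/sec
    shannon = bandwidth * math.log2(1 + snr)      # ~39864 bits/sec (39840 with 6.64 bits/baud)
    print(int(nyquist), round(shannon))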
Best Answer
Physics
Specifically the Shannon-Hartley theorem, which (loosely) states that the amount of information you can push through a channel is limited by its bandwidth and its signal-to-noise ratio. In wireless channels, bandwidth is constrained (see below). In wired systems, bandwidth is almost unconstrained, being limited only by manufacturing and cost issues up to tens of GHz.
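To put rough numbers on that, the Shannon-Hartley capacity is C = B·log2(1 + SNR), which scales linearly with bandwidth - exactly what wired links have in abundance. The bandwidths and SNRs below are ballpark assumptions for illustration only, not measurements of any particular product:

    # Shannon-Hartley capacity C = B * log2(1 + SNR).
    import math

    def capacity_bps(bandwidth_hz, snr_db):
        return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

    wifi  = capacity_bps(40e6, 25)    # ~40 MHz channel, modest over-the-air SNR (assumed)
    cable = capacity_bps(5e9, 30)     # multi-GHz usable bandwidth on a short cable (assumed)
    print(f"wireless-ish: {wifi / 1e6:.0f} Mbit/s, wired-ish: {cable / 1e9:.1f} Gbit/s")

With a few GHz of usable bandwidth, even a modest SNR puts the wired figure in the tens of gigabits per second, which is roughly the gap described in the question.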
A wireless signal is regulated by the government
Government restrictions put severe limitations on how much (contiguous) bandwidth is available for unlicensed commercial use. With licensing (billions of dollars), the situation gets marginally better, but it's still nowhere near the free (almost unlimited) bandwidth available in a cable.
Wireless channels are noise/error prone
At a high level, it's like having a conversation across a room with other talkative people in it (wireless), versus holding a cup with a tight string to your ear while your partner speaks into the cup at the other end (wired).
Wireless channels lose information. To counteract this problem, wireless channels use encoding schemes that enable error detection and (in some cases) recovery. In general, error-coding schemes require sparsity (gaps) in the space of possible data sequences so that one valid value isn't undetectably turned into a different valid value. This necessarily hurts throughput (more redundancy = less unique information).
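A minimal illustration of that trade-off is a 3x repetition code (chosen purely because it is the simplest possible scheme, not because wireless standards use it): every data bit costs three channel bits, so useful throughput drops to a third, but a single flipped bit per group can be corrected.

    # 3x repetition code: simplest illustration of redundancy vs. throughput.
    def encode(bits):
        return [b for bit in bits for b in (bit, bit, bit)]   # 3 channel bits per data bit

    def decode(channel_bits):
        out = []
        for i in range(0, len(channel_bits), 3):
            group = channel_bits[i:i + 3]
            out.append(1 if sum(group) >= 2 else 0)           # majority vote
        return out

    tx = encode([1, 0, 1])        # 9 channel bits carry only 3 data bits (rate 1/3)
    tx[4] ^= 1                    # one bit corrupted by the channel
    print(decode(tx))             # -> [1, 0, 1]; the error is corrected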
Overcoming errors, noise, and interference takes power
Wireless microchips, like all microchips, are limited by the amount of power they can consume in a particular area -- a metric known as power density. Wire-line channels are less noisy, less lossy, and therefore far more efficient to transmit into and receive from.
All other things being equal, you could run a wired microchip at much lower power levels for the same bandwidth. Conversely, for the same power level, the wired transceiver could be clocked much faster.