How is 255 Tbit/s processed in optical fiber communication?

Tags: clock-speed, communication, optical-fibre, photonics, ram

I have never understood how the new record breaking data transfer speeds are achieved in terms of converting from/to electrical and optical signals.

Suppose we have 255 Tbits of data and we want to transfer it in one second. (This is a real world achievement.) You have 255 Tbits stored in, let's say, 255 trillion capacitors (that's RAM). Now we are expected to read each one in succession, querying each bit, so that one second later we have read all 255 trillion of them. This is obviously not orchestrated by a 3 GHz processor.

What about the receiving end? Pulses are arriving at 255 THz, yet the electronics trying to read the incoming signal run at nowhere near 255 THz. The only thing I can imagine is thousands of processors with their clock signals time-division multiplexed (delayed) by less than 0.000000000001 s. Although how to achieve such multiplexing also brings me back to my problem with this thousandfold difference in frequencies.

Best Answer

Rather than worrying about a research paper that's pushing things to the limit, first start by understanding the stuff sitting in front of you.

How does a SATA 3 hard drive in a home computer put 6 Gbit/s down a serial link? The main processor isn't 6 GHz and the one in the hard drive certainly isn't, so by your logic it shouldn't be possible.

The answer is that the processors aren't sitting there putting one bit out at a time. There is dedicated hardware called a SERDES (serializer / deserializer) that converts a lower-speed parallel data stream into a high-speed serial one and then back again at the other end. If that works in blocks of 32 bits then the parallel rate is under 200 MHz. That data is then handled by a DMA system that automatically moves it between the SERDES and memory without the processor getting involved. All the processor has to do is tell the DMA controller where the data is, how much to send, and where to put any reply. After that the processor can go off and do something else; the DMA controller will interrupt once it has finished the job.
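As a rough sketch of the serializer / deserializer idea (not any real SERDES or DMA interface, just the bit shuffling written out in plain C), the fast side handles one bit per serial clock while the slow side only ever sees whole 32-bit words, so at 6 Gbit/s in 32-bit blocks the parallel clock works out to 187.5 MHz:

```c
#include <stdint.h>
#include <stdio.h>

/* Toy model of the SERDES idea: the "fast" side shifts one bit per serial
 * clock, the "slow" side only touches whole 32-bit words.
 * A real SERDES is dedicated silicon, not software. */

/* Serializer: emit the 32 bits of one word, MSB first, into a bit buffer. */
static void serialize_word(uint32_t word, uint8_t *bits)
{
    for (int i = 0; i < 32; i++)
        bits[i] = (word >> (31 - i)) & 1u;   /* one bit per serial clock */
}

/* Deserializer: collect 32 serial bits back into one parallel word. */
static uint32_t deserialize_word(const uint8_t *bits)
{
    uint32_t word = 0;
    for (int i = 0; i < 32; i++)
        word = (word << 1) | (bits[i] & 1u);
    return word;
}

int main(void)
{
    const double serial_rate_hz   = 6e9;                 /* SATA 3 line rate */
    const double parallel_rate_hz = serial_rate_hz / 32.0;

    printf("Parallel-side clock: %.1f MHz\n", parallel_rate_hz / 1e6); /* 187.5 MHz */

    uint8_t  line[32];
    uint32_t tx = 0xDEADBEEF;                /* word handed over by the DMA side */
    serialize_word(tx, line);
    uint32_t rx = deserialize_word(line);
    printf("sent 0x%08X, received 0x%08X\n", (unsigned)tx, (unsigned)rx);
    return 0;
}
```

The point of the sketch is only the division of labour: the loop body is what the dedicated hardware does every serial clock, while the code calling it only ever deals in whole words.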

And if the CPU is spending most of its time idle, it could use that time to start a second DMA and SERDES running on a second transfer. In fact one CPU could run quite a few of those transfers in parallel, giving you quite a healthy data rate.

OK, this is electrical rather than optical and it's 50,000 times slower than the system you asked about, but the same basic concepts apply. The processor only ever deals with the data in large chunks, dedicated hardware deals with it in smaller pieces, and only some very specialized hardware deals with it one bit at a time. You then put a lot of those links in parallel.


One late addition to this, hinted at in the other answers but not explicitly explained anywhere, is the difference between bit rate and baud rate. Bit rate is the rate at which data is transmitted; baud rate is the rate at which symbols are transmitted. On a lot of systems the symbols transmitted are binary bits, so the two numbers are effectively the same, which is why there can be a lot of confusion between the two.

However, on some systems a multi-bit encoding is used. If instead of sending 0 V or 3 V down the wire each clock period you send 0 V, 1 V, 2 V or 3 V, then your symbol rate is the same: 1 symbol per clock. But each symbol has 4 possible states and so can carry 2 bits of data. This means your bit rate has doubled without increasing the clock rate.
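As a toy sketch of that four-level example (the 1 Gbaud symbol rate is just an assumed number for illustration, and the voltage mapping is the hypothetical one from the text), packing two bits into each symbol doubles the bit rate at the same clock:

```c
#include <stdint.h>
#include <stdio.h>

/* Pack 2 bits into each 4-level symbol (0..3, i.e. "0 V, 1 V, 2 V or 3 V")
 * and show how the bit rate follows from symbol rate x bits per symbol. */

int main(void)
{
    const double symbol_rate     = 1e9;   /* 1 Gbaud, assumed for the example */
    const int    bits_per_symbol = 2;     /* 4 levels -> log2(4) = 2 bits     */

    printf("bit rate = %.1f Gbit/s\n",
           symbol_rate * bits_per_symbol / 1e9);   /* 2.0 Gbit/s */

    uint8_t data = 0xB4;                  /* bit pairs 10 11 01 00, MSB first */
    for (int i = 3; i >= 0; i--) {
        int symbol = (data >> (2 * i)) & 0x3;      /* one 4-level symbol */
        printf("symbol level %d V carries bits %d%d\n",
               symbol, (symbol >> 1) & 1, symbol & 1);
    }
    return 0;
}
```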

No real world systems that I'm aware of use such a simple voltage level style multi-bit symbol, the maths behind real world systems can get very nasty, but the basic principal remains the same; if you have more than two possible states then you can get more bits per clock. Ethernet and ADSL are the two most common electrical systems that use this type of encoding as does just about any modern radio system. As @alex.forencich said in his excellent answer the system you asked about used 32-QAM (Quadrature amplitude modulation) signal format, 32 different possible symbols meaning 5 bits per symbol transmitted.
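As a back-of-envelope sketch of why that matters (the per-channel symbol rate and channel count below are assumed round numbers, not the figures from the actual 255 Tbit/s experiment), the aggregate rate factors into channels × symbol rate × bits per symbol, with 32-QAM giving log2(32) = 5 bits per symbol:

```c
#include <math.h>
#include <stdio.h>

/* Rough arithmetic only: shows how a huge aggregate bit rate decomposes
 * into many parallel channels, each running at an electronically
 * manageable symbol rate, with several bits packed into every symbol.
 * The symbol rate and channel count are made-up illustrative values. */

int main(void)
{
    const double constellation   = 32.0;                  /* 32-QAM          */
    const double bits_per_symbol = log2(constellation);   /* = 5             */

    const double symbol_rate = 25e9;   /* 25 Gbaud per channel (assumed)     */
    const double channels    = 2000;   /* wavelengths x cores x polarisations (assumed) */

    printf("bits per symbol : %.0f\n", bits_per_symbol);
    printf("aggregate rate  : %.0f Tbit/s\n",
           channels * symbol_rate * bits_per_symbol / 1e12);  /* 250 Tbit/s */
    return 0;
}
```

The takeaway is that no single piece of electronics ever runs at 255 THz; each channel's symbol rate stays within reach of the SERDES-style hardware described above, and the headline number comes from multiplying up the parallelism and the bits per symbol.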
