Electronic – Express bytes over cat5e as energy


I'm interested in this purely from personal interest, I'm not a student.

Can I express 1024 bytes of information travelling over cat5e as joules (or some measure that fits into the energy of E = mc^2)? How?

Would this be radically different if it were travelling through a CPU?

I have no practical application in mind, but ideally we could express the mass of a gigabyte and contrast it with the waste from the latency of an hour-long transfer, or something like that. I'm sure the medium it travels over dictates a lot of this.

Best Answer

Agh .. stirrings .. Shannon ... entropy, channel capacity, information theory, agh ...

You'll be sorry :-)

Short:

The energy per bit to noise power spectral density ratio must be at least the natural logarithm of two: Eb/N0 ≥ ln 2 ≈ 0.693 (about −1.59 dB).
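
To put a number on that: if you treat the noise floor as purely thermal (N0 = kT at roughly room temperature) and assume ideal coding, the bound works out to kT·ln 2 joules per bit. Below is a minimal sketch under those assumptions for the 1024 bytes in the question, including the E = mc^2 mass equivalent (the function names are just illustrative):

```python
from math import log

# Physical constants
k_B = 1.380649e-23   # Boltzmann constant, J/K
c = 2.99792458e8     # speed of light, m/s

def min_energy_joules(n_bits, temperature_kelvin=290.0):
    # Shannon limit Eb/N0 >= ln 2, with a purely thermal noise floor
    # N0 = k_B * T, gives a floor of k_B * T * ln(2) joules per bit
    # (ideal coding assumed).
    return n_bits * k_B * temperature_kelvin * log(2)

def mass_equivalent_kg(energy_joules):
    # Mass equivalent of that energy via E = m * c^2.
    return energy_joules / c**2

n_bits = 1024 * 8  # 1024 bytes
e_min = min_energy_joules(n_bits)
print(f"Minimum energy for 1024 bytes: {e_min:.3e} J")                       # ~2.3e-17 J
print(f"Mass equivalent via E = mc^2:  {mass_equivalent_kg(e_min):.3e} kg")  # ~2.5e-34 kg
```

Note that this is a theoretical floor, not what the hardware actually costs you: real cat5e transceivers dissipate many orders of magnitude more energy per bit than this.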

Long:

Minimum energy to send k bits with and without feedback – Yury Polyanskiy, H. Vincent Poor, and Sergio Verdú

  • Abstract: The question of minimum achievable energy per bit over memoryless channels has been previously addressed in the limit of number of information bits going to infinity, in which case it is known that availability of noiseless feedback does not lower the minimum energy per bit. This paper analyzes the behavior of the minimum energy per bit for memoryless Gaussian channels as a function of the number of information bits. It is demonstrated that in this non-asymptotic regime, noiseless feedback leads to significantly better energy efficiency. A feedback coding scheme with zero probability of block error and finite energy per bit is constructed. For both achievability and converse, the feedback coding problem is reduced to a sequential hypothesis testing problem for Brownian motion.

Wikipedia - Eb/N0 - the energy per bit to noise power spectral density ratio
See section on Shannon limit.

  • The Shannon–Hartley theorem says that the limit of reliable information rate (data rate exclusive of error-correcting codes) of a channel depends on bandwidth and signal-to-noise ratio according to I < B · log2(1 + S/N), where (see the worked sketch after this list):

    • I is the information rate in bits per second excluding error-correcting codes;
    • B is the bandwidth of the channel in hertz;
    • S is the total signal power (equivalent to the carrier power C); and
    • N is the total noise power in the bandwidth.
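
As a rough illustration of plugging numbers into that formula (the bandwidth and SNR below are assumptions for a cat5e-like link, not measured figures):

```python
from math import log2

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    # Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second.
    return bandwidth_hz * log2(1 + snr_linear)

# Illustrative numbers only -- assumed, not measured cat5e figures.
B = 100e6                  # 100 MHz of usable bandwidth (assumption)
snr_db = 30.0              # 30 dB signal-to-noise ratio (assumption)
snr = 10 ** (snr_db / 10)  # convert dB to a linear power ratio

C = shannon_capacity_bps(B, snr)
print(f"Capacity limit: {C / 1e9:.2f} Gbit/s")   # ~1.0 Gbit/s for these numbers
```

With those assumed numbers the limit lands near 1 Gbit/s, which is at least the right ballpark for what cat5e links are specified to carry.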

Useful Wikipedia - Entropy in thermodynamics and information theory

Also Shannon’s Channel Capacity and BER and Notes on Shannon’s Limit

and Shannon's Channel Capacity