A single photon of an electromagnetic wave of wavelength lambda has energy E = h*c/lambda. The energy of a complete wave is computed by multiplying this equation by the wavelength: total energy = E*lambda = h*c = constant.

At 10:26 AM 4/7/2014, meekerdb wrote:
> On 4/7/2014 9:01 AM, Henry Baker wrote:
>> Shannon's theorems re the information-carrying bandwidth are entirely classical, and completely ignore the quantum nature of radio/light waves. Thus, de Broglie tells us that long wavelengths contain many more quanta than short wavelengths. (In fact, the number of quanta is essentially the number of Planck lengths contained in the wavelength.) But Shannon tells us that shorter wavelengths can contain _more_ information than longer wavelengths!
> The energy of a photon is hf = hc/lambda, where lambda is the wavelength. So for total energy E in an EM wave the number of photons is N = E*lambda/(h*c). The number of photons in an EM wave can be any positive integer, regardless of wavelength - it has nothing to do with Planck lengths.
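The photon-count arithmetic above can be checked numerically. A minimal sketch (the constants are the standard CODATA values; the energy and wavelength inputs are illustrative, not from the thread):

```python
# Photon-count arithmetic from the thread: a photon of wavelength lambda
# carries energy E_photon = h*c/lambda, so a wave of total energy E holds
# N = E*lambda/(h*c) photons -- a count set by the energy budget,
# not by the number of Planck lengths in the wavelength.
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s

def photon_count(total_energy_J, wavelength_m):
    """Number of photons of the given wavelength summing to total_energy_J."""
    return total_energy_J * wavelength_m / (h * c)

# One microjoule of 500 nm (green) light vs. 1 m radio waves:
print(photon_count(1e-6, 500e-9))   # ~2.5e12 photons
print(photon_count(1e-6, 1.0))      # ~5.0e18 photons -- more, but still finite
```

Longer wavelengths do give more photons per joule, but the count depends on the total energy, not on any Planck-length subdivision of the wave.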
> Shannon wasn't really talking about wavelengths; he was talking about bandwidth, the rate at which you can switch between 1 and 0. But you're right that he treated it as a purely classical problem. From a quantum perspective it takes longer to detect whether a photon is present (1 or 0) if it is a low-energy photon, so the information transfer rate is lower. But I don't know whether any real devices operate down at this quantum limit (maybe radio telescopes?).
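The classical result being referred to is the Shannon-Hartley theorem, C = B*log2(1 + S/N), where capacity depends on bandwidth and signal-to-noise ratio, with no quantum term. A minimal sketch (the bandwidth and SNR values are illustrative assumptions):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Classical Shannon-Hartley channel capacity in bits/s: C = B*log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative values: a 1 MHz channel at 30 dB SNR (S/N = 1000).
print(shannon_capacity(1e6, 1000))  # ~9.97e6 bits/s
```

Note that wavelength appears nowhere in the formula: only the width of the band and the noise matter, which is exactly the classical treatment being contrasted with the quantum picture.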
I call this the "ultraviolet catastrophe" of information theory.