I believe that Charles Bennett showed that computation of the sort that goes into Mersenne prime-finding need consume _no_ energy -- i.e., unlike "heat engines", computers have no theoretical physical limit on energy efficiency, provided the computation is logically reversible. However, _throwing away information_ -- e.g., erasing a disk -- costs at least kT ln 2 per bit erased (Landauer's principle).

All that having been said, the current (pun intended) generation of CMOS (130nm going down to 90nm) leaks current like a sieve, and the problem is getting worse -- i.e., the traditional CMOS advantage of low power when a circuit is idle is quickly going away. So, although the line widths and voltages are getting smaller, the leakage currents make even low-frequency chips consume lots of power.

Fractal buffs: the total path length of on-chip wires is now on the order of 10 _miles_ for a single little chip!

At 12:45 PM 7/16/2004, Robert Baillie wrote:
Also, is there a bound from information theory that says the generation of X amount of information must consume at least Y watt-hours?
Bob Baillie
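
To put a rough number on Bob's question: Landauer's bound says erasing one bit must dissipate at least kT ln 2 joules, so the minimum energy to destroy X bits of information scales linearly with X. A minimal sketch of the arithmetic (the 300 K room temperature and the 1-terabit erase size are illustrative assumptions, not from the thread):

```python
import math

# Boltzmann's constant in J/K (CODATA exact value) and an
# assumed room temperature of 300 K.
k = 1.380649e-23
T = 300.0

# Landauer's bound: erasing one bit dissipates >= k * T * ln(2) joules.
e_bit = k * T * math.log(2)

# Hypothetical example: irreversibly erasing a 1-terabit disk.
bits = 1e12
joules = bits * e_bit
watt_hours = joules / 3600.0  # 1 Wh = 3600 J

print(f"minimum per bit:      {e_bit:.3e} J")
print(f"erasing 1 Tbit costs: {joules:.3e} J = {watt_hours:.3e} Wh")
```

The striking part is how tiny the bound is: roughly 3e-21 J per bit at room temperature, many orders of magnitude below what leaky CMOS actually dissipates per logic operation.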