Richard Howard <rich@richardehoward.com> wrote:
It seems like going cold is the wrong way to go.
It leaves you cold, huh?
Radiative cooling rate drops as T^4, while energy per erasure drops as T.
Nitpick: In optically thin media, it drops as T^3. That's especially relevant when speaking of computation in the extremely far future. As I said, you'd need very low-density material to minimize self-gravitation and the resulting Unruh effect. Also, you'd need to build the computers out of something other than matter as we know it after all the protons decay. (You can make new protons, but it would require energy you don't have.)
Lose big trying to dump energy to the CMB at 3K.
In the distant future, the CMB will asymptotically approach 0 K. (That reminds me, one of the things on my to-do list is to work out what the present-day universe would look like from an Earth-like planet deep in intergalactic space. Very cold and dark, you say? Sure. But what if the planet is moving fast enough that the CMB in the direction of travel is blue-shifted to be as bright as the sun is from Earth? Would it look anything like the sun? How wide would it be, and what would its color temperature be? How long, in the planet's frame of reference, until the CMB's temperature dropped significantly? Or would the drag from the radiation slow the planet down before that happened?)

To be fair, I've been talking about two very different things: computation in the extremely distant future, and computation in perhaps a century or two, once we approach the kT log(2) limit, which we're nowhere near yet.
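Speaking of that to-do item, here's a first numerical stab in Python. It assumes the standard temperature pattern seen by a boosted observer, T(theta) = T_CMB / (gamma * (1 - beta*cos(theta))), and matches the forward color temperature to the Sun's ~5772 K; matching total flux instead is a separate, harder calculation, so treat the numbers as illustrative:

    import math

    T_CMB = 2.725    # present-day CMB temperature, K
    T_SUN = 5772.0   # Sun's effective temperature, K

    # Forward Doppler factor needed so the blue-shifted CMB matches the
    # Sun's color temperature: D = sqrt((1 + beta) / (1 - beta)).
    D = T_SUN / T_CMB
    beta = (D**2 - 1) / (D**2 + 1)          # solve for beta
    gamma = 1.0 / math.sqrt(1.0 - beta**2)

    # Apparent CMB temperature at angle theta from the direction of travel:
    #   T(theta) = T_CMB / (gamma * (1 - beta * cos(theta)))
    # Setting T(theta) = T(0)/2 gives cos(theta) = 2 - 1/beta.
    theta_half = math.acos(2.0 - 1.0 / beta)

    print(f"beta  = {beta:.9f}")
    print(f"gamma = {gamma:.0f}")
    print(f"hot-spot half-width at half max: {math.degrees(theta_half):.3f} deg")
    print("(the Sun's angular radius from Earth is about 0.27 deg)")

Upshot, unless I've slipped a factor: gamma on the order of a thousand, with the "sun" a patch roughly a fifth the real Sun's apparent width, against a sky red-shifted colder than 2.7 K outside a narrow forward cone. The radiation-drag and time-dilation questions stay on the to-do list.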
On the other hand, Robert Forward may have had it right in Dragon's Egg. A good computer would be built out of neutronium matter and run at the rate of nuclear reactions.
Maybe. As far as anyone knows, neutronium is a liquid even at cryogenic temperatures, let alone at megakelvins, albeit one with an absurdly high surface tension. Also, nuclear reactions are rather unpredictable due to quantum effects, release enough energy to blow the computer apart, and mostly aren't much faster than Drexlerian rod logic.
It is easy to dump waste heat at 10^6 K or higher--T^4 becomes pretty big relative to kT ln(2) per erasure.
Yes, if you want *fast* computations, hot and compact is the way to do it. If neutronium computers are possible at all, they would be a good choice if time is expensive and energy is cheap. Even without nuclear energy, you can get plenty of energy from the neutron star's own heat, using space as the heat sink. Neutronium also has an awesome specific heat, so neutron stars would probably take a few trillion years to cool down appreciably.

If you're a real speed freak, you might prefer Tipler's future to Dyson's. As I mentioned, Dyson claimed it would be possible to do infinite computation in infinite time with finite total energy in an expanding and cooling universe. The computation would get colder and slower without limit, but so what? What's the hurry?

Tipler, by contrast, postulated a contracting and heating universe, one that would end with a Big Crunch. He claimed it would be possible in such a scenario to do infinite computation in a finite time with infinite total energy, by running hotter, faster, and more compactly, without limit. It now appears that the universe will expand and cool forever. But a local Tipler future might be possible by tossing your computer into a large black hole. Too bad it can't convey its results to you unless you travel with it.

A third possible ultimate fate of the universe is the Big Rip, in which the Hubble distance approaches zero, perhaps in just a few billion years, and everything is torn apart. As far as I know, nobody has yet claimed to have found a way to live forever in such a system.
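To put rough numbers on Richard's T^4-versus-kT ln(2) point: the Landauer-limited erasure rate a radiator can sustain scales as sigma*T^4 / (k*T*ln 2) = sigma*T^3 / (k*ln 2). Here's a quick Python sketch; the one-square-meter ideal-blackbody radiator is my simplifying assumption, the constants are standard:

    import math

    SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    K_B = 1.380649e-23       # Boltzmann constant, J/K

    def erasures_per_sec_per_m2(T):
        # Bit erasures per second that one square meter of ideal blackbody
        # at temperature T can pay for: radiated power sigma*T^4 divided
        # by the Landauer cost k*T*ln(2) per erasure. Net scaling: T^3.
        return SIGMA * T**4 / (K_B * T * math.log(2))

    for T in (3.0, 300.0, 1e6):
        print(f"T = {T:>9.0f} K: {erasures_per_sec_per_m2(T):.1e} erasures/s per m^2")

That works out to about 1.6e17 erasures/s per square meter at 3 K versus 5.9e33 at 10^6 K, a factor of roughly 4e16, which is Richard's point in quantitative form. (It ignores radiating into a nonzero-temperature background, the cost of pumping heat to the radiator, and everything else that makes real radiators worse.)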