Re: [math-fun] Could change of base (binary -> ternary) speed up computation?
Richard Howard <rich@richardehoward.com> wrote:
> It seems like going cold is the wrong way to go.
It leaves you cold, huh?
> Radiative cooling rate drops as T^4, while energy per erasure drops as T.
Nitpick: In optically thin media, it drops as T^3. That's especially relevant when speaking of computation in the extremely far future. As I said, you'd need very low-density material to minimize self-gravitation and the resulting Unruh effect. Also, you'd need to build the computers out of something other than matter as we know it after all the protons decay. (You can make new protons, but it would require energy you don't have.)
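The T^4-versus-T tradeoff is easy to put numbers on. A back-of-the-envelope sketch for the blackbody case (the radiator area and the two temperatures are illustrative assumptions, not values from the thread):

```python
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def landauer_energy(temp_k: float) -> float:
    """Minimum energy to erase one bit at temperature T: kT ln 2."""
    return K_B * temp_k * math.log(2)

def radiated_power(temp_k: float, area_m2: float = 1.0) -> float:
    """Blackbody radiated power per Stefan-Boltzmann: sigma * A * T^4."""
    return SIGMA * area_m2 * temp_k ** 4

def max_erasures_per_second(temp_k: float, area_m2: float = 1.0) -> float:
    """Erasure rate limited by the ability to radiate the waste heat.
    Scales as T^4 / T = T^3 in this blackbody case, so a hot radiator
    wins enormously on throughput."""
    return radiated_power(temp_k, area_m2) / landauer_energy(temp_k)
```

For example, the same square meter of radiator supports (10^6 / 3)^3, or about 3.7e16, times more erasures per second at 10^6 K than at 3 K.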
> Lose big trying to dump energy to the CMB at 3 K.
In the distant future, the CMB will asymptotically approach 0 K.

(That reminds me, one of the things on my to-do list is to work out what the present-day universe would look like from an Earth-like planet deep in intergalactic space. Very cold and dark, you say? Sure. But what if the planet is moving fast enough that the CMB in the direction of travel is blue-shifted to be as bright as the sun is from Earth? Would it look anything like the sun? How wide would it be, and what would its color temperature be? How long, in the planet's frame of reference, until the CMB's temperature dropped significantly? Or would the drag from the radiation slow down the planet before that happened?)

To be fair, I've been talking about two very different things: Computation in the extremely distant future and computation in perhaps a century or two once we approach the kT ln 2 limit, which we're nowhere near yet.
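One piece of that to-do-list problem can be started right away: how fast must the planet move for the head-on CMB to be blue-shifted to a solar color temperature? A sketch using the relativistic Doppler formula for a source directly ahead, T' = T0 * sqrt((1+beta)/(1-beta)). This matches color temperature only, not total brightness (the boosted CMB also concentrates into a narrow forward patch, whose angular size is the rest of the exercise); the temperature constants are standard values, not from the post:

```python
import math

T_CMB = 2.725   # present-day CMB temperature, K
T_SUN = 5772.0  # Sun's effective surface temperature, K

def forward_cmb_temperature(beta: float) -> float:
    """Doppler-boosted CMB color temperature directly ahead of a
    planet moving at v = beta * c: T' = T0 * sqrt((1+beta)/(1-beta))."""
    return T_CMB * math.sqrt((1 + beta) / (1 - beta))

def beta_for_forward_temperature(t_target: float) -> float:
    """Invert the head-on Doppler formula: given the desired forward
    temperature, the required Doppler factor is d = T'/T0, and
    beta = (d^2 - 1) / (d^2 + 1)."""
    d = t_target / T_CMB
    return (d ** 2 - 1) / (d ** 2 + 1)

beta = beta_for_forward_temperature(T_SUN)  # extremely close to 1
```

The required Doppler factor is about 2100, so beta differs from 1 by only a few parts in ten million.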
> On the other hand, Robert Forward may have had it right in Dragon's Egg. A good computer would be built out of neutronium matter and run at the rate of nuclear reactions.
Maybe. As far as anyone knows, neutronium is a liquid even at cryogenic temperatures, let alone at megakelvins, albeit one with an absurdly high surface tension. Also, nuclear reactions are rather unpredictable due to quantum effects, release enough energy to blow the computer apart, and mostly aren't much faster than Drexlerian rod logic.
> It is easy to dump waste heat at 10^6 K or higher--T^4 becomes pretty big relative to kT ln 2 per erasure.
Yes, if you want *fast* computations, hot and compact is the way to do it. If neutronium computers are possible at all, they would be a good choice if time is expensive and energy is cheap. Even without nuclear energy, you can get plenty of energy from the neutron star's own heat, using space as the heat sink. Neutronium also has an awesome specific heat, so neutron stars would probably take a few trillion years to cool down appreciably.

If you're a real speed freak, you might prefer Tipler's future to Dyson's. As I mentioned, Dyson claimed it would be possible to do infinite computation in infinite time with finite total energy in an expanding and cooling universe. The computation would get colder and slower without limit, but so what? What's the hurry?

Tipler, by contrast, postulated a contracting and heating universe, one that would end with a Big Crunch. He claimed it would be possible in such a scenario to do infinite computation in a finite time with infinite total energy, by running hotter, faster, and more compactly, without limit.

It now appears that the universe will expand and cool forever. But a local Tipler future might be possible by tossing your computer into a large black hole. Too bad it can't convey its results to you unless you travel with it.

A third possible ultimate fate of the universe is the Big Rip, in which the Hubble distance approaches zero, perhaps in just a few billion years, and everything is torn apart. As far as I know nobody has yet claimed to have found a way to live forever in such a system.
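Dyson's "infinite computation, finite energy" claim reduces to a convergent series once each successive computing stage runs at a geometrically lower temperature: the Landauer cost per stage shrinks as fast as the temperature does. A toy sketch (the starting temperature, cooling ratio, and per-stage bit count are made-up illustrative parameters, not Dyson's actual model):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def dyson_total_energy(t0_k: float, ratio: float,
                       bits_per_stage: float = 1.0) -> float:
    """Total Landauer cost of infinitely many stages, where stage n
    erases `bits_per_stage` bits at temperature T0 * ratio**n with
    0 < ratio < 1.  The per-stage costs k*T0*ratio**n*ln2 form a
    geometric series summing to k * T0 * ln 2 * bits / (1 - ratio):
    infinitely many erasures, finite total energy."""
    assert 0 < ratio < 1
    return K_B * t0_k * math.log(2) * bits_per_stage / (1 - ratio)
```

With T0 = 300 K and each stage half as hot as the last, the grand total for one bit per stage is just 2 * k * 300 * ln 2, under 6e-21 joules, for an unbounded number of steps.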
A cold, slow universe is an interesting concept. I note that no one has mentioned "The Last Question" by Isaac Asimov (1956), where he explored the idea of long times and dying universes. http://www.multivax.com/last_question.html

I wonder when quantum zero-point energy will be large enough relative to the remaining energy in the cold universe to make organized life impossible. We may not have infinite time to do the calculations.

Back to Warren's comment. Likharev built stuff on a chip, but not "CHIPS". You could count the gates without running out of fingers and toes--a long way from billions. If you are interested in a more serious superconducting supercomputer, check out the IBM project. There is a pointer to a summary issue of the IBM journal in 1980 that discusses the technology. It was really a requiem--it was allowed to go quietly into the graveyard of technology dreams. http://www.w2agz.com/Library/Superconductivity/Anacker,%20IBM%20Josephson%20...

_________________________

Clock distribution consumes energy even if you're not using resistors: a (1/2) C V^2 f cost to charge the capacitors. Of course, you could do it adiabatically in theory, but that comes under the "what kind of magic switch do you use in an adiabatic computer" problem we discussed before.

Yes, flash memory uses tunneling, but only as a switch to let charge into capacitive storage. Once the bucket is full, the current stops, independent of the details of the switch. The exact current/voltage characteristic of the tunnel junction is not used except to bound the write time. In a Josephson junction gate, the critical current depends exponentially on that tunnel characteristic. Bad news if you are making a billion.

Keith--I was thinking of neutronium in the Robert Forward sense--on the surface of a neutron star. Gravitational energy is comparable to the nuclear binding forces--I don't see why it would be unpredictable or threaten to blow up the computer.
After all, all of our matter is held together by quantum forces (everything scaled down), and our computers don't often blow up. Aside from being a bit toasty, the reactions should be well behaved. --R

On Wed, Jan 9, 2019 at 11:16 PM Keith F. Lynch <kfl@keithlynch.net> wrote:
[quoted message trimmed]
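Richard's clock-distribution cost above, (1/2) C V^2 per charging event, is easy to put numbers on. A minimal sketch, with component values that are illustrative assumptions rather than figures from the thread:

```python
def clock_energy_per_charge(c_farads: float, v_volts: float) -> float:
    """Energy dissipated in the driver when charging a capacitive
    clock net to V through a resistive switch: (1/2) * C * V^2."""
    return 0.5 * c_farads * v_volts ** 2

def clock_power(c_farads: float, v_volts: float, f_hertz: float) -> float:
    """Dissipation from charging the net once per cycle at frequency f:
    (1/2) * C * V^2 * f.  Discharging dissipates another (1/2) C V^2
    per cycle, unless done adiabatically."""
    return clock_energy_per_charge(c_farads, v_volts) * f_hertz

# Illustrative (assumed) numbers: a 1 nF distributed clock net at
# 1 V and 1 GHz dissipates about 0.5 W in charging alone,
# independent of the switch technology.
power_watts = clock_power(1e-9, 1.0, 1e9)
```

The point of the sketch: the cost depends only on C, V, and f, which is why the details of the tunnel-junction switch don't enter for capacitive storage, while a Josephson gate's critical current does depend on them.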
_______________________________________________ math-fun mailing list math-fun@mailman.xmission.com https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun
participants (2)
- Keith F. Lynch
- Richard Howard