[math-fun] biggest flop of all time
passed along from the Mersenne Prime mailing list ...

Rich

---------------------- List-Post: <mailto:prime@hogranch.com>

Did anyone notice we are coming up on a new prefix? GIMPS has performed over 800 quintillion floating point operations in the seven years of its existence, 800 exaflop.

I calculate that in about four months, we will have collectively performed a sextillion floating point operations, allowing me to legitimately use a prefix I have never actually seen in use, even to describe the absurdly large quantities often associated with computing. GIMPS will soon complete the first zettaflop.

spike
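A quick back-of-the-envelope check of that extrapolation, in Python. The 800 exaflop and four-month figures are taken from the message above at face value; the implied aggregate rate is an inference, not a number anyone in the thread quoted.

    # Back-of-the-envelope check of the zettaflop extrapolation above.
    # Assumption (not from the thread): GIMPS sustains a single aggregate
    # rate over the remaining four months.

    total_so_far = 800e18                 # 800 quintillion operations so far
    zettaflop = 1e21                      # one sextillion operations
    remaining = zettaflop - total_so_far  # 2e20 operations still to go

    four_months_s = 4 * 30 * 24 * 3600    # roughly four months, in seconds
    implied_rate = remaining / four_months_s

    seven_years_s = 7 * 365 * 24 * 3600
    average_rate = total_so_far / seven_years_s

    print(f"implied rate over the next four months : {implied_rate:.1e} flop/s")
    print(f"average rate over the first seven years: {average_rate:.1e} flop/s")

If the four-month estimate holds, GIMPS's aggregate throughput at the time was several times its seven-year average, which is what one would expect from a steadily growing volunteer project.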
When we get a thousand zettaflops, we will have a yottaflop. I believe the next prefix after that hasn't even been defined yet.

I'd also be curious to know how much energy this 800 exaflops has consumed. Recently, when I started a pure-cpu calculation (no disk accesses), the power consumption on my laptop increased by 12 watts (from 23 to 35). I was surprised it was that much.

Also, is there a bound from information theory that says the generation of X amount of information must consume at least Y watt-hours?

Bob Baillie

At 03:10 PM 7/16/2004, you wrote:
passed along from the Mersenne Prime mailing list ...
Rich
---------------------- List-Post: <mailto:prime@hogranch.com>
Did anyone notice we are coming up on a new prefix? GIMPS has performed over 800 quintillion floating point operations in the seven years of its existence, 800 exaflop.
I calculate that in about four months, we will have collectively performed a sextillion floating point operations, allowing me to legitimately use a prefix I have never actually seen in use, even to describe the absurdly large quantities often associated with computing. GIMPS will soon complete the first zettaflop.
spike
I believe that Charles Bennett showed that computation of the sort that goes into Mersenne prime-finding need consume _no_ energy -- i.e., unlike "heat engines", computers have no theoretical physical limit on energy efficiency. However, _throwing away information_ -- e.g., erasing a disk -- takes O(kT) per bit (a numeric sketch of that bound follows below).

All that having been said, the current (pun intended) generation of CMOS (130nm going down to 90nm) leaks current like a sieve, and this problem is getting worse -- i.e., the traditional CMOS advantage of low power when a circuit is not being used is quickly going away. So, although the line-widths and voltages are getting smaller, the leakage currents make even low-frequency chips consume lots of power.

Fractal buffs: the total path length of on-chip wires is now on the order of 10 _miles_ for a single little chip!

At 12:45 PM 7/16/2004, Robert Baillie wrote:
Also, is there a bound from information theory that says the generation of X amount of information must consume at least Y watt-hours?
Bob Baillie
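The bound asked about just above is Landauer's principle, which is what the O(kT)-per-bit remark refers to: erasing one bit at temperature T dissipates at least kT ln 2, while reversible computation (Bennett) can in principle avoid even that. A minimal sketch in Python; the 300 K temperature and the one-erased-bit-per-operation count are illustrative assumptions, not figures from the thread.

    import math

    # Landauer's principle: erasing one bit at temperature T dissipates at
    # least k*T*ln(2). Bennett's point is that reversible computation can in
    # principle avoid even this, since nothing need be erased.

    k = 1.380649e-23       # Boltzmann constant, J/K
    T = 300.0              # assumed room temperature, K
    bits_erased = 1e21     # illustrative: one erased bit per operation

    per_bit = k * T * math.log(2)      # about 2.9e-21 J
    total_joules = per_bit * bits_erased

    print(f"minimum energy per erased bit: {per_bit:.2e} J")
    print(f"for 1e21 erased bits: {total_joules:.1f} J "
          f"({total_joules / 3600:.1e} watt-hours)")

So the thermodynamic floor for a zettaflop's worth of bit erasures is only a few joules; the wattages measured on real machines sit many orders of magnitude above it.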
At 03:45 PM 7/16/2004, Robert Baillie wrote:
I'd also be curious to know how much energy this 800 exaflops has consumed. Recently, when I started a pure-cpu calculation (no disk accesses), the power consumption on my laptop increased by 12 watts (from 23 to 35). I was surprised it was that much.
On my P4 desktop, it seems to be 50-55 watts.
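To put a very rough number on the energy question, here is a sketch that combines the incremental wattages reported in this thread with an assumed per-machine throughput of about 1 gigaflop/s for Prime95-style FFT work on hardware of that era. Both that throughput figure and the resulting total are assumptions for illustration, not measurements.

    # Very rough energy estimate for 800 exaflop of GIMPS work.
    # Assumptions (not from the thread): ~50 W incremental draw per machine,
    # ~1e9 flop/s sustained per machine on the FFT workload.

    total_ops = 800e18            # 800 quintillion operations
    watts_per_machine = 50.0      # incremental draw, per the P4 figure above
    flops_per_machine = 1e9       # assumed sustained throughput per machine

    joules_per_flop = watts_per_machine / flops_per_machine   # 5e-8 J
    total_joules = total_ops * joules_per_flop
    total_kwh = total_joules / 3.6e6

    print(f"assumed energy per operation: {joules_per_flop:.1e} J")
    print(f"estimated total: {total_joules:.1e} J (about {total_kwh:.1e} kWh)")

Under those assumptions the total comes out around 10^7 kWh. The figure scales inversely with the assumed per-machine throughput, but any plausible value leaves it enormously far above the Landauer floor sketched earlier.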
Date: Fri, 16 Jul 2004 12:10:42 -0700 (MST)
From: Richard Schroeppel <rcs@CS.Arizona.EDU>
passed along from the Mersenne Prime mailing list ...
Rich
---------------------- List-Post: <mailto:prime@hogranch.com>
Did anyone notice we are coming up on a new prefix? GIMPS has performed over 800 quintillion floating point operations in the seven years of its existence, 800 exaflop.
That's a bit more than a millimole of computation.
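A quick check of that conversion, dividing the operation count by Avogadro's number:

    # 800 quintillion operations, expressed in moles.
    avogadro = 6.02214076e23
    ops = 800e18

    moles = ops / avogadro
    print(f"{ops:.0e} operations = {moles:.2e} mol (about 1.3 millimoles)")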
participants (6)

- Henry Baker
- John McCarthy
- Jud McCranie
- Richard Schroeppel
- Robert Baillie
- Scott Huddleston