Sad news indeed ... but his work lives on, and fascinating it was.

To work out what 1,200 hours of 1976 computing is today, maybe one can just divide by four for every three years. The dozen three-year periods since 1976 make that a factor of 0.25^12 ... which takes 1,200 hours down to less than one second.

However, the algorithm admitted of 'embarrassing parallelism', whereby many independent parts of the proof (one for each subsetted subgraph) could be carried out in parallel. So, one need not wait even one second if there's a hurry on!

Guy
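As a quick sanity check, Guy's scaling works out as follows in Python. This is only the arithmetic from the message above (divide by four every three years, with 1976 to 2012 taken as twelve three-year periods), not a claim about any real machine:

    # 1,200 hours of 1976 compute, shrunk by a factor of four every
    # three years; 1976 -> 2012 spans twelve three-year periods.
    hours_1976 = 1200
    periods = (2012 - 1976) // 3           # = 12
    scale = 0.25 ** periods                # about 5.96e-8
    seconds_now = hours_1976 * 3600 * scale
    print(f"{seconds_now:.3f} s")          # about 0.257 s -- under a second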
So, is there some modern source-code implementation of this computation that a person can play with?

On Apr 29, 2013, at 9:06 AM, Guy Haworth <g.haworth@reading.ac.uk> wrote:
Sad news indeed ... but his work lives on, and fascinating it was.
Sounds like a nice undergrad thesis to re-implement this computation using modern programming styles in a modern programming language.

At 09:34 AM 4/29/2013, Thane Plambeck wrote:
So, is there some modern source-code implementation of this computation that a person can play with?
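One way such a re-implementation might exploit the parallelism Guy described is to farm the independent configuration checks out to a pool of workers. A minimal sketch, assuming a hypothetical check_configuration routine; the actual reducibility computation, which is the hard part, is elided, and only the count of 1,936 configurations is taken from the original 1976 proof:

    # Skeleton of an embarrassingly parallel verification harness: each
    # configuration is checked independently, so the tasks map cleanly
    # onto worker processes. check_configuration is a hypothetical
    # stand-in, not code from the actual proof.
    from multiprocessing import Pool

    def check_configuration(config_id: int) -> bool:
        """Placeholder: verify reducibility of one configuration."""
        return True  # a real implementation would do the check here

    def verify_all(config_ids, workers: int = 8) -> bool:
        """Run every configuration check in parallel; True iff all pass."""
        with Pool(processes=workers) as pool:
            return all(pool.map(check_configuration, config_ids))

    if __name__ == "__main__":
        print(verify_all(range(1936)))  # 1,936 configurations in 1976

Since the checks share no state, the same shape carries over to any modern parallel framework, which is why one need not wait even the one second of Guy's estimate.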
For those of you who are wondering what a modern reworking of the 4-color-theorem proof might look like, see Georges Gonthier's 2005 "A computer-checked proof of the Four Colour Theorem" here:

http://research.microsoft.com/en-us/um/people/gonthier/4colproof.pdf

In particular, it is noted on page 10: "... the version of the Coq system we used needed three days to check our proof, whereas Robertson et al. only needed three hours... ten years ago! (Future releases of Coq should cut our time back to a few hours, however.)"
I never saw what the big deal was about this computer proof. For many years before 1976, computers had been the only entities that were capable of verifying the primality of large Mersenne numbers.

Bob Baillie
My impression of this debate is that the computer gives a proof in the sense of a certificate of truth, at least as reliable as those produced by humans. But what we often want in a "proof" is a sense of _why_ something is true --- this is why we often try to find multiple proofs of the same thing, and why we want "proofs from the book". It seems we don't yet have this kind of proof for the 4-color theorem.

On the other hand, I have heard (does anyone have a reference?) that while God has a book of the most elegant and insightful proofs, the Devil has a book of mathematical truths that are easy to state, but which have no proof in God's book. Perhaps the 4-color theorem is one of these...

Cris

On Apr 29, 2013, at 11:37 AM, Robert Baillie wrote:
I never saw what the big deal was about this computer proof.
If you believe the results of complexity theory, the Devil's Book is a helluva lot bigger than God's Book. Perhaps this is what depressed Cantor so much. (Cantor) Dust to dust.

At 10:55 AM 4/29/2013, Cris Moore wrote:
On the other hand, I have heard (does anyone have a reference?) that while God has a book of the most elegant and insightful proofs, the Devil has a book of mathematical truths that are easy to state, but which have no proof in God's book. Perhaps the 4-color theorem is one of these...
I think people trusted that computers could safely do arithmetic operations. (Though later chip problems have shown this trust needs to be earned!) But a proof based on running a long computer program is hard to check. I think the problem people had with the proof was that they didn't know how to verify that the computer did what it had to do.

For example, the Kepler conjecture was presumably proved in 1998 by Thomas Hales with a complicated proof that partly relied on a similarly long computer program. I think that, after 10 years, a committee of mathematicians was able to give its imprimatur to the non-computer part of the proof, but gave up on trying to verify that the computer did what it was supposed to do.

--Dan

On 2013-04-29, at 10:37 AM, Robert Baillie wrote:
I never saw what the big deal was about this computer proof.
This is from a note I received from Andrew Appel this morning:
You might be interested in this development, all done in the pure functional programming language inside the Coq theorem prover (and proved correct):
http://www.ams.org/notices/200811/tx081101382p.pdf
My mother still has several boxes of original documents from the proof; she was thinking of depositing them in the archives of the U. of Illinois library.
At 10:11 AM 4/29/2013, Hans Havermann wrote:
For those of you who are wondering what a modern reworking of the 4-color-theorem proof might look like, see Georges Gonthier's 2005 "A computer-checked proof of the Four Colour Theorem" here:
http://research.microsoft.com/en-us/um/people/gonthier/4colproof.pdf
Dear All,

At about the time that A & H did their stuff, I had a post-doc (at my age names disappear) who was working independently on the 4CC. A & H beat him by several months, but this chap's decomposing (?? again, can't remember the word) system was somewhat more efficient than A & H's, so that his machine time was only about half of theirs.

Incidentally, in connexion with the argument as to whether it's a proof or not, this independent arrival of the result dispels a good deal of the doubt.

R.

On Mon, 29 Apr 2013, Henry Baker wrote:
Sounds like a nice undergrad thesis to re-implement this computation using modern programming styles in a modern programming language.
participants (8)
- Cris Moore
- Dan Asimov
- Guy Haworth
- Hans Havermann
- Henry Baker
- Richard Guy
- Robert Baillie
- Thane Plambeck