[math-fun] Computational effort
I don't know what the relevant units of computation are --- floating-point operations are the ones I've heard about most, but what if you're not using floating point? --- but I'm wondering if anyone has tried to quantify "how much computing" took place in (a) code-breaking at Bletchley Park, (b) the Manhattan Project, and (c) the Apollo space program.

It seems likely to me that my laptop, running some silly number-crunching problem in Mathematica and returning an answer after six hours that I could probably have figured out in my head if I'd thought a little bit more before jumping in to write some code, has them all beat. What do you think?

(And while we're on the subject: What's the canonical example of wasted computational effort? Isn't there some well-known old story of someone who had an early computer calculate the determinant of a singular matrix?)

Jim Propp
I don't know what the relevant units of computation are --- floating-point operations are the ones I've heard about most, but what if you're not using floating point? --- but I'm wondering if anyone has tried to quantify "how much computing" took place in (a) code-breaking at Bletchley Park, (b) the Manhattan Project, and (c) the Apollo space program.
I don't know the answers to these questions, but I suspect that at least the first two are well documented. At BP, there are two kinds of mechanical computation that might account for most of the computing. Bombes were doing table lookups, a sort of counter increment, and compares at mechanical speeds, tens of comparisons a second. Colossi (both the fastest and the most numerous of the anti-Fish machines) ran at five thousand characters a second and were doing correlation. That's about three instructions per character (irresponsibly ignoring the I/O): a xor, a popcount, and an add. There were hundreds of bombes running twenty-four hours a day for two to three years. There were about ten Colossi, and they may have run for an average of a year each at what seems like 10K instructions per second.

Manhattan Project computing was largely done by groups of people with electromechanical calculators. I don't know how to estimate the total, but I suspect it is carefully discussed somewhere, because those people were closely associated with the computers at the University of Pennsylvania and Princeton that did post-war nuclear calculations.
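Plugging the figures quoted above into a quick back-of-envelope sketch (the machine counts and rates here are just the rough numbers from the post, with "hundreds of bombes" guessed at 200 and "two to three years" taken as 2.5):

```python
# Back-of-envelope estimate of total Bletchley Park machine operations,
# using the rough figures quoted in the post above.
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = 365 * SECONDS_PER_DAY

# Bombes: assume ~200 machines doing ~30 comparisons/second,
# round the clock, for ~2.5 years (all assumed values).
bombe_ops = 200 * 30 * 2.5 * SECONDS_PER_YEAR

# Colossi: ~10 machines at ~10K instructions/second for ~1 year each.
colossus_ops = 10 * 10_000 * SECONDS_PER_YEAR

total_ops = bombe_ops + colossus_ops
print(f"Bombes:  ~{bombe_ops:.1e} ops")
print(f"Colossi: ~{colossus_ops:.1e} ops")
print(f"Total:   ~{total_ops:.1e} ops")

# A laptop doing ~1e9 simple integer ops/second would replay all of it in:
print(f"Laptop replay time: ~{total_ops / 1e9 / 60:.0f} minutes")
```

On these assumptions the wartime total comes out around a few times 10^12 operations, which a modern laptop covers in about an hour, consistent with Jim's hunch.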
It seems likely to me that my laptop, ...
Or perhaps your phone or your watch. Whit
I'm wondering if anyone has tried to quantify "how much computing" took place ...
I would like to know the answer to a related question. What does a workfactor of $2^{128}$ (hereinafter called an oodle) mean? How does it compare to World War II or the Apollo project? The problem is that the best strategy for spending the money is developmental. If you took the cost of WWII and spent it all on developing computers, could you have developed the ability to count to an oodle? By when? If not, how high could you have counted, and by when? Whatever result you get, it doesn't seem like a very satisfactory solution to the problem. Whit
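For a sense of scale, here is a small sketch of how an oodle compares even to modern hardware (the 1e18 ops/second exascale rate is an assumed benchmark, far beyond anything wartime money could have bought):

```python
# How big is an "oodle" (a workfactor of 2**128)?
oodle = 2 ** 128

# Assume an exascale machine: 1e18 operations per second.
seconds = oodle / 1e18
years = seconds / (365 * 86_400)

print(f"oodle = 2**128 ≈ {oodle:.2e}")
print(f"Counting to an oodle at 1e18 ops/s: ~{years:.1e} years")

# For scale: the universe is roughly 1.4e10 years old.
print(f"That is ~{years / 1.4e10:.0f} ages of the universe")
```

Even granting hardware no amount of 1940s spending could produce, counting to an oodle takes on the order of 10^13 years, hundreds of universe-ages, which is part of why the question resists a satisfying answer.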
Funster Whit Diffie just won the Turing Award. Congratulations, Whit!
Woot! On Tue, Mar 1, 2016 at 1:05 PM, Tom Knight <tk@ginkgobioworks.com> wrote:
Funster Whit Diffie just won the Turing Award. Congratulations, Whit!
_______________________________________________ math-fun mailing list math-fun@mailman.xmission.com https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun
-- Mike Stay - metaweta@gmail.com http://www.cs.auckland.ac.nz/~mike http://reperiendi.wordpress.com
participants (4)
- James Propp
- Mike Stay
- Tom Knight
- Whitfield Diffie