I don't know what the relevant units of computation are --- floating-point operations are the ones I've heard about most, but what if you're not using floating point? --- but I'm wondering if anyone has tried to quantify "how much computing" took place in (a) code-breaking at Bletchley Park, (b) the Manhattan Project, and (c) the Apollo space program. It seems likely to me that my laptop, running some silly number-crunching problem in Mathematica and returning an answer after six hours that I could probably have figured out in my head if I'd thought a little bit more before jumping in to write some code, has them all beat. What do you think?

(And while we're on the subject: What's the canonical example of wasted computational effort? Isn't there some well-known old story of someone who had an early computer calculate the determinant of a singular matrix?)

Jim Propp
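P.S. To show the sort of back-of-envelope arithmetic I have in mind, here's a little Python sketch. Every rate and duration in it is a rough, illustrative assumption on my part (a laptop sustaining somewhere around 10^10 operations per second, and each historical effort credited with a few years of machine time at operation rates of the right order for machines of its era), not a measured or sourced figure; the point is only the shape of the estimate.

# Back-of-envelope comparison: total operations in six hours on my laptop
# versus a whole historical project.  EVERY number below is a rough,
# illustrative assumption, not a measured or sourced figure.

LAPTOP_OPS_PER_SEC = 1e10        # assumed: ~10 GFLOPS sustained
LAPTOP_SECONDS = 6 * 3600        # the six-hour Mathematica run

laptop_total = LAPTOP_OPS_PER_SEC * LAPTOP_SECONDS

# (assumed ops/sec, assumed seconds of machine time) for each effort;
# the machine-years are guesses meant only to set the scale.
YEAR = 365 * 24 * 3600
projects = {
    "Bletchley Park":    (5e3, 3 * YEAR),
    "Manhattan Project": (5e3, 3 * YEAR),
    "Apollo program":    (1e5, 8 * YEAR),
}

print(f"laptop, six hours: {laptop_total:.1e} ops")
for name, (rate, seconds) in projects.items():
    total = rate * seconds
    print(f"{name:>18}: {total:.1e} ops  (laptop/project ~ {laptop_total / total:.0f}x)")

Of course, multiplying two made-up numbers together mostly just restates my prejudices, which is exactly why I'd like to know whether anyone has done the estimate carefully.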