I don't know what the relevant units of computation are --- floating-point operations are the ones I've heard about most, but what if you're not using floating point? --- but I'm wondering if anyone has tried to quantify "how much computing" took place in (a) code-breaking at Bletchley Park, (b) the Manhattan Project, and (c) the Apollo space program.
I don't know the answers to these questions, but I suspect that at least the first two are well documented.

At BP, there are two kinds of mechanical computation that might account for most of the computing. Bombes were doing table lookups, a sort of counter increment, and compares at mechanical speeds: tens of comparisons a second. Colossi (both the fastest and the most numerous of the anti-Fish machines) ran at five thousand characters a second and were doing correlation. That's about three instructions per character (irresponsibly ignoring the I/O): a xor, a popcount, and an add. There were hundreds of bombes, run twenty-four hours a day for two to three years. There were about ten colossi, and they may have run for an average of a year each at what works out to roughly 15K instructions per second.

Manhattan Project computing was largely done by groups of people with electromechanical calculators. I don't know how to estimate the total, but I suspect it is carefully discussed somewhere, because those people were closely associated with the computers at the University of Pennsylvania and Princeton that did post-war nuclear calculations.
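To put rough numbers on the BP side, here is a back-of-envelope sketch of the totals implied by the figures above. Where the text only says "hundreds" or "tens," the exact counts below (200 bombes, 30 comparisons/second) are my assumptions, not documented figures:

```python
# Back-of-envelope totals for Bletchley Park computing.
# All rates and counts are rough estimates; see the caveats above.
SECONDS_PER_YEAR = 86_400 * 365

# Bombes: "hundreds" of machines (assume ~200), "tens of comparisons
# a second" (assume ~30), run around the clock for ~2.5 years.
bombe_count = 200
bombe_ops_per_sec = 30
bombe_years = 2.5
bombe_total = bombe_count * bombe_ops_per_sec * SECONDS_PER_YEAR * bombe_years

# Colossi: ~10 machines, 5,000 characters/second at ~3 instructions
# per character (xor, popcount, add), averaging ~1 year of running each.
colossus_count = 10
colossus_ops_per_sec = 5_000 * 3
colossus_years = 1
colossus_total = colossus_count * colossus_ops_per_sec * SECONDS_PER_YEAR * colossus_years

print(f"Bombes:  ~{bombe_total:.1e} operations")
print(f"Colossi: ~{colossus_total:.1e} operations")
```

Under these assumptions both totals land in the 10^11 to 10^12 range, with the colossi contributing roughly an order of magnitude more operations than the bombes despite there being far fewer of them.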
It seems likely to me that my laptop, ...
Or perhaps your phone or your watch. Whit