At 08:31 PM 4/7/2010, James Propp wrote:
There must be existing literature on computers that can be made to run faster and faster.
There is -- it's called computing with "oracles". Assume that you have an "oracle" (like the Oracle of Delphi, get it?) which can answer some complex question in a single step; the question then becomes: what is the computational complexity of a given task relative to that oracle? I don't recall ever having seen an oracle that does infinite carry-lookahead or other infinite-precision arithmetic in one step. I would first try to convince myself that it isn't such a powerful operation that computing with it becomes trivial.

Vaughan Pratt has an interesting paper from the early 1970s about computing with infinite bitvectors ("vector machines", I believe he called them), where each bitvector operation takes one step. He shows that this enables certain problems to be sped up quite a bit. I'm sure that this paper is on his web site somewhere.

Pratt also discussed solving combinatorial problems of deterministic complexity O(2^n) in polynomial time by exploiting Moore's Law: you simply wait long enough for Moore's Law to work its magic, then build a machine which is large enough & fast enough to solve the problem quickly. The constant is pretty large, but we've effectively been using this technique on combinatorial problems for the past 40 years. I don't know if Pratt ever wrote the Moore's Law Speedup algorithm down in a paper, though.

As Gosper well knows, there are "siphon" algorithms for infinite-precision real arithmetic that compute only enough bits to answer the particular question posed. Using these techniques, one can write down the answer to a question, but actually obtaining the bits of the answer may be somewhat more difficult. I've appended a few toy sketches below illustrating each of these ideas.
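First, a toy Python sketch of oracle accounting (my own illustration, not from any of the literature mentioned; the subset-sum oracle and the search-to-decision trick are standard fare). Each oracle query is charged as one step no matter how much work happens inside it, and n+1 queries to a decision oracle suffice to construct an actual solution:

    from itertools import combinations

    def subset_sum_oracle(items, target):
        # Decision oracle: is there a subset of items summing to target?
        # Brute force inside, but charged as a single step.
        return any(sum(c) == target
                   for r in range(len(items) + 1)
                   for c in combinations(items, r))

    def find_subset(items, target):
        # Search via the oracle: at most len(items) + 1 oracle calls.
        if not subset_sum_oracle(items, target):
            return None
        chosen, pool = [], list(items)
        while pool:
            x, rest = pool[0], pool[1:]
            if subset_sum_oracle(rest, target):
                pool = rest               # a solution exists without x
            else:
                chosen.append(x)          # every remaining solution uses x
                target -= x
                pool = rest
        return chosen

    print(find_subset([3, 5, 7], 10))     # [3, 7], using only 4 oracle calls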
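Second, a hint of why unit-cost infinite bitvectors are so powerful (my own toy, in the spirit of Pratt's result rather than taken from his paper). Python's unbounded integers behave like such bitvectors, and charging one step per shift hides explosive growth:

    x = 1
    for step in range(1, 5):
        x = x << x                        # one unit-cost "vector" operation
        print("after step", step, ":", x.bit_length(), "bits")
    # After 4 steps the register is already 2060 bits wide; one more
    # shift would make it about 2^2059 bits -- still "one step" on paper.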
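Third, the Moore's Law Speedup as a back-of-the-envelope calculation (all numbers mine and purely illustrative). For a 2^n-step problem, the total time is wait + work/speed(wait), and with speed doubling every 18 months there is an optimal amount of procrastination:

    def total_years(wait, n, ops_per_year=3.15e16):  # ~1 GHz machine today
        speed = ops_per_year * 2 ** (wait / 1.5)     # 18-month doublings
        return wait + 2 ** n / speed

    n = 90                                           # a 2^90-step problem
    best = min(range(200), key=lambda w: total_years(w, n))
    print("start now:", total_years(0, n), "years")          # ~4e10 years
    print("wait", best, "years:", total_years(best, n))      # ~53 years

The lesson is that for this (made-up) problem size, waiting about 51 years and then running the job beats starting today by a factor of roughly a billion.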
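Finally, a toy of the "siphon" idea (my own construction; Gosper's actual machinery uses continued fractions, which this is not). Represent a real x as a function that, given k, returns an integer m with |x - m/2^k| <= 2^-k; then a question like "is x < y?" pulls bits only until the two approximating intervals separate:

    import math

    def sqrt(n):                  # floor(sqrt(n) * 2^k) via integer isqrt
        return lambda k: math.isqrt(n * 4 ** k)

    def add(x, y):                # two guard bits keep the error <= 2^-k
        return lambda k: (x(k + 2) + y(k + 2) + 2) // 4

    def less_than(x, y, max_bits=1000):
        # Decide x < y by demanding just enough bits (gives up on x == y).
        for k in range(max_bits):
            a, b = x(k), y(k)
            if a + 2 < b:         # the two intervals have separated
                return True
            if b + 2 < a:
                return False
        raise ValueError("the numbers agree to max_bits bits")

    # Is sqrt(2) + sqrt(3) < sqrt(10)?  Settled after only about 8 bits.
    print(less_than(add(sqrt(2), sqrt(3)), sqrt(10)))        # True

Note the catch mentioned above: if the two numbers happen to be equal, no finite number of bits settles the comparison, which is exactly the sense in which "actually obtaining the bits of the answer may be somewhat more difficult".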