HGB> Some non-floating-point units of computational effort:
HGB>   Bitcoin POW (SHA-256 hashes/second)
HGB>   Other POWs: https://en.wikipedia.org/wiki/Proof-of-work_system
HGB>   Password cracking
HGB> Most of these problems are "embarrassingly parallel". Perhaps someone here
HGB> has some ideas for a non-embarrassingly-parallel unit of computational
HGB> effort. <hgb

How about the Minsky Stock Index?

Subj: What is the period of this integer sequence?
On Sun, May 4, 2014 at 10:37 AM, Bill Gosper <billgosper@gmail.com> wrote:
0, 1, 7, -2, -8, 3, 14, -4, -16, 5, 21, -6, -24, 7, 28, -7, -25, 7, 27, -7, -26, 7, 26, -6, -19, 5, 18, -4, -12, 3, 10, -2, -5, 1, 2, 0, 2, -1, -6, 3, 16, -5, -22, 7, 30, -8, -30, 8, 30, -7, -23, 6, 22, -5, -16, 4, 14, -3, -9, 2, 6, -1, -2, 1, 5, -1, -3, 1, 4, -1, ...
a[0] = 0; a[1] = 1; a[k_?OddQ] := a[k] = a[k - 2] - Floor[9*a[k - 1]/17]; a[k_?EvenQ] := a[k] = a[k - 2] + Floor[15*a[k - 1]/2]
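For anyone without Mathematica, here's a minimal Python sketch of the same recurrence (my translation, not from the original post; Python's // is floor division, so it agrees with Floor even on negative arguments):

```python
def minsky_terms(n):
    """Return [a[0], a[1], ..., a[n-1]] for the Minsky Stock Index.

    Odd index:  a[k] = a[k-2] - floor(9*a[k-1]/17)
    Even index: a[k] = a[k-2] + floor(15*a[k-1]/2)
    """
    a = [0, 1]
    for k in range(2, n):
        if k % 2 == 1:  # odd index
            a.append(a[k - 2] - (9 * a[k - 1]) // 17)
        else:           # even index
            a.append(a[k - 2] + (15 * a[k - 1]) // 2)
    return a[:n]
```

The first ten values reproduce the sequence above: 0, 1, 7, -2, -8, 3, 14, -4, -16, 5.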
This is the "Minsky Stock Index" that Corey and Julian ran fruitlessly from a[-10^14] to a[10^14] a few years ago.

"If you haven't looked at a problem in the last few years, you haven't looked at it." --Ed Pegg, Jr.

There is a slight chance that the period is infinite. (When it reached 18 trillion, we exclaimed AT&T!)
The above recursive definition crashes Mathematica in under a million terms, even with memoizing. Instead use the iteration

In[115]:= NestList[Function[xy, {#, xy[[2]] + Floor[15*#/2]} & [xy[[1]] - Floor[9*xy[[2]]/17]]], {1, 0}, 35]

which yields the pairs {a[-1], a[0]}, {a[1], a[2]}, ..., {a[69], a[70]}:

Out[115]= {{1, 0}, {1, 7}, ..., {-1, -4}}
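The same iteration is easy to state elsewhere; here is a Python sketch (my translation of the NestList update, where each pair holds (a[2k-1], a[2k]), starting from (a[-1], a[0]) = (1, 0)):

```python
def minsky_pairs(steps):
    """Iterate the two-variable Minsky update; returns a list of pairs
    (a[2k-1], a[2k]) for k = 0 .. steps, starting at (a[-1], a[0]) = (1, 0)."""
    x, y = 1, 0
    pairs = [(x, y)]
    for _ in range(steps):
        x = x - (9 * y) // 17   # odd-index step
        y = y + (15 * x) // 2   # even-index step, using the new x
        pairs.append((x, y))
    return pairs
```

After 35 steps this reproduces Out[115]: the list runs {1, 0}, {1, 7}, ..., ending at {-1, -4}.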
Background: http://www.blurb.com/books/2172660-minskys-trinskys-3rd-edition Unlike Collatz or twin prime searching, this is a very specific question about very specific quantities, rather than a question about the infinitude of integers. --rwg
So far, nobody(?) has figured out how to distribute this calculation. I conjecture finitude for all these monsters that don't blow right up with an obvious, provable pattern. The bigger they get, the proportionally smaller the effect of the Floor steps, and thus the closer they get to pure ellipses. I.e., their growth rate slows down, and the density of points they visit increases with increasing radius. Eventually they will run out of new points, and thus must return to the beginning, by reversibility. (Julian, Neil: Did we ever correlate monstrousness with poor rational approximability of the Floorless period?)

At 07:16 AM 3/20/2015, James Propp wrote:

  I don't know what the relevant units of computation are --- floating-point operations are the ones I've heard about most, but what if you're not using floating point? --- but I'm wondering if anyone has tried to quantify "how much computing" took place in (a) code-breaking at Bletchley Park, (b) the Manhattan Project, and (c) the Apollo space program. It seems likely to me that my laptop, running some silly number-crunching problem in Mathematica and returning an answer after six hours that I could probably have figured out in my head if I'd thought a little bit more before jumping in to write some code, has them all beat. What do you think?

  (And while we're on the subject: What's the canonical example of wasted computational effort? Isn't there some well-known old story of someone who had an early computer calculate the determinant of a singular matrix?)

  Jim Propp

Note that early Mersenne prime hacking was done in floating point. I.e., the big-integer routines used floating point because that hardware was so much more optimized than the fixed-point. What could be more computationally wasteful than those ubiquitous, realistic shooter games? A few minutes on one of those GPUs should dwarf the Manhattan Project. (See Feynman's Surely You're Joking.)
But the *cost* of this waste is vanishing by Moore's Law, which does not apply to all the young jellyware that these games tragically waste. --rwg
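The reversibility claim above can be checked concretely: each Minsky half-step is an integer shear of the plane, and each shear can be undone exactly by subtracting the same Floor term, so the map is a bijection on Z^2 and a bounded orbit must be purely periodic (no pre-periodic tail). A Python sketch (function names mine):

```python
def step(p):
    """One forward step of the Minsky iteration on the pair (x, y)."""
    x, y = p
    x -= (9 * y) // 17   # shear 1: x depends only on the old y
    y += (15 * x) // 2   # shear 2: y depends only on the new x
    return (x, y)

def unstep(p):
    """Exact inverse of step: undo each shear in reverse order.
    Existence of this inverse is why the orbit, if bounded, is a pure cycle."""
    x, y = p
    y -= (15 * x) // 2
    x += (9 * y) // 17
    return (x, y)
```

Round-tripping unstep(step(p)) == p for every integer pair p is what lets the "must return to the beginning" argument go through.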