Re: [math-fun] A very peculiar paper
From: Marc LeBrun <mlb@well.com>
First define an arbitrary well-ordering on the set of all functions F (thus I guess requiring AC). Let's call this "complexity".
Then, for each t, set w(t) = f[t](t), where f[t] is the "simplest" f such that f[t](tau) = v(tau) for all tau strictly < t.
That is, w(t) is the value at t of the "simplest" f that correctly predicts all the values of v strictly prior to t.
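A toy version of this in Python, for concreteness: you can't literally well-order all functions in code (that's where AC comes in), so a short, explicitly ordered list of candidate functions stands in for the complexity ordering. The candidates, and the name predict, are invented just for illustration.

    # Toy version of the construction above: predict v(t) using the "simplest"
    # candidate that reproduces all earlier values of v.  "Complexity" here is
    # just position in a hand-picked, explicitly ordered list.

    CANDIDATES = [            # ordered from "simplest" to more complex
        lambda t: 0,
        lambda t: 1,
        lambda t: t,
        lambda t: t * t,
        lambda t: 2 ** t,
    ]

    def predict(v_history):
        """Return w(t) = f[t](t), where f[t] is the first candidate that agrees
        with v on all tau < t, and t = len(v_history).  None if nothing fits."""
        t = len(v_history)
        for f in CANDIDATES:                     # scan in order of "complexity"
            if all(f(tau) == v_history[tau] for tau in range(t)):
                return f(t)                      # value at t of the simplest fit
        return None

    print(predict([0, 1, 4, 9]))    # t*t is the simplest fit, so w(4) = 16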
This sounds something like Ray Solomonoff's algorithmic complexity and predictors based on it. He looks at universal Turing machine programs, and their lengths in bits. He builds a machine that gives odds for every possible result by taking all the machines that predicted all the previous values, and weighting each by 2^(-length). Something like:

             sum for all f that predicted {y[1] ... y[t-1]} { if f(t) = y then 2^-length(f) else 0 }
    P(y,t) = ----------------------------------------------------------------------------------------
             sum for all f that predicted {y[1] ... y[t-1]} { 2^-length(f) }

The funny thing is you don't feed the past into the machines. Rather, you work with sets of machines that generated the past. Marc's summary sounds the same.

Obviously with Solomonoff's method, the simplest function is strongly weighted, but it can also be outvoted by a large number of runners-up agreeing on a different prediction.

Turing machines work on finite bit strings, so no real numbers either as input (t values) or output. Of course you can adapt his stuff to, e.g., floating point.

For all that, and the intractability, I think Solomonoff's is a nice pinning down of Ockham's razor. He was this far in the... 1950's? 1960's?
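To make the vote concrete, here is a rough sketch in Python. It only looks at a handful of hand-picked "programs" with invented bit lengths, rather than all programs for a universal machine, so it just illustrates the 2^(-length) weighting and the normalization in P(y,t).

    # Rough sketch of the weighted vote.  Each survivor (a candidate that
    # reproduced the whole past) votes for its prediction with weight
    # 2^(-length); the candidates and their "bit lengths" below are made up.

    from collections import defaultdict

    PROGRAMS = [
        # (name, length in "bits" (invented), function of t)
        ("zero",      2, lambda t: 0),
        ("t",         3, lambda t: t),
        ("t^2",       5, lambda t: t * t),
        ("t^2 again", 9, lambda t: t * t),   # longer program, same predictions
        ("2^t",       6, lambda t: 2 ** t),
    ]

    def predictive_odds(history):
        """P(y, t) for each y: the 2^(-length) mass of survivors predicting y,
        normalized over all survivors.  history is [y[0], ..., y[t-1]]."""
        t = len(history)
        survivors = [(length, f) for _, length, f in PROGRAMS
                     if all(f(tau) == history[tau] for tau in range(t))]
        total = sum(2.0 ** -length for length, _ in survivors)
        odds = defaultdict(float)
        for length, f in survivors:
            odds[f(t)] += 2.0 ** -length / total
        return dict(odds)

    # After seeing 0, 1 the survivors are t, t^2, and "t^2 again":
    print(predictive_odds([0, 1]))    # roughly {2: 0.79, 4: 0.21}

The shortest survivor dominates the odds, but a crowd of longer survivors agreeing on another value could outvote it.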
--Steve