[math-fun] Quasirandomness question
Define a bit-guessing algorithm to be any algorithm that, given a finite string of bits as input, predicts what the next bit will be; define the bit-guessing game as an adversarial game in which Player A chooses a bit-guessing algorithm, Player B chooses an infinite sequence of bits, and the score is the asymptotic density (maybe I should say upper density or lower density?) of the set of n’s for which A’s algorithm correctly predicts B’s nth bit.

Question: If Player B is given complete knowledge of the innards of a million bit-guessing algorithms, can B construct an infinite sequence of bits whose score lies between .49 and .51 for *all* of those algorithms? Of course by “a million” I mean “n”, and by .49 and .51 I mean one-half minus epsilon and one-half plus epsilon, with suitable quantifiers.

I want a (nonrandom) construction, not an existence proof.

We’ll allow B to consult an oracle that will answer questions about those million bit-guessing algorithms (e.g., the oracle will reveal the truth-status of the Riemann Hypothesis if that’s germane).

This is probably an old problem, but I’m not sure how to search for it.

Jim Propp
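To make the setup concrete, here is a minimal sketch of two toy bit-guessing algorithms and the finite-prefix version of the score (the guessers `majority_guess` and `alternating_guess` are illustrative inventions, not part of the problem as posed):

```python
# Two toy bit-guessing algorithms (illustrative, not canonical).
def majority_guess(bits):
    """Guess the majority bit of the history so far (1 on ties)."""
    return 1 if 2 * sum(bits) >= len(bits) else 0

def alternating_guess(bits):
    """Guess that the sequence keeps alternating."""
    return 1 - bits[-1] if bits else 0

def score(guesser, sequence):
    """Fraction of positions n where guesser(sequence[:n]) == sequence[n],
    the finite-prefix analogue of the asymptotic density above."""
    hits = sum(guesser(sequence[:n]) == sequence[n]
               for n in range(len(sequence)))
    return hits / len(sequence)

seq = [0, 1] * 500                        # a strictly alternating sequence
print(score(alternating_guess, seq))      # 1.0 -- it nails every bit
print(score(majority_guess, seq))         # 0.0 -- always one step behind
```

Against this one fixed sequence the two guessers score 1 and 0 respectively, which is exactly the kind of extreme behavior B has to avoid simultaneously for every guesser.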
> I want a (nonrandom) construction, not an existence proof.
I'm interpreting "construction" as metamathematical, not limiting the answer to constructive mathematics, and I'm interpreting this as "don't choose a bit string at random", not "don't use algorithmic randomness".

Chaitin's Omega number, like any algorithmically random real, has the property that the shortest program that outputs the first n bits of Omega has length greater than or equal to n - c, where c depends only on the choice of prefix-free universal Turing machine. In the limit of large n, none of the bit predictors can do better than chance.
_______________________________________________
math-fun mailing list
math-fun@mailman.xmission.com
https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun
--
Mike Stay - metaweta@gmail.com
http://math.ucr.edu/~mike
https://reperiendi.wordpress.com
Good point. That strategy works not just against a million adversaries but against all of them at once!

Can anyone come up with something a bit more constructive?

Maybe instead of trying to trounce a million adversaries, we could think about trying to defeat just two.

Jim Propp
Consider the following constructive algorithm for determining the nth bit, given the choices already made for the first n-1 bits:

Compute the prediction accuracy of all million predictors on the first n-1 bits, and see which one has a prediction accuracy furthest from .5 (order the predictors, and break ties by taking the earliest predictor in that order). Choose the nth bit to make the accuracy of this predictor (very slightly) closer to .5.

My intuition says that this greedy algorithm works for all epsilon, but I haven't been able to prove it (except for a single predictor, where the proof is trivial).

In the case of two predictors, it seems like the worst case is that at some point one predictor is at (say) .550001 and the other is at .45, and from then on, any choice that moves the high predictor toward .5 also moves the low predictor away from .5, and vice versa. But that just means that from that point on, both predictors are always making the same predictions. Then the greedy algorithm will alternate making the predictors wrong and right, and both will converge to .5, despite the bad things that happened at the beginning.

I haven't yet been able to make this intuition into a formal proof, but it convinces me, at least for two predictors, and probably for N.

Andy
-- Andy.Latto@pobox.com
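Andy's greedy rule is easy to run; here is a sketch (the two predictors `always_one` and `copy_last` are toy stand-ins of mine, not anything from the thread):

```python
# Greedy rule: find the predictor whose accuracy so far is furthest
# from 1/2 and choose the next bit to nudge that predictor back
# toward 1/2 (ties broken by predictor order, as Andy suggests).
def greedy_sequence(predictors, steps):
    bits, correct = [], [0] * len(predictors)
    for n in range(1, steps + 1):
        guesses = [p(bits) for p in predictors]
        accs = [c / (n - 1) if n > 1 else 0.5 for c in correct]
        worst = max(range(len(predictors)),
                    key=lambda i: abs(accs[i] - 0.5))
        # if the worst predictor is running hot, make it wrong; else right
        b = 1 - guesses[worst] if accs[worst] >= 0.5 else guesses[worst]
        bits.append(b)
        for i, g in enumerate(guesses):
            correct[i] += (g == b)
    return [c / steps for c in correct]

always_one = lambda bits: 1
copy_last = lambda bits: bits[-1] if bits else 0
print(greedy_sequence([always_one, copy_last], 100000))  # both near 0.5
```

For this toy pair both empirical accuracies land within a hair of .5, consistent with the two-predictor intuition; of course a simulation against two fixed predictors proves nothing about the general claim.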
This suggests a game where I am given a sequence of short vectors (which I know ahead of time) and I get to choose whether to add or subtract each successive vector, with the goal of keeping the running total in some bounded region.

Oh, and now I remember where I’ve seen this before! It’s Joel Spencer’s pusher-versus-chooser game:

https://books.google.com/books?id=1fPRQS_zqukC&pg=PA32&lpg=PA32&dq=pusher+ch...

But the theorem given in the book is for the L-2 norm, whereas what I’m asking about is for the L-infinity norm. I’ll ask Joel.

Jim Propp
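The add-or-subtract game can be simulated directly; here is a sketch in which the pusher plays random ±1 vectors (purely illustrative; a real pusher is adversarial) and the chooser greedily minimizes the sup norm of the running total:

```python
import random

random.seed(1)
dim, total, worst_sup = 3, [0, 0, 0], 0
for step in range(10000):
    v = [random.choice((-1, 1)) for _ in range(dim)]   # pusher's vector
    plus = [t + x for t, x in zip(total, v)]
    minus = [t - x for t, x in zip(total, v)]
    # chooser: add or subtract, whichever keeps the sup norm smaller
    total = min(plus, minus, key=lambda w: max(abs(c) for c in w))
    worst_sup = max(worst_sup, max(abs(c) for c in total))
print(worst_sup)  # stays tiny compared to the 10000 steps taken
```

A random pusher is far weaker than Spencer's adversarial one, so this only illustrates the game itself, not the theorem about what the chooser can guarantee.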
Given k predictors and n bits to predict, that seems to imply a linear system of the form:

    p11 xor b1 + p12 xor b2 + ... + p1n xor bn = n/2
    ...
    pk1 xor b1 + pk2 xor b2 + ... + pkn xor bn = n/2

where pki means the i-th prediction of predictor k, each pki xor bi is computed in Z_2, the other operations are done in the rationals, and the bi are the bits sought to defeat all predictors.
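Since pki xor bi is 1 exactly when predictor k is wrong on bit i, each equation asks predictor k to be wrong on exactly n/2 of the n bits. For small n the system can be checked by brute force; in this sketch the two prediction rows are toy examples of mine (note they have to lie at even Hamming distance from each other, or the two equations cannot hold simultaneously, since the two "wrong counts" of any b differ from each other by that distance mod 2):

```python
from itertools import product

n = 6
preds = [
    [0, 0, 0, 0, 0, 0],  # predictor 1: always guesses 0
    [1, 1, 0, 0, 1, 1],  # predictor 2: an arbitrary toy row
]
# p ^ x == 1 iff the predictor is wrong on that bit, so each row's
# equation says "wrong on exactly n/2 bits", i.e. accuracy exactly 1/2.
solutions = [b for b in product((0, 1), repeat=n)
             if all(sum(p ^ x for p, x in zip(row, b)) == n // 2
                    for row in preds)]
print(len(solutions), solutions[0])  # 12 (0, 0, 0, 1, 1, 1)
```

So for this toy pair there are twelve bit strings holding both predictors to exactly three right and three wrong.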
Having produced a finite sequence of bits, to choose the next bit Player B runs all n algorithms and chooses the next bit value that will move the algorithm whose success rate is furthest from 0.5 toward 0.5.

Brent
I don’t see why this will work. If there’s a tie for success-rate-furthest-from-0.5, moving one of them toward .5 might move the other farther away.

Jim Propp
On Fri, Oct 4, 2019 at 1:39 PM James Propp <jamespropp@gmail.com> wrote:
> I don’t see why this will work. If there’s a tie for success-rate-furthest-from-0.5, moving one of them toward .5 might move the other farther away.
Yes, and if the first bit moves A closer and B farther away, then B will be farther from .5, and the next bit will move B closer. Suppose that moves A farther away, which feels like the worst case. Then we're back in the tie where we started. Repeat this for two million more guesses, and each of A and B has a million more right and a million more wrong, and both of their averages are now way closer to .5 than they used to be. On the other hand, if sometimes we make a guess that moves both A and B closer to .5, that seems even better, and like it will make both ratios converge to .5 faster.

This isn't a formal proof, but do you see the intuition here?

In fact, the pusher-chooser argument is exactly the solution to this problem. With N predictors, after k bits, a point in R^N (actually in Z^N) gives the excess of right over wrong guesses by the N predictors. The direction of the pusher will always have all its coordinates +1 or -1, because choosing a 1 next will increase the excess for some predictors and decrease it for others, and choosing a 0 will have the exact opposite effect, so you have a choice of adding either a certain vector of length sqrt(N) or its negative to the vector that's tracking the excesses.

Using the greedy algorithm of choosing the direction that keeps things closest to the origin, as shown in the reference you gave, means that after k steps the distance to the origin is sqrt(kN) or less, so all the coordinates are sqrt(kN) or less. If a given predictor's excess is x after k steps, it has (k + x)/2 right guesses, so its accuracy ratio is (k/2 + x/2)/k = 1/2 + x/(2k). Since x = O(sqrt k) (N being constant), this approaches 1/2 as k approaches infinity. QED.

My suggested algorithm was to choose the direction at each step by minimizing the sup norm, rather than the L2 norm. I think this also works, though the convergence is slower.

Andy
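This argument is easy to check numerically. Here is a sketch of the L2-greedy chooser (the three predictors are toy stand-ins of mine, not from the thread), verifying both the sqrt(kN) bound and that every accuracy ends up near 1/2:

```python
import math

def run(predictors, k):
    """Choose each bit to minimize the L2 norm of the excess vector."""
    bits, excess = [], [0] * len(predictors)
    for _ in range(k):
        v = [2 * p(bits) - 1 for p in predictors]  # effect of choosing 1
        dot = sum(e * x for e, x in zip(excess, v))
        b = 1 if dot <= 0 else 0                   # greedy sign choice
        s = 1 if b == 1 else -1
        excess = [e + s * x for e, x in zip(excess, v)]
        bits.append(b)
    return excess

preds = [lambda bs: 1,
         lambda bs: bs[-1] if bs else 0,
         lambda bs: len(bs) % 2]
k = 100000
excess = run(preds, k)
l2 = math.sqrt(sum(e * e for e in excess))
print(l2 <= math.sqrt(k * len(preds)))          # True: the greedy bound
print([0.5 + e / (2 * k) for e in excess])      # every accuracy near 1/2
```

The bound holds because |x ± v|^2 = |x|^2 ± 2 x·v + N, and the greedy sign makes the cross term nonpositive, so the squared norm grows by at most N per step.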
Instead choose a score S = (X - Y)/N, where X + Y = N, after N bits with X matching. This is just a linear scaling, but moving the target to zero seems to make analysis easier. For example, it allows us to write

    S = (N/M)*(X - Y)/N + (Z1 + Z2 + Z3 + ...)/M,

with additional choices Zm = +/-1 for each bit after N, up to M. The first term always goes to zero in the large M limit, so in some sense, initial values never really matter.

— Brad
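A quick numeric check of that decomposition (the counts N, M, X and the later choices Zm are made-up values, just to exercise the identity):

```python
import random

random.seed(0)
N, M = 100, 1_000_000
X = 73                                   # say, 73 right among the first N
Z = [random.choice((-1, 1)) for _ in range(M - N)]

S_M = ((X - (N - X)) + sum(Z)) / M                       # score after M bits
decomposed = (N / M) * ((X - (N - X)) / N) + sum(Z) / M  # Brad's two terms
print(abs(S_M - decomposed))             # agree up to float rounding
print((N / M) * ((X - (N - X)) / N))     # early-history term: already tiny
```

With N fixed, the early-history term is bounded by N/M, so however badly the first N bits went, it washes out at rate 1/M.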
participants (6)

- Andres Valloud
- Andy Latto
- bradklee@gmail.com
- Brent Meeker
- James Propp
- Mike Stay