[math-fun] Khinchin & Geometric Means
Under 'Applications' here < http://reference.wolfram.com/mathematica/ref/Khinchin.html >, the Mathematica documentation states: Geometric mean of the first 1000 continued fraction terms in Pi…

In[1]:= N[Apply[Times,ContinuedFraction[Pi,1000]]^(1/1000)] Out[1]:= 2.66563

In[2]:= N[Khinchin] Out[2]:= 2.68545

Sadly, the example ignores the fact that the first term of a continued fraction is very different from all the rest, a matter that would have been more apparent if one had asked for the geometric mean of the first 1000 continued fraction terms of Pi/4 (whose first term is 0, so the geometric mean collapses to 0).

I have just determined that the geometric mean of 1498931686 terms (cherry-picked) of the *fractional* part of the continued fraction of pi is within 1.002405*10^-13 of Khinchin's constant.
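A minimal Mathematica sketch of that point, for anyone who wants to try it: drop the integer part a_0 before averaging (the 1000-term cutoff is just the one used in the documentation example).

  terms = Rest[ContinuedFraction[Pi, 1001]];  (* 1000 partial quotients of Pi, with the integer part 3 dropped *)
  N[GeometricMean[terms]]                     (* geometric mean of a_1 .. a_1000 *)
  N[Khinchin]                                 (* compare with Khinchin's constant *)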
It's amazing that Khinchin found that for almost all positive reals x, the GM of the terms of its CF expansion tends to a limit independent of x. Is there a simple reason why this should be true?

--Dan

On 2012-12-24, at 5:23 AM, Hans Havermann wrote:
Under 'Applications' here < http://reference.wolfram.com/mathematica/ref/Khinchin.html >, the Mathematica documentation states: Geometric mean of the first 1000 continued fraction terms in Pi…
In[1]:= N[Apply[Times,ContinuedFraction[Pi,1000]]^(1/1000)] Out[1]:= 2.66563
In[2]:= N[Khinchin] Out[2]:= 2.68545
Sadly, the example ignores the fact that the first term of a continued fraction is very different from all the rest, a matter that would have been more apparent if one had asked for the geometric mean of the first 1000 continued fraction terms of Pi/4.
I have just determined that the geometric mean of 1498931686 terms (cherry-picked) of the *fractional* part of the continued fraction of pi is within 1.002405*10^-13 of Khinchin's constant.
Dan Asimov:
It's amazing that Khinchin found that for almost all positive reals x, the GM of the terms of its CF expansion tends to a limit independent of x.
Is there a simple reason why this should be true?
It's not simple to me but here is Khinchin's translated 1935 argument: http://chesswanks.com/txt/Khinchin.pdf
Thanks, Hans. In fact I saw an argument in Wikipedia (< http://en.wikipedia.org/wiki/Khinchin's_constant >), which is claimed to be much simpler than Khinchin's original reasoning.

((( But it omits explaining the "hard part" -- which is proving that, when restricted to the irrational numbers J = (0,1) - Q, the shift transformation T: J -> J defined on CF expansions by

    T([0; a_1, a_2, ...]) = [0; a_2, a_3, ...]

has no invariant set of intermediate measure* (i.e., of measure neither 0 nor 1; in other words, that T is ergodic). Of course, T(x) = frac(1/x). Gauss discovered that this map has an invariant measure given by

    mu(A) = (1/ln 2) * Integral_{A} 1/(1+x) dx,

which is kind of fun to sit down and prove for oneself. )))

--Dan
___________________________________________________________________
*Whether this invariant set's measure is taken to be Lebesgue measure or Gauss's mu makes no difference to its being "of intermediate measure".

On 2012-12-24, at 10:59 AM, Hans Havermann wrote:
Dan Asimov:
It's amazing that Khinchin found that for almost all positive reals x, the GM of the terms of its CF expansion tends to a limit independent of x.
Is there a simple reason why this should be true?
It's not simple to me but here is Khinchin's translated 1935 argument:
http://chesswanks.com/txt/Khinchin.pdf
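A quick numerical check of the invariance of Gauss's measure mentioned above (a sketch only; the test interval [1/3, 1/2] and the cutoff k <= 10000 are arbitrary choices). Since T(x) = frac(1/x), the preimage of [a, b] under T is the disjoint union over k >= 1 of the intervals [1/(k+b), 1/(k+a)], so invariance means the total mu-measure of those intervals equals mu([a, b]):

  mu[a_, b_] := (Log[1 + b] - Log[1 + a])/Log[2]                                (* Gauss's measure of the interval [a, b] *)
  preimageMu[a_, b_, kmax_] := Sum[N[mu[1/(k + b), 1/(k + a)]], {k, 1, kmax}]   (* measure of the truncated preimage *)
  {N[mu[1/3, 1/2]], preimageMu[1/3, 1/2, 10000]}                                (* should agree to roughly four decimal places; the omitted tail is O(1/kmax) *)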
I have just determined that the geometric mean of 1498931686 terms (cherry-picked) of the *fractional* part of the continued fraction of pi is within 1.002405*10^-13 of Khinchin's constant.
And just to caution folk further about the nature of cherry-picking the number of terms: The geometric mean of 976 terms is closer to Khinchin's constant than the geometric mean of 2377934394 terms! http://gladhoboexpress.blogspot.ca/2012/12/pi-continued-fraction-khinchin-re...
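For anyone who wants to watch that slow, irregular convergence on a small scale, here is a rough Mathematica sketch (10^4 terms is just a laptop-friendly cutoff; the figures above use vastly more terms):

  terms = Rest[ContinuedFraction[Pi, 10001]];               (* 10^4 partial quotients, integer part dropped *)
  runningGM = Exp[Accumulate[Log[N[terms]]]/Range[10000]];  (* geometric mean of the first n terms, n = 1 .. 10^4 *)
  {runningGM[[{100, 1000, 10000}]], N[Khinchin]}            (* the running mean wanders around Khinchin's constant *)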
William Feller's classic book on probability analyses the behaviour of waves --- your "regimes" --- in the sum of a sequence of coin tosses; I think the general heading is something like the "Petersburg paradox". Presumably the sequence of CF means behaves in a similar fashion. Is the corresponding higher-order behaviour known? And how well does your data fit the known models?

WFL

On 12/29/12, Hans Havermann <gladhobo@teksavvy.com> wrote:
I have just determined that the geometric mean of 1498931686 terms (cherry-picked) of the *fractional* part of the continued fraction of pi is within 1.002405*10^-13 of Khinchin's constant.
And just to caution folk further about the nature of cherry-picking the number of terms: The geometric mean of 976 terms is closer to Khinchin's constant than the geometric mean of 2377934394 terms!
http://gladhoboexpress.blogspot.ca/2012/12/pi-continued-fraction-khinchin-re...
Is the Petersurg paradox that thing where if two people play coin toss (one player gets +1 for H, -1 for T, and vice versa for the other player), then as the number N of tosses -> oo, one might think that the most likely average score -- in terms of a probability density -- is 0, but in fact it's +-1 (i.e., the "arcsin law") ???

--Dan

On 2012-12-29, at 5:21 AM, Fred lunnon wrote:
William Feller's classic book on probability analyses the behaviour of waves --- your "regimes" --- in the sum of a sequence of coin tosses; I think the general heading is something like the "Petersburg paradox".
Presumably the sequence of CF means behaves in a similar fashion. Is the corresponding higher-order behaviour known? And how well does your data fit the known models?
That's the one. Elsewhere called "gambler's ruin", perhaps ...

WFL

On 12/29/12, Dan Asimov <dasimov@earthlink.net> wrote:
Is the Petersurg paradox that thing where if two people play coin toss (one player gets +1 for H, -1 for T, and vice versa for the other player), then as the number N of tosses -> oo, one might think that the most likely average score -- in terms of a probability density -- is 0, but in fact it's +-1 (i.e., the arcsin law") ???
--Dan
On 2012-12-29, at 5:21 AM, Fred lunnon wrote:
William Feller's classic book on probability analyses the behaviour of waves --- your "regimes" --- in the sum of a sequence of coin tosses; I think the general heading is something like the "Petersburg paradox".
Presumably the sequence of CF means behaves in a similar fashion. Is the corresponding higher-order behaviour known? And how well does your data fit the known models?
On Sat, Dec 29, 2012 at 2:03 PM, Dan Asimov <dasimov@earthlink.net> wrote:
Is the Petersurg paradox that thing where if two people play coin toss (one player gets +1 for H, -1 for T, and vice versa for the other player), then as the number N of tosses -> oo, one might think that the most likely average score -- in terms of a probability density -- is 0, but in fact it's +-1 (i.e., the arcsin law") ???
The St. Petersburg paradox is not what you describe. The St. Petersburg paradox is when two people play a game where they flip coins until a head is flipped, and if N tails are flipped first, A pays B 2^N dollars, and the question is what B should pay A initially to make this a fair game.

I've never heard of a Petersurg paradox, and neither has any web site indexed by Google, as far as I can tell.

Andy
--Dan
On 2012-12-29, at 5:21 AM, Fred lunnon wrote:
William Feller's classic book on probability analyses the behaviour of waves --- your "regimes" --- in the sum of a sequence of coin tosses; I think the general heading is something like the "Petersburg paradox".
Presumably the sequence of CF means behaves in a similar fashion. Is the corresponding higher-order behaviour known? And how well does your data fit the known models?
-- Andy.Latto@pobox.com
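A small simulation sketch of the game Andy describes, just to make the divergent expectation concrete (Mathematica's GeometricDistribution[1/2] counts the tails before the first head; the sample sizes are arbitrary):

  payouts[n_] := 2^RandomVariate[GeometricDistribution[1/2], n]  (* n games: N tails before the first head pays 2^N dollars *)
  N[Mean[payouts[#]]] & /@ {10^3, 10^5, 10^7}                     (* sample means keep creeping upward, since E[2^N] = 1/2 + 1/2 + ... diverges *)

No finite entry fee gives the game an expected net of zero, which is the paradox.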
(Sorry about my careless typing.)

OK, is the St. Petersburg paradox essentially that:

If in a coin-matching game, I use the strategy of playing until my first win, where my bet on any given play is 2^N dollars, where N is the largest integer such that I've just lost the last N matches, (with my first bet being 2^0 = 1 dollar), so that upon each win I'm guaranteed a net profit of $1 . . .

. . . then how come this doesn't guarantee me a net profit of at least $1, with probability 1, even though coin-matching would seem to be symmetrical, i.e., favoring neither player?

--Dan

On 2012-12-29, at 10:27 PM, Andy Latto wrote:
On Sat, Dec 29, 2012 at 2:03 PM, Dan Asimov <dasimov@earthlink.net> wrote:
Is the Petersurg paradox that thing where if two people play coin toss (one player gets +1 for H, -1 for T, and vice versa for the other player), then as the number N of tosses -> oo, one might think that the most likely average score -- in terms of a probability density -- is 0, but in fact it's +-1 (i.e., the arcsin law") ???
The St. Petersburg paradox is not what you describe. The St. Petersburg paradox is when two people play a game where they flip coins until a head is flipped, and if N tails are flipped first, A pays B 2^N dollars, and the question is what B should pay A initially to make this a fair game.
I've never heard of a Petersurg paradox, and neither has any web site indexed by Google, as far as I can tell.
Andy
--Dan
On 2012-12-29, at 5:21 AM, Fred lunnon wrote:
William Feller's classic book on probability analyses the behaviour of waves --- your "regimes" --- in the sum of a sequence of coin tosses; I think the general heading is something like the "Petersburg paradox".
Presumably the sequence of CF means behaves in a similar fashion. Is the corresponding higher-order behaviour known? And how well does your data fit the known models?
-- Andy.Latto@pobox.com
On 12/29/2012 11:41 PM, Dan Asimov wrote:
(Sorry about my careless typing.)
OK, is the St. Petersburg paradox essentially that:
If in a coin-matching game, I use the strategy of playing until my first win, where my bet on any given play is 2^N dollars, where N is the largest integer such that I've just lost the last N matches, (with my first bet being 2^0 = 1 dollar), so that upon each win I'm guaranteed a net profit of $1 . . .
. . . then how come this doesn't guarantee me a net profit of at least $1, with probability 1, even though coin-matching would seem to be symmetrical, i.e., favoring neither player?
Because you only have a finite number of dollars to start with, and to guarantee your $1 profit the expected amount you need to bet is infinite.

Brent Meeker
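A one-line check of Brent's point, under the simplifying assumption that the bankroll is 2^k - 1 dollars (exactly enough for k doubled bets): you net +$1 unless you lose k matches in a row, and the expectation is zero for every k.

  net[k_] := (1 - 2^-k)*1 + 2^-k*(-(2^k - 1))  (* win $1 with probability 1 - 2^-k, lose the whole bankroll with probability 2^-k *)
  Simplify[net[k]]                             (* evaluates to 0: the doubling strategy gains nothing in expectation *)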
Right (as I am well aware). My question is whether that is the St. Petersburg paradox.

--Dan

On 2012-12-30, at 12:01 AM, meekerdb wrote:
On 12/29/2012 11:41 PM, Dan Asimov wrote:
(Sorry about my careless typing.)
OK, is the St. Petersburg paradox essentially that:
If in a coin-matching game, I use the strategy of playing until my first win, where my bet on any given play is 2^N dollars, where N is the largest integer such that I've just lost the last N matches, (with my first bet being 2^0 = 1 dollar), so that upon each win I'm guaranteed a net profit of $1 . . .
. . . then how come this doesn't guarantee me a net profit of at least $1, with probability 1, even though coin-matching would seem to be symmetrical, i.e., favoring neither player?
Because you only have a finite number of dollars to start with and to guarantee your $1 profit the expected amount you need to bet is infinite.