Re: [math-fun] Halting probability for Conway's Life
I figured out that the measure I wanted is this one: given a set of patterns
whose average area I'm told is <A>, the distribution on patterns that
maximizes the entropy is the Gibbs distribution

  p(x) = exp(-s A(x)) / Z(s),   where Z(s) = sum_x exp(-s A(x))

and s is chosen so that sum_x A(x) p(x) = <A>. The partition function Z
gives Chaitin's halting probability when the set of patterns x consists of
those that evolve into a cyclic state and s = ln 2.

It's clear that not all 2^A patterns of area A can be distinguished by
bombarding them with gliders, etc. On the other hand, sliding block memory
(http://www.radicaleye.com/lifepage/patterns/sbm/sbm.html) means that the
area required to encode n bits is proportional to n. Are there better known
bounds on the information encodable into a particular region?

The fact that the area is proportional to n means that the halting
probability of Life is at least partially random, i.e. given any universal
prefix-free Turing machine U, there exist k ≥ 1 and c ≥ 0 such that the
shortest program for U that computes the first n bits of the halting
probability of Life is at least n/k - c bits long.

--
Mike Stay - metaweta@gmail.com
http://www.cs.auckland.ac.nz/~mike
http://reperiendi.wordpress.com
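[Editorial aside: a small numerical sketch of the construction above. The list of areas A(x) and the target <A> are made-up toy values standing in for the actual set of eventually-cyclic Life patterns; the point is only that, since the Gibbs mean is strictly decreasing in s, the right s can be found by bisection.]

```python
import math

# Hypothetical toy data: A(x) for a handful of patterns x, and a given <A>.
# A real computation would range over Life patterns that reach a cyclic state.
areas = [1, 2, 2, 3, 5, 8]
target_mean = 3.0

def gibbs(s):
    """Return (Z(s), mean area) for p(x) = exp(-s*A(x)) / Z(s)."""
    weights = [math.exp(-s * a) for a in areas]
    Z = sum(weights)
    mean = sum(a * w for a, w in zip(areas, weights)) / Z
    return Z, mean

# The Gibbs mean is strictly decreasing in s, so bisect for the s
# satisfying sum_x A(x) p(x) = <A>.
lo, hi = -10.0, 10.0
for _ in range(200):
    mid = (lo + hi) / 2
    if gibbs(mid)[1] > target_mean:
        lo = mid   # mean too large: increase s
    else:
        hi = mid
s = (lo + hi) / 2
Z, mean = gibbs(s)
print(s, mean)
```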
> It's clear that not all 2^A patterns of area A can be distinguished by
> bombarding them with gliders, etc. On the other hand, sliding block memory
> (http://www.radicaleye.com/lifepage/patterns/sbm/sbm.html) means that the
> area required to encode n bits is proportional to n.
Not really: sliding block memory uses an exponential amount of space to store an integer. Nevertheless, there are indeed mechanisms for storing information in linear space.
> Are there better known bounds on the information encodable into a
> particular region?
I've created a pattern which expands at a rate of O(sqrt(log(t))) in
diameter -- the slowest possible growth rate -- by binary counting in one
octant of the plane. This pattern stores each bit in a 16fd by 16fd box, so
it has an information density of one bit per 512 cells.

This was a fully interactive memory, which I believe is more than you want.
If we're just interested in read-once destructive memory, then we can easily
manage 0.0625 bits per cell in the following way:

x = 64, y = 41, rule = B3/S23
2o2b2o2b2o30b2o2b2o2b2o$2o2b2o2b2o30b2o2b2o2b2o$20bo39bo$18b2o38b2o$2o
2b2o2b2o9b2o19b2o2b2o2b2o9b2o$2o2b2o2b2o30b2o2b2o2b2o3$2o2b2o34b2o2b2o
2b2o$2o2b2o34b2o2b2o2b2o3$2o2b2o16bo17b2o2b2o16bo$2o2b2o15b2o17b2o2b2o
15b2o$21bobo37bobo2$2o2b2o34b2o2b2o$2o2b2o34b2o2b2o19$15b3o37b3o$15bo
2bo36bo2bo$15bo39bo$15bo39bo$16bobo37bobo!

Also, by considering the smallest Garden of Eden pattern, we can create an
upper bound on the information density:

0.9999999999999999999999999999942 bits per cell.

(This is log(2^100 - 2^9)/log(2^100), as at least 2^9 of the patterns
contained within a 10 by 10 bounding box are GoEs.)
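[Editorial aside: the quoted figure can be checked numerically. A direct float evaluation of log(2^100 - 2^9)/log(2^100) rounds to exactly 1.0, so the sketch below computes the deficit below 1 instead, taking the "at least 2^9 GoEs in a 10 by 10 box" premise as given.]

```python
import math

# The bound above is log2(2^100 - 2^9) / 100 bits per cell. In floats,
# 2^100 - 2^9 rounds to 2^100, so evaluate the deficit via log1p:
#   1 - density = -log2(1 - 2^-91) / 100
deficit = -math.log1p(-2.0 ** -91) / math.log(2) / 100
print(deficit)  # about 5.8e-30, matching 0.9999999999999999999999999999942
```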
> The fact that the area is proportional to n means that the halting
> probability of Life is at least partially random, i.e. given any universal
> prefix-free Turing machine U, there exist k ≥ 1 and c ≥ 0 such that the
> shortest program for U that computes the first n bits of the halting
> probability of Life is at least n/k - c bits long.
Indeed.

Sincerely,

Adam P. Goucher
On Sun, Jul 8, 2012 at 5:16 AM, Adam P. Goucher <apgoucher@gmx.com> wrote:
>> It's clear that not all 2^A patterns of area A can be distinguished by
>> bombarding them with gliders, etc. On the other hand, sliding block
>> memory (http://www.radicaleye.com/lifepage/patterns/sbm/sbm.html) means
>> that the area required to encode n bits is proportional to n.
>
> Not really: sliding block memory uses an exponential amount of space to
> store an integer. Nevertheless, there are indeed mechanisms for storing
> information in linear space.
If you only use the first k positions of the block, then it fits in a finite area, so storing n digits in base k takes area proportional to n.
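[Editorial aside: a back-of-the-envelope comparison of the two encodings. The constants below are illustrative only, not an actual Life mechanism: one sliding block at distance N needs area ~2^n for an n-bit value, while blocks confined to their first k positions need area linear in n.]

```python
import math

def sliding_block_area(n_bits, cells_per_position=1):
    """A single block at distance N encodes the integer N directly,
    so an n-bit value needs a bounding region of size ~2^n."""
    return (2 ** n_bits) * cells_per_position

def bounded_block_area(n_bits, k=4, cells_per_block=8):
    """ceil(n / log2(k)) blocks, each confined to its first k positions:
    area grows linearly in n."""
    digits = math.ceil(n_bits / math.log2(k))
    return digits * k * cells_per_block

for n in (8, 16, 32):
    print(n, sliding_block_area(n), bounded_block_area(n))
```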
>> Are there better known bounds on the information encodable into a
>> particular region?
> I've created a pattern which expands at a rate of O(sqrt(log(t))) in
> diameter -- the slowest possible growth rate -- by binary counting in one
> octant of the plane. This pattern stores each bit in a 16fd by 16fd box,
> so it has an information density of one bit per 512 cells.
>
> This was a fully interactive memory, which I believe is more than you
> want. If we're just interested in read-once destructive memory, then we
> can easily manage 0.0625 bits per cell in the following way:
>
> x = 64, y = 41, rule = B3/S23
> 2o2b2o2b2o30b2o2b2o2b2o$2o2b2o2b2o30b2o2b2o2b2o$20bo39bo$18b2o38b2o$2o
> 2b2o2b2o9b2o19b2o2b2o2b2o9b2o$2o2b2o2b2o30b2o2b2o2b2o3$2o2b2o34b2o2b2o
> 2b2o$2o2b2o34b2o2b2o2b2o3$2o2b2o16bo17b2o2b2o16bo$2o2b2o15b2o17b2o2b2o
> 15b2o$21bobo37bobo2$2o2b2o34b2o2b2o$2o2b2o34b2o2b2o19$15b3o37b3o$15bo
> 2bo36bo2bo$15bo39bo$15bo39bo$16bobo37bobo!
>
> Also, by considering the smallest Garden of Eden pattern, we can create
> an upper bound on the information density:
>
> 0.9999999999999999999999999999942 bits per cell.
>
> (This is log(2^100 - 2^9)/log(2^100), as at least 2^9 of the patterns
> contained within a 10 by 10 bounding box are GoEs.)
Wonderful, thank you!

--
Mike Stay - metaweta@gmail.com
http://www.cs.auckland.ac.nz/~mike
http://reperiendi.wordpress.com