I figured out that the measure I wanted is this one: given a set of patterns where I'm told that the average area is <A>, the distribution on patterns that maximizes the entropy is the Gibbs distribution

  p(x) = exp(-s A(x)) / Z(s),

where Z(s) = sum_x exp(-s A(x)) and s is chosen so that sum_x A(x) p(x) = <A>.

The partition function Z gives Chaitin's halting probability when the set of patterns x consists of those that evolve into a cyclic state and s = ln 2.

It's clear that not all 2^A patterns of area A can be distinguished by bombarding them with gliders, etc. On the other hand, sliding block memory (http://www.radicaleye.com/lifepage/patterns/sbm/sbm.html) shows that the area required to encode n bits is proportional to n. Are there better known bounds on the information encodable into a particular region?

The fact that the area is proportional to n means that the halting probability of Life is at least partially random: given any universal prefix-free Turing machine U, there exist k ≥ 1 and c ≥ 0 such that the shortest program for U that computes the first n bits of the halting probability of Life is at least n/k - c bits long.

--
Mike Stay - metaweta@gmail.com
http://www.cs.auckland.ac.nz/~mike
http://reperiendi.wordpress.com
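P.S. Here is a minimal numeric sketch of the construction, on a made-up finite set of pattern areas (the real pattern set and areas are of course not these). Since d<A>/ds = -Var(A) < 0, the Gibbs mean area is strictly decreasing in s, so the multiplier s that pins the mean to a target can be found by bisection; at s = ln 2 the weights become 2^(-A(x)), so Z(ln 2) is the Omega-style sum.

```python
import math

def gibbs(areas, s):
    """Gibbs distribution p(x) = exp(-s*A(x)) / Z(s) over a finite pattern set."""
    weights = [math.exp(-s * a) for a in areas]
    Z = sum(weights)
    return [w / Z for w in weights], Z

def mean_area(areas, s):
    """Expected area sum_x A(x) p(x) under the Gibbs distribution at s."""
    p, _ = gibbs(areas, s)
    return sum(a * pa for a, pa in zip(areas, p))

def solve_s(areas, target, lo=-20.0, hi=20.0, tol=1e-10):
    """Bisect for s with mean_area(areas, s) = target.
    mean_area is strictly decreasing in s (its derivative is -Var(A)),
    so a sign change is guaranteed if target lies strictly between
    min(areas) and max(areas)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_area(areas, mid) > target:
            lo = mid   # mean too large -> need larger s to damp big areas
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Toy pattern set: one pattern of each area 1..10, constrain <A> = 3.
areas = list(range(1, 11))
s = solve_s(areas, 3.0)
p, Z = gibbs(areas, s)

# At s = ln 2 each weight is exp(-A(x) ln 2) = 2^(-A(x)),
# so Z(ln 2) is the Chaitin-style sum over this toy set.
_, Z_ln2 = gibbs(areas, math.log(2))
```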