I think we are all agreed that most of the visible aliasing in the 50,000-dot image is due to pixel quantization. But Dan Asimov argues that even if we could get rid of pixel quantization, we would still see some moiré-like effects. If I understand his argument, he is saying that in each area of the phi-based sunflower, the array of dots approximates the vertices of a lattice of parallelograms; the axis vectors of this lattice distort slowly as you move to nearby areas, and occasionally snap to a different set of axes. He anticipates "phase transitions" between domains governed by different axis vectors, and expects that these transitions will appear as visible discontinuities.

I agree (again, hedging that I might not be following Dan's thoughts correctly) that different regions have different natural coordinate systems, but I disagree that the transitions will be abrupt or visible. Instead, I expect them to shade into each other imperceptibly; along the borders between these domains there will be regions that appear ambiguous, where one will be able to choose semiconsciously (as in the Necker illusion) which lattice one sees.

On Thu, Mar 22, 2012 at 12:00 PM, Tom Rokicki <rokicki@gmail.com> wrote:
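The lattice domains described above can be made concrete with a short sketch (mine, not from the thread; `sunflower` and `neighbor_steps` are illustrative names). In Vogel's model, dot k sits at radius sqrt(k) and angle k times the golden angle; the index offsets from a dot to its nearest neighbors are then Fibonacci numbers, and which Fibonacci pair dominates drifts slowly with radius, giving the slowly distorting lattice axes and, at the crossover radii, the competing lattices under discussion.

```python
import math

PHI = (1 + 5 ** 0.5) / 2
GOLDEN_ANGLE = 2 * math.pi * (1 - 1 / PHI)  # about 2.39996 radians

def sunflower(n):
    """Vogel's model: dot k at radius sqrt(k), angle k * golden angle."""
    return [(math.sqrt(k) * math.cos(k * GOLDEN_ANGLE),
             math.sqrt(k) * math.sin(k * GOLDEN_ANGLE))
            for k in range(1, n + 1)]

def neighbor_steps(pts, k, window):
    """Absolute index offsets of the four nearest neighbors of dot k,
    searching only nearby indices.  The offsets come out as Fibonacci
    numbers; which ones win changes slowly with radius."""
    x, y = pts[k]
    cand = [(math.hypot(pts[j][0] - x, pts[j][1] - y), j - k)
            for j in range(max(0, k - window), min(len(pts), k + window + 1))
            if j != k]
    cand.sort()
    return sorted(abs(step) for _, step in cand[:4])

pts = sunflower(50000)
near = neighbor_steps(pts, 100, 60)      # near the center: small Fibonacci offsets
far = neighbor_steps(pts, 40000, 1000)   # far out: much larger Fibonacci offsets
print(near, far)
```

Plotting dots k and k +/- F for the winning offsets F makes the local parallelogram lattice, and its gradual rotation with radius, directly visible.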
Just a very minor correction: frequencies just above half the sampling frequency get mirrored into *high* frequencies just under half the sampling frequency. Frequencies close to the sampling frequency get aliased into very low frequencies.
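This correction is easy to check numerically: sampling folds every frequency into the band [0, fs/2], so the apparent frequency is the distance from the true frequency to the nearest multiple of fs. A minimal sketch (`aliased_frequency` is an illustrative name, not from the thread):

```python
def aliased_frequency(f, fs):
    """Apparent frequency of a sinusoid of frequency f sampled at rate fs:
    fold f into the baseband [0, fs/2]."""
    f = f % fs
    return min(f, fs - f)

fs = 1000.0
# Just above fs/2 mirrors to just UNDER fs/2 -- still a high frequency:
print(aliased_frequency(510.0, fs))  # 490.0
# A frequency close to fs itself aliases to a very low frequency:
print(aliased_frequency(990.0, fs))  # 10.0
```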
On Thu, Mar 22, 2012 at 7:27 AM, Henry Baker <hbaker1@pipeline.com> wrote:
In particular, frequencies just above 1/2 the sampling frequency become very low frequencies on reconstruction.
_______________________________________________
math-fun mailing list
math-fun@mailman.xmission.com
http://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun