[math-fun] more less lumpy fun functions
This is the tower-of-exponentials I'm playing with now:

    fe(x) = 2 sinh( fe( x - tanh( x/2 ) ) )

As x --> +oo, fe(x) --> e^fe( x-1 ); as x --> -oo, fe(x) --> -e^-fe( x+1 ) (or just -fe(-x)); and as x --> 0, fe(x) --> 2 fe( x/2 ).

To calculate despite the circular definition, I assume a straight line with some slope when abs(x) < 2^-27, where sinh and tanh are linear enough that the location of the seam is invisible to floating point. Tanh is also arbitrary but "in the family" at least.

I wondered whether the function is lumpy, "unstable", or particularly tied to the base e. I decided to compare it to the base-two version of the same function:

    sinh2(x) = ( 2^x - 2^-x ) / 2 = sinh( x ln 2 )
    tanh2(x) = ( 2^x - 2^-x ) / ( 2^x + 2^-x ) = tanh( x ln 2 )

    f2(x) = (2/(2-ln(2))) sinh2( f2( x - tanh2( x/2 ) ) )

The functions are impossible to visualize with linear or log scales, but their inverse functions make nice scales to graph each other by! So I plot fe^-1(y) vs. f2^-1(y). The theory is that, once I scale the inverse functions so that they match at y = MAX_FLOAT = 2^1024, fe() and f2() have two different recursion periods; so if they are glitching or specialized to their bases, their glitches shouldn't coincide, and they will rat on each other.

The short answer is that, over the range of floating point, the two functions look pretty smooth to each other. I play with the initial slopes and some different visualizations here:

    http://www.tiac.net/~sw/2010/03/Superbola/superbolavi.pdf

& here's the code:

    http://www.tiac.net/~sw/2010/03/Superbola/superbolavi.py

Unlike sinh and sinh2, these two functions aren't just scaled versions of each other. They wobble relative to each other through about one cycle over the range I was able to plot. I suspect that's the beat frequency between the two scaled recursion spacings. To see whether the wobbling increases or decreases over a longer period, I'd need a way to calculate with and convert between towers-of-powers notations -- is that something that's been worked out before? Is it even possible?
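Here's a minimal Python sketch of the recursions, in case the definitions above are easier to read as code. This is not superbolavi.py; the unit slope, the [0, 8] bisection bracket, and the helper names (clipped, inverse) are just convenient stand-ins, and bisection is only one way to get at fe^-1 and f2^-1:

    import math
    import sys

    LN2 = math.log(2.0)
    EPS = 2.0 ** -27    # below this, treat the function as a straight line
    SLOPE = 1.0         # slope of that line -- arbitrary, as noted above

    def fe(x):
        # fe(x) = 2 sinh( fe( x - tanh(x/2) ) ), with the linear seam near 0
        if abs(x) < EPS:
            return SLOPE * x
        return 2.0 * math.sinh(fe(x - math.tanh(x / 2.0)))

    def sinh2(x):
        # ( 2^x - 2^-x ) / 2
        return math.sinh(x * LN2)

    def tanh2(x):
        # ( 2^x - 2^-x ) / ( 2^x + 2^-x )
        return math.tanh(x * LN2)

    def f2(x):
        # base-two analogue, same seam, scale factor as quoted above
        if abs(x) < EPS:
            return SLOPE * x
        return (2.0 / (2.0 - LN2)) * sinh2(f2(x - tanh2(x / 2.0)))

    def clipped(f, x):
        # both functions shoot past the double range almost immediately
        try:
            return f(x)
        except OverflowError:
            return math.inf

    def inverse(f, y, lo=0.0, hi=8.0):
        # crude f^-1(y) by bisection on [lo, hi], assuming f increases there
        for _ in range(100):
            mid = (lo + hi) / 2.0
            if clipped(f, mid) < y:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0

    # the two inverse scales, matched at y = MAX_FLOAT as in the plots:
    print(inverse(fe, sys.float_info.max), inverse(f2, sys.float_info.max))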
From: Fred lunnon <fred.lunnon@gmail.com>
I didn't want to get too involved in detail at this stage, since I'm not sure how relevant it might be. But the essence is in the rider, that "it's desirable (in some sense) to minimise the growth of the derivatives with n".
and
some of the features here remind me of my investigation in 2004 into blending pairs of functions for graphics purposes.
I'm curious about your technique, but fe() and f2() are sort of self-blending. I don't see any rationalization for them, but they get their smoothness nearly naturally.

--Steve

p.s. "blended" in coffee shops and restaurants means "processed in a blender," which irks me.