Re: [math-fun] Fourier transforms
The sinusoids are orthogonal to one another, which allows the construction of excellent filters for detecting a strong signal. The real exponentials aren't orthogonal, at least under the traditional definition of inner product. Perhaps Laplace's cleverness was in coming up with a different inner product? At 04:49 AM 4/28/2007, Daniel Asimov wrote:
Henry writes:
<< Fourier analysis can easily separate out the various sinusoidal frequencies in a waveform.
Is there an analogous analysis that can separate out various real exponentials in a real waveform?
I.e., if a signal is the sum of various real exponentials (i.e., no sinusoidal components), is there a simple analysis that will pull out the coefficients & exponents? Is there a "fast" version analogous to the FFT for this procedure?
I recall studying Laplace transforms, but can't recall whether they solve this particular problem.
It would indeed seem that the Laplace transform is what you're looking for; see < http://en.wikipedia.org/wiki/Laplace_transform >. Since the (bilateral) Laplace transform is equivalent to the Fourier transform under a change of variables (real <-> imaginary), there by rights should be a corresponding fast [or finite] Laplace transform. (Cf. V. Rokhlin, "A fast algorithm for the discrete Laplace transform", J. Complexity, v. 4, 1988.)
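The real<->imaginary change of variables can be illustrated numerically. In the sketch below (the example function, grid, and frequency are my choices, not from the thread), the bilateral Laplace transform of f(t) = e^(-|t|) is evaluated on the imaginary axis at s = i*omega and compared with the Fourier integral at omega:

```python
import numpy as np

# Bilateral Laplace transform F(s) = integral of f(t) e^(-s t) dt,
# approximated by a trapezoidal sum on a wide grid.
t = np.linspace(-40.0, 40.0, 400001)
f = np.exp(-np.abs(t))  # decays in both directions, so the integral converges

def bilateral_laplace(s):
    return np.trapz(f * np.exp(-s * t), t)

omega = 1.7
fourier = np.trapz(f * np.exp(-1j * omega * t), t)  # Fourier transform at omega
laplace = bilateral_laplace(1j * omega)             # Laplace transform at s = i*omega

# Both should approximate the closed form 2 / (1 + omega^2).
print(laplace, fourier, 2.0 / (1.0 + omega**2))
```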
In simple discrete cases -- say you already know that
f(x) == sum_{n=-oo to oo} c_n exp(nx)
-- then integrating f(x)*exp(-n_0 x) over the imaginary interval [0, 2pi*i] will give 2pi*i c_{n_0} (modulo technical details).
--Dan
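Dan's extraction formula is easy to check numerically. In the sketch below the coefficients c_n are made-up test values; parametrizing the imaginary interval as x = i*theta turns the contour integral into an ordinary integral over [0, 2*pi]:

```python
import numpy as np

# Test signal: f(x) = sum_n c_n exp(n x), with a few made-up coefficients.
coeffs = {-2: 0.5, 0: 1.25, 3: -2.0}

def f(x):
    return sum(c * np.exp(n * x) for n, c in coeffs.items())

# Integrate f(x) exp(-n0 x) dx along x = i*theta, theta in [0, 2*pi];
# dx = i dtheta, and the result should be 2*pi*i * c_{n0}.
theta = np.linspace(0.0, 2.0 * np.pi, 100001)
n0 = 3
integral = np.trapz(f(1j * theta) * np.exp(-n0 * 1j * theta) * 1j, theta)
print(integral / (2j * np.pi))  # should recover c_3 = -2.0
```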
I'm a bit rusty about all this, but -- with respect to the objective of extracting an exponential component -- the table at http://mathworld.wolfram.com/LaplaceTransform.html shows that the inverse transform of an exponential is a delta function. Along with linearity, this seems to be telling us that the "ILT" in principle does what we want. In practice, of course, it may involve non-trivial effort to get usable results, e.g. due to "ill-conditioning".
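One way to see the ill-conditioning concretely: if the rates are known, recovering the coefficients c_k of f(t) = sum_k c_k exp(a_k t) from samples is a linear least-squares problem, and the matrix of sampled exponentials becomes nearly singular when the rates a_k are close together. A sketch (the rates and sample grid are my choices):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 50)  # sample times

def cond_of_rates(rates):
    # Columns are the sampled exponentials exp(a_k * t); the condition
    # number of this matrix governs how noise is amplified in the fit.
    A = np.column_stack([np.exp(a * t) for a in rates])
    return np.linalg.cond(A)

well = cond_of_rates([-5.0, -1.0, 3.0])  # well-separated rates
ill = cond_of_rates([0.9, 1.0, 1.1])     # nearly equal rates: much worse
print("separated: %.3g   close: %.3g" % (well, ill))
```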
The real exponentials aren't orthogonal, at least under the traditional defn of inner product. Perhaps the cleverness of Laplace was in coming up with a different inner product?
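The quoted point is easy to see numerically with the usual unweighted inner product <f, g> = integral of f*g (the interval [0, 2*pi] and the particular frequencies and rates below are my choices):

```python
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 20001)

def inner(f, g):
    # Standard L2 inner product on [0, 2*pi], via a trapezoidal sum.
    return np.trapz(f * g, x)

# Distinct sinusoids: the inner product vanishes (orthogonal).
print(inner(np.sin(3 * x), np.sin(5 * x)))  # ~0

# Distinct real exponentials: the inner product is nowhere near zero.
print(inner(np.exp(1.0 * x), np.exp(2.0 * x)))  # large and positive
```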
As I recall, orthogonality in general is typically defined with respect to a "kernel" or "weight" function. Changing the variable produces another transform and changes this function -- which can even make it drop out, although it's effectively "still there" -- along with other modifications, such as distorting the path of integration.

For example, a discrete sum of exponentials can be viewed as a power series with x^n --> e^(nt). The Hadamard transform then uses the same underlying orthogonality as the Fourier and Laplace transforms to pick out the x^n coefficient in a power series -- the only difference is the weight function and the path of integration, which entails an excursion out into the complex plane even though the resulting value is purely real. Orthogonal transforms often exhibit these kinds of dualities, e.g. time-limited functions must be unbounded in frequency, etc. So even if the input-->output mapping is purely real-->real, the intermediate calculations may get complicated.

If the basis functions we want to expand in are *truly* non-orthogonal, then the expansion can be non-unique -- for example there might be only one inverse with bounded discrete frequencies, but also an infinite cloud of "neighbor" expansions with uncountably many fractional frequencies. These ghostly "off line" attractors can bollix convergence.
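The "excursion out into the complex plane" for a purely real answer can be made concrete with a contour-integral coefficient extraction (Cauchy's formula); the polynomial and the target index below are illustrative:

```python
import numpy as np

def p(x):
    # Example series (a polynomial): coefficients 1, 4, 9, 16.
    return 1.0 + 4.0 * x + 9.0 * x**2 + 16.0 * x**3

# c_n = (1 / (2*pi*i)) * contour integral of p(z) / z^(n+1) dz around |z| = 1.
# With z = e^(i*theta), dz = i z dtheta, the integrand becomes p(z) z^(-n) i.
theta = np.linspace(0.0, 2.0 * np.pi, 10001)
z = np.exp(1j * theta)  # the path leaves the real line...
n = 2
cn = np.trapz(p(z) * z**(-n) * 1j, theta) / (2j * np.pi)
print(cn)  # ...but the result is (numerically) real: ~9
```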
participants (2): Henry Baker, Marc LeBrun