Good point, Gene. It's also true that since almost any n real numbers B_1, ..., B_n are linearly independent over the integers, almost any n sine waves of the form A_j sin(B_j x) will be pairwise orthogonal. Then, for f(x) = Sum_j A_j sin(B_j x),

    RMS(f)^2 = lim_{T -> oo} (1/T) Integral_0^T (Sum_j A_j sin(B_j x))^2 dx,

and the cross terms average to 0, giving RMS(f) = sqrt(1/2) (|A_1| + ... + |A_n|).

It seems to me that in some appropriate sense this probably has a version that is true "almost always" for infinitely many terms as well, possibly requiring the condition that the set of periods is bounded.

--Dan

-----

Actually, the conjecture for sums of multiple sine waves is false. What is true is that the mean square MS = RMS^2 of the sum equals the sum of the mean squares of the component sine waves, with the proviso that the components are orthogonal. The peak value has no bearing on the matter.

Consider sin(x) + sin(2x). Lazy guy that I am, I used a spreadsheet to find the peak value, about 1.76. That divided by sqrt(2) is about 1.24. But the RMS of the sum is sqrt(1/2 + 1/2) = 1.

On Sunday, November 12, 2017, 1:56:12 PM PST, Keith F. Lynch <kfl@KeithLynch.net> wrote:

The root mean square (RMS) of a sine wave is always the peak value divided by the square root of 2. The same is true of the sum of multiple sine waves with different frequencies, phases, and amplitudes. But every repeating waveform is equal to a sum of sine waves with different frequencies, phases, and amplitudes. This includes the square wave, i.e. a function which always equals +X or -X and never takes any other value. But obviously the RMS of that square wave is simply X, not X divided by the square root of 2. Explain.

I learn a lot by coming up with such paradoxes and then figuring out the solution.

-----
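A quick numerical check of the two concrete claims above (a sketch of my own in Python/NumPy, not from the thread; the window length and the number of square-wave harmonics are arbitrary choices):

    import numpy as np

    x = np.linspace(0.0, 200.0 * np.pi, 400_001)   # 100 full 2*pi periods

    # Gene's counterexample: the RMS of sin(x) + sin(2x) is 1, not peak/sqrt(2).
    f = np.sin(x) + np.sin(2.0 * x)
    print(np.sqrt(np.mean(f ** 2)))                # ~1.000  (the RMS)
    print(np.max(np.abs(f)))                       # ~1.760  (the peak)
    print(np.max(np.abs(f)) / np.sqrt(2.0))        # ~1.24, not equal to the RMS

    # Keith's square wave of amplitude X, approximated by a partial Fourier sum:
    # the mean square of the sum equals the sum of the component mean squares,
    # which tends to X^2 (so RMS -> X), resolving the apparent paradox.
    X = 1.0
    g = sum(4.0 * X / (np.pi * k) * np.sin(k * x) for k in range(1, 200, 2))
    print(np.mean(g ** 2))                                                  # ~0.998
    print(sum((4.0 * X / (np.pi * k)) ** 2 / 2 for k in range(1, 200, 2)))  # ~0.998

The last two numbers agree because the odd harmonics are mutually orthogonal over a period: only the mean squares add, and they sum toward X^2, even though the component peak values 4X/(pi k) sum to far more than X.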
Dan, you dropped the square on the A_j in your final equation.

-- Gene
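With the squares restored, the statement that survives is RMS(f)^2 = (A_1^2 + ... + A_n^2)/2 for pairwise orthogonal components. A numerical sketch of my own (not from the thread), with arbitrary amplitudes, incommensurate frequencies, and a finite window standing in for the T -> oo limit:

    import numpy as np

    A = np.array([1.0, 2.0, 0.5])              # arbitrary amplitudes (my choice)
    B = np.array([1.0, np.sqrt(2.0), np.pi])   # pairwise incommensurate frequencies

    x = np.linspace(0.0, 20000.0, 2_000_001)   # long window stands in for T -> oo
    f = sum(a * np.sin(b * x) for a, b in zip(A, B))

    print(np.mean(f ** 2))                          # ~2.625  (mean square of the sum)
    print(np.sum(A ** 2) / 2.0)                     # 2.625 = sum of component mean squares
    print((np.sqrt(0.5) * np.sum(np.abs(A))) ** 2)  # 6.125, the dropped-square version

For these amplitudes the two formulas differ by more than a factor of 2 in mean square, so the long-time average distinguishes them clearly.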
participants (2)
- Dan Asimov
- Eugene Salamin