On Sunday 06 April 2008, Jim Propp wrote:
> But does anyone know of an appropriately general THEOREM that validates this method? That is, can anyone point me to a theorem (whose statement should be accessible to bright first-year college students) saying that if f and g are functions satisfying certain conditions, then there is a one-parameter family of solutions to dy/dx = f(x) g(y) obtained by the formula y = H^{-1}(F(x)+C), where F is an antiderivative of f, C is an arbitrary constant, and H is an antiderivative of 1/g(y)?
Well, um, why not Just Do It, and see what conditions come out?

Suppose H'g = 1 and F' = f, and suppose H is invertible where it needs to be. Write y(x) = H^{-1}(F(x)+c), so that H(y(x)) = F(x)+c. Differentiating, y'(x) H'(y(x)) = F'(x), so y'(x) H'(y(x)) g(y(x)) = f(x) g(y(x)), so y'(x) = f(x) g(y(x)); that is, y is a solution of the differential equation.

Now, what did we assume?

- That each F(x)+c has a preimage under H. So: suppose we're looking for a solution for x in some set A; write B := F(A)+c; then we want the image of H to contain B.

- That the resulting y is actually differentiable. So it's not really enough for those preimages to exist; we need to be able to make them into a differentiable inverse for H. By the inverse function theorem, we have inverses everywhere locally provided H' is nonzero, which follows from H'g = 1, and provided H is continuously differentiable. And since we're (I take it) working in one real dimension here, "everywhere locally" gives "globally": a continuous function on an interval with nonvanishing derivative is strictly monotone, hence globally invertible. So the condition above *is* enough, if we require g to be continuous. Oh, and we'd better amend "contains B" to "contains an open set containing B", i.e. "has interior containing B".

- That taking g(y(x)) isn't a range error. It seems like it might be nontrivial to control the range of y, so let's just require that g be defined on all of R.

- Er, that seems to be it.

So the theorem seems to be something like this:

Let A be any open subset of R, and suppose f : A -> R has an antiderivative F. Write B := F(A). Now suppose g : R -> R\{0} is continuous, and suppose 1/g has an antiderivative H. Then, provided the interior of the image of H contains B, there is guaranteed to be a continuously differentiable choice of H^{-1}, and y(x) := H^{-1}(F(x)) satisfies dy/dx = f(x) g(y). []

What about uniqueness? Well, write z(x) = H(y(x)). (Assumption: H is defined on the image of y. Let's actually suppose that H is defined everywhere. Oh look, we already did.)
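Here's a quick numerical sanity check of the recipe on one concrete separable equation. The example (my choice, not from the post) is f(x) = x and g(y) = 1 + y^2, so F(x) = x^2/2, H = arctan, H^{-1} = tan, and the candidate family is y(x) = tan(x^2/2 + c). Note that the image of H is only (-pi/2, pi/2), so the "interior of the image of H contains B" condition really bites here.

```python
import math

# A concrete instance of the recipe (a made-up example, not from the post):
#   f(x) = x,        g(y) = 1 + y^2,
#   F(x) = x^2/2,    H(y) = arctan(y)   (an antiderivative of 1/g),
# so H^{-1} = tan, and the one-parameter family is y(x) = tan(x^2/2 + c).
# The image of H is only (-pi/2, pi/2), so F(x)+c must stay inside it.

def f(x): return x
def g(y): return 1.0 + y * y

def y(x, c): return math.tan(x * x / 2.0 + c)      # H^{-1}(F(x) + c)

def dydx(x, c, h=1e-6):
    # centred finite difference, good enough to test the ODE numerically
    return (y(x + h, c) - y(x - h, c)) / (2.0 * h)

# Check y' = f(x) g(y) for several x and several constants c,
# keeping F(x)+c safely inside the image of H.
for c in (-0.5, 0.0, 0.3):
    for x in (0.1, 0.5, 1.0):
        if abs(x * x / 2.0 + c) < math.pi / 2.0 - 0.1:
            assert abs(dydx(x, c) - f(x) * g(y(x, c))) < 1e-4

# Uniqueness direction: for a solution y, z = H(y) should be an
# antiderivative of f, i.e. z'(x) = f(x).
def z(x, c): return math.atan(y(x, c))
for x in (0.1, 0.5, 1.0):
    zprime = (z(x + 1e-6, 0.0) - z(x - 1e-6, 0.0)) / 2e-6
    assert abs(zprime - f(x)) < 1e-4
```

Nothing in the script proves anything, of course; it just confirms that both directions of the argument (the constructed y solves the ODE, and H(y) is an antiderivative of f) hold to finite-difference accuracy on this example.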
Then we have z'(x) = H'(y(x)) y'(x) = y'(x) / g(y(x)), and so if y'(x) = f(x) g(y(x)) then z'(x) = f(x), and z is an antiderivative of f. In other words, we did in fact construct all solutions.

Those seem to be pretty much the weakest conditions we could get away with. Can we simplify them at some small cost in generality? Well, if 1/g is bounded away from 0 (equivalently, g is bounded), then H' = 1/g has constant sign and is bounded away from 0, so the image of H is guaranteed to be all of R. (Merely requiring g to be bounded away from 0 wouldn't do: g(y) = 1 + y^2 gives H = arctan, whose image is only (-pi/2, pi/2).) So in this case we get:

Let A be an open subset of R and f : A -> R have an antiderivative F. And suppose g : R -> R\{0} is continuous and bounded, and suppose 1/g has an antiderivative H. Then writing y = H^{-1}(F(x)), we have y' = f(x) g(y) on A, and every solution to that differential equation has this form.

Throw in the fact that a differentiable function whose derivative is 0 on an interval is constant on that interval, and make A connected (hence an interval), and we can reformulate that as: make one choice of F and H, and then the general solution is obtained by using F+c in place of F.

This all seems rather pedestrian. You probably had something subtler in mind...

-- g