A standard method taught in calculus classes is the method of solving separable differential equations: we solve dy/dx = f(x) g(y) by writing it as (1/g(y)) dy = f(x) dx and then writing

    ∫ (1/g(y)) dy = ∫ f(x) dx

But does anyone know of an appropriately general THEOREM that validates this method? That is, can anyone point me to a theorem (whose statement should be accessible to bright first-year college students) saying that if f and g are functions satisfying certain conditions, then there is a one-parameter family of solutions to dy/dx = f(x) g(y) obtained by the formula y = H^{-1}(F(x)+C), where F is an antiderivative of f, C is an arbitrary constant, and H is an antiderivative of 1/g(y) (as a function of y)?

Note that if the antiderivative of 1/g(y) isn't invertible, then we're in trouble, since H^{-1} won't exist. Though now that I think of it, that won't happen if g(y) is continuous and stays away from 0, since then 1/g(y) has constant sign and H is strictly monotonic.

I'd like to tell my students a correct theorem, not just give them hocus-pocus that works (or usually works). Come to think of it: does anyone know of any examples for which naive application of separation of variables gives wrong results, say by introducing spurious solutions or neglecting valid ones? A correct theorem would attend to domains of existence, and maybe would include a uniqueness claim as well.

I've been using one of Stewart's calculus texts, and on the whole I've approved of his compromises between the need for accessibility and the need for rigor, but on the topic of separable differential equations he seems to be throwing up his hands and just saying "Here's how you solve it, kids".

Jim Propp
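
P.S. To make the worry concrete, here is a quick sketch, in the notation above, of the sort of misbehavior I have in mind (assuming I've done the computations right). Take dy/dx = y^2, i.e. f(x) = 1 and g(y) = y^2. Then F(x) = x and H(y) = -1/y, so the recipe y = H^{-1}(F(x)+C) gives

    y = -1/(x+C)

Two things go wrong at once: the solution y ≡ 0 is lost (dividing by g(y) throws it away, and no value of C recovers it), and each solution y = -1/(x+C) exists only on an interval avoiding x = -C, so the formula is silent about domains of existence. For uniqueness, dy/dx = y^{2/3} seems even worse: both y ≡ 0 and y = (x/3)^3 satisfy it and pass through the origin, so a one-parameter family can't tell the whole story. Note that in both examples g vanishes at y = 0, consistent with the remark above about g staying away from 0.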