Here is a mathematical question. In trying to convince a class in which linear differential equations are being solved that it is obvious that the derivative of the determinant of the solution matrix is the trace of the coefficient matrix times that determinant (the theorem is known; the discussion centers on whether it is obvious), there is a step whereby the derivative of a determinant is a sum of determinants in which the columns are differentiated, one by one. Since the columns are solution vectors, their derivatives are just the coefficient matrix applied to them, leaving a sum of determinants in which one column at a time is multiplied by a matrix factor. Somehow this assemblage acquires an invariant of the coefficient matrix as a factor, namely the trace, which is the result.

Recalling that the determinant of a product is the product of determinants, that rule can be rewritten as a determinant in which every column is multiplied by the coefficient matrix, and again an invariant appears as a coefficient, namely the determinant. These are two extreme cases. Suppose instead that two columns at a time are multiplied by the matrix, and these determinants are summed over all pairs of columns. Will this give as a coefficient the second invariant, namely the sum of the diagonal 2x2 minors?

Looking in Google, the rule for differentiating a determinant is attributed to Jacobi, although it is not so hard to deduce from the sum-of-products definition of a determinant; it is usually cast in terms of the formula for the inverse of a matrix via the adjugate, with the determinant sitting there as a factor where it can be differentiated.

Is there some lore of determinant theory that we don't know about which contains results such as these, and possibly the answer to the question about invariants?
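For concreteness, the step where the trace appears can be written out as follows. This is only a sketch, with W = (w_1, ..., w_n) standing for a fundamental matrix of solutions of w' = Aw; the notation W, A, C here is ours, not taken from any particular text:

    \[
    \frac{d}{dt}\det W
      = \sum_{i=1}^{n} \det(w_1,\dots,w_i',\dots,w_n)
      = \sum_{i=1}^{n} \det(w_1,\dots,A w_i,\dots,w_n).
    \]
    Writing $A w_i = \sum_j c_{ji} w_j$ with $C = (c_{ji}) = W^{-1} A W$ (W being
    invertible), every term with $j \ne i$ repeats a column and drops out, so the
    $i$-th determinant equals $c_{ii}\det W$, and the sum becomes
    \[
    \frac{d}{dt}\det W = \operatorname{tr}(W^{-1} A W)\,\det W = \operatorname{tr}(A)\,\det W .
    \]

As for the question about pairs of columns, it is at least easy to test numerically. Below is a small Python experiment (using numpy; the matrices A and W are random and purely illustrative, and the variable names are ours) that compares the sum over pairs with the second invariant times det W:

    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(0)
    n = 4
    A = rng.standard_normal((n, n))   # stand-in for the coefficient matrix
    W = rng.standard_normal((n, n))   # stand-in for the solution matrix, generically invertible

    # Sum of determinants in which columns i and j (i < j) are both multiplied by A.
    pair_sum = 0.0
    for i, j in combinations(range(n), 2):
        M = W.copy()
        M[:, i] = A @ W[:, i]
        M[:, j] = A @ W[:, j]
        pair_sum += np.linalg.det(M)

    # Second invariant of A: the sum of its diagonal 2x2 minors.
    e2 = sum(np.linalg.det(A[np.ix_([i, j], [i, j])])
             for i, j in combinations(range(n), 2))

    # If the guess is right, these two numbers agree up to rounding.
    print(pair_sum, e2 * np.linalg.det(W))

The same loop, run over subsets of columns of any fixed size k instead of pairs, would test whether the k-th invariant appears in general.

- hvm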