One of the standard parts of any algebra course shows that every "symmetric function" (actually, a symmetric polynomial) of the roots of a polynomial can be expressed as a polynomial in various standard bases of symmetric polynomials, including the *elementary symmetric polynomials* [the coefficients of product(z-ri)], the *power sums* [sum(ri^n)], etc.

But this doesn't satisfy me, because I want to compute other "symmetric functions" of the roots. For example, if p(z)=(z-r1)*(z-r2)*(z-r3) is a cubic polynomial with *complex* roots, I'd like to be able to compute area(r1,r2,r3)^2, the squared area of the triangle whose vertices are r1,r2,r3 in the complex plane. I don't seem to be able to do this, because the triangle area seems to require the *conjugate* operation in addition to the usual +,-,* operations.

One elegant approach to the *power sums* comes from powers of the *companion matrix* of the polynomial. If C is the companion matrix of the polynomial p(z) above, then C is diagonalized by the *Vandermonde* matrix V of the roots of p(z), and the eigenvalues of C are the roots r1,r2,r3 of p(z):

   V^(-1).C.V = diag(r1,r2,r3)

But (V^(-1).C.V)^n = diag(r1,r2,r3)^n = diag(r1^n,r2^n,r3^n), so C^n = V.diag(r1^n,r2^n,r3^n).V^(-1). Note that C^n can trivially be computed by rational operations -- i.e., without solving for r1,r2,r3. This means that the characteristic polynomial of C^n provides the expansion of the powers of the roots in terms of the elementary symmetric polynomials of r1,r2,r3.

In fact, for any polynomial q(z) (or indeed any convergent power series q[z]), we can compute q(C) (or q[C]) and produce another cubic polynomial whose roots are q(r1),q(r2),q(r3) (or q[r1],q[r2],q[r3]). In particular, we could compute approximations to the matrix sin(C), whose characteristic polynomial has roots sin(r1),sin(r2),sin(r3); or exp(C), whose characteristic polynomial has roots exp(r1),exp(r2),exp(r3); or even log(C), whose characteristic polynomial has roots log(r1),log(r2),log(r3). I suspect that Newton himself would have found this insight very cool!

We can also rationally compute a new polynomial ps(z) whose roots are (r1+r2),(r2+r3),(r3+r1), and likewise a new polynomial pp(z) whose roots are r1*r2, r2*r3, r3*r1. Since (r1-r2) is not symmetric, we can square it and rationally produce a polynomial pdiff(z) whose roots are (r1-r2)^2, (r2-r3)^2, and (r3-r1)^2.

But none of these methods enables the calculation of area(r1,r2,r3)^2, which is a nonnegative real number and is a "symmetric function" of r1,r2,r3. Triangle areas seem to require the introduction of *conjugates*, so we get symmetric functions such as

   abs(r1)^2+abs(r2)^2+abs(r3)^2 = Re(r1)^2+Im(r1)^2 + Re(r2)^2+Im(r2)^2 + Re(r3)^2+Im(r3)^2.

For the life of me, I haven't been able to find expressions in +,-,*,conjugate that compute symmetric functions such as this squared length, or the squared areas of triangles. But neither have I been able to prove that it can't be done. I've searched the literature online, but haven't been able to find anything that treats this question.

One thought is to compute the *polar decomposition* of the companion matrix, C = U.P, where U is unitary and P is positive semidefinite. Then (C*).C = (P*).(U*).U.P = (P*).P = P^2, since P is Hermitian. But the problem is that the eigenvalues of C = U.P aren't the same as the eigenvalues of P -- the eigenvalues of P are the *singular values* of C -- so we would need some way to relate the eigenvalues of P to those of C. Similarly, the SVD of C doesn't help, because the unitary factors in the SVD of conjugate(C) aren't the same as those of C itself.

Some short numerical sketches of the above constructions follow.
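To make the companion-matrix machinery concrete, here is a minimal numpy/scipy sketch. The cubic (z-1)(z-2)(z-3) and the helper name `companion` are just illustrative choices of mine, nothing canonical:

    import numpy as np
    from scipy.linalg import expm

    def companion(c2, c1, c0):
        # companion matrix of p(z) = z^3 + c2*z^2 + c1*z + c0
        return np.array([[0.0, 0.0, -c0],
                         [1.0, 0.0, -c1],
                         [0.0, 1.0, -c2]])

    C = companion(-6.0, 11.0, -6.0)      # p(z) = (z-1)(z-2)(z-3)

    # char poly of C^n has roots ri^n, found without ever solving for the ri:
    print(np.poly(C @ C))                # [1, -14, 49, -36] = coeffs of (z-1)(z-4)(z-9)

    # char poly of exp(C) has roots exp(ri):
    print(np.sort(np.linalg.eigvals(expm(C))))   # ~ [2.718, 7.389, 20.086]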
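The ps, pp, and pdiff constructions can likewise be carried out rationally, e.g., in sympy. This is a sketch under the normalization p(z) = z^3 - e1*z^2 + e2*z - e3; the resultant trick for pdiff is one standard route, not necessarily the slickest:

    import sympy as sp

    z, x, y, t = sp.symbols('z x y t')
    e1, e2, e3 = sp.symbols('e1 e2 e3')     # elementary symmetric polys of r1,r2,r3
    p = z**3 - e1*z**2 + e2*z - e3          # p(z) = (z-r1)(z-r2)(z-r3)

    # ps(z): roots (r1+r2),(r2+r3),(r3+r1).  Since ri+rj = e1 - rk:
    ps = sp.expand(-p.subs(z, e1 - z))

    # pp(z): roots r1*r2, r2*r3, r3*r1.  Since ri*rj = e3/rk (assuming e3 != 0):
    pp = sp.expand(-z**3 * p.subs(z, e3/z) / e3)

    # pdiff(t): roots (ri-rj)^2.  Res_y(p(y), p(x+y)) = x^3 * prod_{i!=j}(x+ri-rj),
    # which is even in x once the x^3 factor is removed, so substitute t = x^2:
    R = sp.resultant(p.subs(z, y), p.subs(z, x + y), y)
    D = sp.expand(sp.cancel(R / x**3))
    pdiff = sp.expand(D.subs(x, sp.sqrt(t)))

    # sanity check with roots 1,2,3, i.e., (e1,e2,e3) = (6,11,6):
    nums = {e1: 6, e2: 11, e3: 6}
    print(sp.roots(ps.subs(nums), z))       # {3: 1, 4: 1, 5: 1}
    print(sp.roots(pp.subs(nums), z))       # {2: 1, 3: 1, 6: 1}
    print(sp.roots(pdiff.subs(nums), t))    # {1: 2, 4: 1}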
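For reference, the conjugate-based formula I have in mind for the area: for vertices a,b,c, the doubled area is abs(Im((b-a)*conj(c-a))). So area^2 is manifestly invariant under permuting the roots, yet I see no way to reach it without conj. A small numpy check (the particular roots are arbitrary):

    import itertools
    import numpy as np

    def area_sq(a, b, c):
        # squared area of the triangle with vertices a, b, c in the complex plane
        return (0.5 * abs(((b - a) * np.conj(c - a)).imag))**2

    r = [1 + 2j, -0.5 + 1j, 2 - 1j]      # arbitrary example roots
    for perm in itertools.permutations(r):
        print(area_sq(*perm))            # the same value for all 6 orderings

    # the squared-length symmetric function from above, which also needs conj:
    print(sum((x * np.conj(x)).real for x in r))   # abs(r1)^2+abs(r2)^2+abs(r3)^2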
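And a quick numeric illustration of the polar/SVD obstruction, assuming scipy.linalg.polar and the same example cubic: the eigenvalues of P are the singular values of C, which are not abs(ri):

    import numpy as np
    from scipy.linalg import polar

    C = np.array([[0.0, 0.0, 6.0],
                  [1.0, 0.0, -11.0],
                  [0.0, 1.0, 6.0]])      # companion matrix of (z-1)(z-2)(z-3)

    U, P = polar(C)                      # C = U.P with U unitary, P >= 0
    print(np.sort(np.linalg.eigvalsh(P)))              # eigenvalues of P
    print(np.sort(np.linalg.svd(C, compute_uv=False))) # = singular values of C
    print(np.sort(np.abs(np.linalg.eigvals(C))))       # abs(ri) = [1, 2, 3] -- different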
In general, I'd like to see as many questions about polynomial roots and coefficients as possible put into *matrix form*, even though that doesn't follow the history of polynomial roots very accurately. Any suggestions?