Hello everybody,

There is a spectral decomposition theorem for real normal matrices (A A^T = A^T A): there exist an orthogonal matrix P and a block-diagonal matrix D such that A = P^T D P, where each block of D is either a real scalar or a 2 x 2 matrix of the form

  [ a  -b ]
  [ b   a ]   with b > 0.

So this amounts to finding an orthonormal basis such that, on some pairwise orthogonal planes, A acts as a rotation composed with a positive scaling.

In the special case of an orthogonal matrix, the scalars are +1 or -1, and the 2 x 2 blocks are 2D rotation matrices

  [ cos theta  -sin theta ]
  [ sin theta   cos theta ]   with 0 < theta < pi.

The corresponding columns of P span a plane, and the angle theta corresponds to the two complex eigenvalues cos theta + i sin theta and cos theta - i sin theta. If det(A) = 1, then the number of -1's is even, so you can group them in pairs into 2 x 2 blocks that are 2D rotations by angle pi.

I don't know how old the theorem is. I know that it is proven in Gantmacher (The Theory of Matrices), in Berger (Geometry I), and I have a version in my recent book: https://www.amazon.com/Algebra-Optimization-Applications-Machine-Learning/dp/9811207712

Please email me if you want a pdf.

Best,
— Jean
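P.S. For anyone who wants to see the decomposition numerically: for a normal matrix, the real Schur form computed by scipy.linalg.schur is exactly the block-diagonal D above (up to the ordering of the blocks and the sign of b). A minimal sketch, assuming NumPy and SciPy are available, using a random orthogonal matrix as the normal matrix:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)

# A random orthogonal matrix (hence normal): the Q factor of a QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))

# Real Schur form: Q = P T P^T with P orthogonal.  Because Q is normal,
# T is block diagonal: the 1 x 1 blocks are the real eigenvalues (+1 or -1
# here), and each 2 x 2 block is [[a, -b], [b, a]] up to the sign of b,
# i.e. a 2D rotation, since a^2 + b^2 = 1 for an orthogonal matrix.
T, P = schur(Q, output='real')

print(np.round(T, 3))
assert np.allclose(P @ T @ P.T, Q)       # reconstruction of Q
assert np.allclose(np.triu(T, 2), 0.0)   # nothing outside the 1x1/2x2 blocks
```

Note that SciPy's convention is A = P T P^T, i.e. the transpose sits on the right rather than the left as in the statement above; the two are equivalent after renaming P.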