Re: [math-fun] Elementary (?) matrix question
Jean writes: << The non-negative square roots, sigma_1, ..., sigma_n, of the eigenvalues of M.M' are indeed known as the singular values of M. The Singular Value Decomposition Theorem (SVD) says that M = U Sigma V', for some unitary matrices U, V, where Sigma = diag(sigma_1, ..., sigma_n) (I assume M is an n x n matrix). >>
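For instance, a quick numerical check of that statement in numpy (the 4 x 4 matrix below is just an arbitrary random example):

    import numpy as np

    rng = np.random.default_rng(0)
    M = rng.standard_normal((4, 4))       # arbitrary example matrix

    # singular values straight from the SVD:  M = U diag(s) V'
    U, s, Vt = np.linalg.svd(M)

    # the same values as square roots of the eigenvalues of M M'
    lam = np.linalg.eigvalsh(M @ M.T)     # M M' is symmetric
    print(np.allclose(np.sort(np.sqrt(np.maximum(lam, 0.0))),
                      np.sort(s)))        # True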
The Singular Value Decomposition has an interesting interpretation over the reals. Suppose L_1 and L_2 are linear n-dimensional subspaces of R^k. If n = 1, their relative position in R^k is described by one number: the angle between these two lines (or its cosine).

Let {u_1,...,u_n} and {v_1,...,v_n} be orthonormal bases for L_1 and L_2, respectively, in R^k, and let U and V be the k x n matrices whose columns are the u_j's and the v_j's. Then the n x n matrix A := U^t V has ij-th entry <u_i, v_j> (dot product). The singular value decomposition of A is S = P^t A Q, where S is a diagonal matrix of nonnegative entries in descending order, and P and Q are orthogonal. This defines S uniquely.

Back to the geometry of L_1 and L_2: among all the unit vectors x in L_1 and y in L_2, there are an x and a y that maximize the dot product <x,y>, i.e., minimize their angle. (It may happen that the angle is 0.) This dot product is the first diagonal entry of S. Now let L_1' and L_2' be the orthogonal complements of x and y in L_1 and L_2, and again let x' and y' be the unit vectors of L_1' and L_2' having the least angle. Then <x',y'> is the second diagonal entry of S. Etc.

The sequence of vectors x, x', x'',... is orthonormal, as is y, y', y'',.... The x's are given by the columns of the k x n matrix UP, and likewise the y's by the columns of VQ. The sequence of (nonnegative) cosines on the diagonal of S, i.e., the singular values, completely characterizes the relative position of L_1 and L_2 up to an isometry of R^k. The angles (inverse cosines of the singular values) are called the principal angles of L_1 and L_2.

--Dan

_____________________________________________________________________
"It don't mean a thing if it ain't got that certain je ne sais quoi." --Peter Schickele
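For concreteness, here is a small numpy sketch of the construction just described, taking two random 2-dimensional subspaces of R^5 as an arbitrary example (names follow the text: A = U^t V, S = P^t A Q):

    import numpy as np

    rng = np.random.default_rng(1)
    k, n = 5, 2                            # ambient R^k, subspaces of dim n

    # orthonormal bases for two random n-dimensional subspaces L_1, L_2
    U = np.linalg.qr(rng.standard_normal((k, n)))[0]   # columns: u_1..u_n
    V = np.linalg.qr(rng.standard_normal((k, n)))[0]   # columns: v_1..v_n

    A = U.T @ V                            # A_ij = <u_i, v_j>
    P, sigma, Qt = np.linalg.svd(A)        # P^t A Q = diag(sigma), Q = Qt.T

    angles = np.arccos(np.clip(sigma, -1.0, 1.0))      # principal angles
    X = U @ P                              # columns: the x, x', x'', ...
    Y = V @ Qt.T                           # columns: the y, y', y'', ...

    print(angles)
    print(np.allclose(X.T @ Y, np.diag(sigma)))        # pairwise cosines: True

The last line checks that the x's and y's pair off with exactly the singular values as mutual cosines, as claimed above.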
On 1/5/10, Dan Asimov <dasimov@earthlink.net> wrote:
... The sequence of (nonnegative) cosines on the diagonal of S, i.e., the singular values, completely characterizes the relative position of L_1 and L_2 up to an isometry of R^k. The angles (inverse cosines of the singular values) are called the principal angles of L_1 and L_2. ...
As an algorithm for isolating the relationship between two subspaces --- possibly of different dimensions --- the method given earlier faces a number of problems, the most inconvenient being that SVD involves solving a polynomial equation. In fact, this task may be accomplished entirely algebraically, via geometric algebra, as follows.

The m-space transcendental Clifford generators are antisymmetric square roots of unity z_1, ..., z_m (pace Dan), where for 1 <= i < j <= m,

    z_i z_i = 1 ,    z_j z_i = - z_i z_j .

The vector V = (a_1, ..., a_m) is identified with the linear polynomial

    V -> a_1 z_1 + ... + a_m z_m ;

the subspace L spanned by V_1, ..., V_k with the (Grassmann) exterior product

    L -> < V_1 ... V_k >_k ,

found by taking the Clifford product V_1 ... V_k, then discarding all terms but those of degree k.

Denote by L^+ the "reversion" --- essentially a pseudo-inverse --- of L, computed as follows: if

    L = <L>_0 + <L>_1 + <L>_2 + <L>_3 + <L>_4 + ...

is the decomposition of L as the sum of homogeneous sub-polynomials <L>_j of degree j, then

    L^+ = <L>_0 + <L>_1 - <L>_2 - <L>_3 + <L>_4 + ... ;

denote by ||L|| the scalar "magnitude" --- here, squared length ---

    ||L|| = L^+ L .

Given two subspaces L, M of dimensions k, l with k <= l, their full Clifford product N = L^+ M represents an isometry taking L into its reflection in M via conjugation:

    N^+ L N = M^+ L M    (ignoring a scalar factor).

Now set c_i = ||<N>_(l+k-2i)||; the roots of the polynomial (in S^2)

    c_0 + c_1 S^2 + ... + c_k S^(2k)

are the squares of the tangents of the principal angles between L and M. Of course, if we needed the angles themselves, we should have to solve this equation; but if we just need a unique coordinate representation of the relative position, the equation coefficients provide it --- even in cases where the initial components are not themselves numerical.

Fred Lunnon
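Below is a minimal Python sketch of the machinery Fred describes, with multivectors stored as dicts mapping basis blades (sorted tuples of generator indices) to coefficients. All function names are illustrative, and the final check takes the simplest case of two lines (k = l = 1) in R^3, where the polynomial above has degree 2 and the single principal angle satisfies c_0/c_1 = tan^2(theta):

    import numpy as np

    def blade_mul(a, b):
        # geometric product of two basis blades, using z_i z_i = 1 and
        # z_j z_i = -z_i z_j for i < j
        out, sign = list(a) + list(b), 1
        for i in range(len(out)):             # bubble sort; one sign flip
            for j in range(len(out) - 1):     # per transposition
                if out[j] > out[j + 1]:
                    out[j], out[j + 1] = out[j + 1], out[j]
                    sign = -sign
        blade = []
        for g in out:                         # cancel z_i z_i = 1
            if blade and blade[-1] == g:
                blade.pop()
            else:
                blade.append(g)
        return tuple(blade), sign

    def mul(A, B):
        # geometric product of multivectors (dicts: blade -> coefficient)
        C = {}
        for a, x in A.items():
            for b, y in B.items():
                blade, s = blade_mul(a, b)
                C[blade] = C.get(blade, 0.0) + s * x * y
        return C

    def grade(A, r):                          # <A>_r
        return {b: x for b, x in A.items() if len(b) == r}

    def rev(A):                               # reversion: signs + + - - + + ...
        return {b: x * (-1) ** (len(b) * (len(b) - 1) // 2)
                for b, x in A.items()}

    def mag(A):                               # ||A||: scalar part of A^+ A
        return mul(rev(A), A).get((), 0.0)

    def vec(v):                               # vector in R^m -> degree-1 element
        return {(i,): float(c) for i, c in enumerate(v)}

    # two lines (k = l = 1) in R^3, at angle t
    t = 0.7
    L = vec([1.0, 0.0, 0.0])
    M = vec([np.cos(t), np.sin(t), 0.0])

    N = mul(rev(L), M)                        # N = L^+ M
    c0 = mag(grade(N, 2))                     # i = 0: degree l + k     = 2
    c1 = mag(grade(N, 0))                     # i = 1: degree l + k - 2 = 0
    print(np.isclose(c0 / c1, np.tan(t) ** 2))   # True

Note that tan^2 of the angle drops out of the coefficient ratio with no eigenvalue or root-finding step, which is the point of the remark above: the coefficients c_i themselves already give a canonical invariant of the relative position.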