Re: [math-fun] Linear algebra over finite fields ?
The paper that Victor linked to is on the singular value decomposition (SVD) for finite fields. One reason I like the SVD, at least over the reals, is its connection to the linear geometry of Euclidean space.

Suppose P and Q are each a k-dimensional linear subspace of R^n, with k <= n. How can we describe the "angle" between P and Q? We can ask for the unit vectors u in P and v in Q whose least nonnegative angle ang_1 = Ang(u,v) is minimized over all choices of unit vectors u and v. But wait, there's more. Now consider P_1 = u^perp and Q_1 = v^perp, the orthogonal complements of u within P and of v within Q, and lather, rinse, repeat. This way we get a well-defined sequence of angles

    0 <= ang_1 <= ang_2 <= ... <= ang_k <= π/2

such that the vector (ang_1, ang_2, ..., ang_k) uniquely characterizes the pair of k-planes P and Q in R^n up to an isometry of R^n. That is, if P' and Q' have the same angle-vector as P and Q, then there is an orthogonal matrix J in the group O(n) such that J(P) = P' and J(Q) = Q'.

A convenient way to calculate these angles is with the SVD. First choose any orthonormal bases {u_i} for P and {v_j} for Q, and let the k x k matrix M be defined by M_ij = <u_i, v_j>, the dot product of u_i and v_j. The singular values of M are then the *cosines* of the angles ang_1, ..., ang_k. This is independent of the choices of orthonormal bases (very short proof).

* * *

So, if we can get real eigenvalues from matrices over finite fields F, then maybe it makes sense to talk about angles between subspaces of a vector space over F.

—Dan

Victor Miller writes:
-----
Look here: https://arxiv.org/pdf/1805.06999.pdf

Henry Baker <hbaker1@pipeline.com> writes:
-----
...
"So you've heard all this cool stuff about matrix factorization (eigenvalue decomp, SVD, polar decomp, etc.) over the reals & complex numbers; what happens to all of this stuff when you move over to the Galois fields?" ...
-----
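As a sanity check on the recipe Dan describes (orthonormalize, form the matrix of inner products, take singular values, then arccos), here is a short NumPy sketch. The function name `principal_angles` and the test subspaces are illustrative, not from the original post:

```python
import numpy as np

def principal_angles(A, B):
    """A, B: n x k matrices whose columns span subspaces P and Q of R^n.
    Returns the k principal angles in [0, pi/2], in ascending order."""
    # QR orthonormalizes the spanning sets: columns of U, V are
    # orthonormal bases {u_i} for P and {v_j} for Q.
    U, _ = np.linalg.qr(A)
    V, _ = np.linalg.qr(B)
    # M_ij = <u_i, v_j>; the singular values of M are the cosines
    # of the principal angles, independent of the bases chosen.
    M = U.T @ V
    sigma = np.linalg.svd(M, compute_uv=False)
    # Clip guards against rounding pushing a cosine slightly above 1.
    return np.arccos(np.clip(sigma, -1.0, 1.0))

# Example: in R^4, span(e1, e2) vs. span(e1, e3) share one direction,
# so the angles are 0 and pi/2.
P = np.eye(4)[:, [0, 1]]
Q = np.eye(4)[:, [0, 2]]
print(principal_angles(P, Q))
```

Since `np.linalg.svd` returns singular values in descending order and arccos is decreasing, the angles come out ascending, matching 0 <= ang_1 <= ... <= ang_k.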