[math-fun] Alternatives to Gram-Schmidt
Gram-Schmidt takes a set of vectors, puts them into some arbitrary order, and then "orthogonalizes" them by subtracting off non-orthogonal components. The problem with G-S is that there seems to be little rhyme or reason to the initial sort order.

I was curious about trying to *minimize* the *length* of the *description* of the final set of orthogonal vectors. By *description*, I mean the total length in bits of all of the numerators and denominators.

My first attempt is a "greedy" algorithm that finds the two vectors which are least orthogonal to one another (greatest |angle - 90 degrees|), replaces them with the sum and difference of their normalizations, and then rinses and repeats. Note that the sum and difference of two same-length vectors are orthogonal: (a+b).(a-b) = |a|^2 - |b|^2 = 0 whenever |a| = |b|. Thus, if one starts with N independent vectors, this "greedy" algorithm appears to converge on a mutually orthogonal set. While the convergence isn't fast, it does appear to keep the total description lengths from exploding. The convergence rate appears to be similar to that of Jacobi/Givens rotations.

I would imagine that this algorithm would have interesting behavior when trying to mutually orthogonalize N+1 vectors in an N-dimensional space: it would attempt to distribute the vectors over the surface of the N-dimensional unit hypersphere with the arrow heads as far from one another as possible. Ditto for N+2, N+3, etc., vectors.
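The greedy step above can be sketched in a few lines of NumPy. This is my own minimal reading of the procedure, not code from the post; the function name `greedy_orthogonalize`, the cosine tolerance, and the iteration cap are arbitrary choices:

```python
import numpy as np

def greedy_orthogonalize(vecs, tol=1e-12, max_iter=10000):
    """Sketch of the greedy pairwise scheme: repeatedly find the pair of
    vectors that is least orthogonal (largest |cos angle|, i.e. angle
    furthest from 90 degrees), replace the pair with the sum and
    difference of their normalizations, and repeat.  The sum and
    difference of two unit vectors are always orthogonal, since
    (a+b).(a-b) = |a|^2 - |b|^2 = 0 when |a| = |b| = 1."""
    V = [np.asarray(v, dtype=float) for v in vecs]
    n = len(V)
    for _ in range(max_iter):
        # Find the pair with the largest |cosine| between them.
        worst, wi, wj = 0.0, -1, -1
        for i in range(n):
            for j in range(i + 1, n):
                c = abs(V[i] @ V[j]) / (np.linalg.norm(V[i]) * np.linalg.norm(V[j]))
                if c > worst:
                    worst, wi, wj = c, i, j
        if worst < tol:          # every pair is essentially orthogonal
            break
        a = V[wi] / np.linalg.norm(V[wi])   # normalize the offending pair
        b = V[wj] / np.linalg.norm(V[wj])
        V[wi], V[wj] = a + b, a - b         # orthogonal replacement pair
    return V
```

Each pass makes one pair exactly orthogonal while possibly disturbing the angles to the remaining vectors, which is why the scheme needs to iterate, much like one-sided Jacobi.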
participants (1)
- Henry Baker