Re Dan's questions: As I pointed out in my original post, this problem can be solved with the SVD (Singular Value Decomposition) of the input matrix. The answer is the first row of each of the left & right SVD factors. The other rows of these orthogonal factors correspond to the non-principal singular values, which would be zero with no round-off error, and extremely small with round-off error. What I wanted was a very simple (and hopefully completely *rational* -- not possible with SVD) solution (which I have demonstrated). At 06:39 PM 2/3/2021, Dan Asimov wrote:
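The SVD recipe above can be sketched numerically; a minimal illustration assuming NumPy (the particular vectors u and v below are made-up test data):

```python
# Sketch of the SVD approach: for a rank-1 matrix M = u' v, the leading
# singular triple recovers the two factors up to the usual scalar ambiguity.
import numpy as np

u = np.array([2.0, -1.0, 3.0])
v = np.array([1.0, 4.0, -2.0])
M = np.outer(u, v)                      # rank-1 by construction

U, s, Vt = np.linalg.svd(M)
u_hat = U[:, 0] * s[0]                  # fold the singular value into one factor
v_hat = Vt[0, :]                        # first row of the right factor

assert np.allclose(np.outer(u_hat, v_hat), M)   # reconstructs M
assert np.allclose(s[1:], 0.0, atol=1e-10)      # non-principal singular values ~ 0
```

Note u_hat, v_hat differ from u, v by exactly the alpha ambiguity Henry's original post describes (and possibly a sign).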
-----
Find the (real) vectors U, V whose outer product makes the (real) matrix; i.e., U' V = M.
Clearly, for any real alpha, (U/alpha)' (alpha*V) = M, so the vectors are determined only up to a single constant factor.
-----
I'm probably missing the obvious, but I wondered about that.
It's true that Henry is starting from the assumption that there exists a solution.
But instead: Suppose we want to try to solve for x, y, z, w the equation

  (x y)' (z w) = (A B)
                 (C D)
I.e., the four equations xz = A, xw = B, yz = C, yw = D for randomly chosen A, B, C, D (real or complex).
When does there exist a solution?
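One way to probe the 2x2 question numerically: multiplying the equations in pairs gives (xz)(yw) = (xw)(yz), i.e. AD = BC, so a solution can exist only when det M = 0 (M has rank at most 1) -- which fails for randomly chosen A, B, C, D. A hedged sketch assuming NumPy (the helper name rank1_factor_2x2 is mine, not from the thread):

```python
# For xz=A, xw=B, yz=C, yw=D to be consistent we need (xz)(yw) = (xw)(yz),
# i.e. AD = BC; when that holds, the SVD hands back one factorization.
import numpy as np

def rank1_factor_2x2(A, B, C, D, tol=1e-12):
    """Return (x, y, z, w) with xz=A, xw=B, yz=C, yw=D, or None if AD != BC."""
    if abs(A * D - B * C) > tol:
        return None                     # det M != 0: rank 2, no solution
    M = np.array([[A, B], [C, D]], dtype=float)
    U, s, Vt = np.linalg.svd(M)
    x, y = U[:, 0] * s[0]               # left factor, scaled by sigma_1
    z, w = Vt[0, :]                     # right factor
    return x, y, z, w

print(rank1_factor_2x2(1.0, 2.0, 3.0, 4.0))   # AD - BC = -2: prints None
x, y, z, w = rank1_factor_2x2(2.0, 4.0, 3.0, 6.0)  # rows proportional: rank 1
```

For generic (random) entries AD - BC is nonzero, so the answer to "when does a solution exist?" is: almost never.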
And what about the general case in higher dimensions:
Let v, w be unknown vectors in K^n (K = R or C) with v' w = M, an arbitrary n x n matrix of constants in K.
When does there exist a solution for v and w ???
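In higher dimensions the same obstruction appears: v' w is always a rank-1 (or zero) matrix, so a solution exists exactly when rank(M) <= 1. A quick numeric check, assuming NumPy (the example matrices are made up):

```python
# v' w has rank at most 1, so solvability of v' w = M is a rank condition
# that np.linalg.matrix_rank can test directly.
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(5)
w = rng.standard_normal(5)

M_rank1 = np.outer(v, w)                # solvable by construction
M_random = rng.standard_normal((5, 5))  # generically full rank: unsolvable

print(np.linalg.matrix_rank(M_rank1))   # 1
print(np.linalg.matrix_rank(M_random))  # 5
```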
--Dan