There is a formula, involving Gram determinants, giving the distance between two disjoint affine subspaces. Recall that if e_1, ..., e_m are vectors in R^n (m <= n), then Gram(e_1, ..., e_m) = det(<e_i, e_j>), where <e_i, e_j> is the inner product of e_i and e_j.

If U_1 and U_2 are disjoint affine subspaces, let V_1 and V_2 be their linear directions, that is, the unique vector spaces such that U_1 = a_1 + V_1 and U_2 = a_2 + V_2, for any a_1 \in U_1 and any a_2 \in U_2. Pick any basis e_1, ..., e_m of V_1 + V_2; then the square of the distance d(U_1, U_2) between U_1 and U_2 is given by

  d(U_1, U_2)^2 = Gram(a_1 - a_2, e_1, ..., e_m) / Gram(e_1, ..., e_m).

In the special case where U_1 and U_2 are skew lines (i.e., disjoint and not parallel), we also have the formula

  d(U_1, U_2)^2 = Gram(a - a', a - b, a' - b') / Gram(a - b, a' - b'),

where a, b are any two distinct points on U_1 and a', b' are any two distinct points on U_2.

The above formula can be found in Berger, Geometry I, Chapter 9, Section 2. I found the more general formula in "Methodes Modernes en Geometrie", by Jean Fresnel, Part C, Section 1.4.2. The proof is not entirely trivial. See also Problem 7.13 of my "Geometric Methods and Applications", TAM 38, Springer-Verlag.

Best,
-- Jean Gallier
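For what it's worth, here is a small numerical sketch of the general formula in Python with NumPy. The function names (gram_det, affine_distance) and the use of an SVD to extract a basis of V_1 + V_2 from possibly redundant direction vectors are my own choices, not anything from the references above:

```python
import numpy as np

def gram_det(vectors):
    """Gram determinant det(<e_i, e_j>) of the rows of `vectors`."""
    M = np.asarray(vectors, dtype=float)
    return np.linalg.det(M @ M.T)

def affine_distance(a1, V1, a2, V2):
    """Distance between the affine subspaces a1 + span(V1) and a2 + span(V2),
    computed as d^2 = Gram(a1 - a2, e_1, ..., e_m) / Gram(e_1, ..., e_m),
    where e_1, ..., e_m is a basis of V1 + V2.
    V1, V2 are given as lists of direction (spanning) vectors."""
    a1, a2 = np.asarray(a1, float), np.asarray(a2, float)
    # Stack all direction vectors and extract a basis of V1 + V2
    # via a rank-revealing SVD (the basis rows are orthonormal).
    D = np.array(list(V1) + list(V2), dtype=float)
    _, s, vt = np.linalg.svd(D)
    basis = vt[: int(np.sum(s > 1e-10))]
    num = gram_det(np.vstack([a1 - a2, basis]))
    den = gram_det(basis)
    return np.sqrt(num / den)

# Skew lines in R^3: the x-axis, and the line through (0, 0, 1)
# with direction (0, 1, 0); their distance is 1.
d_skew = affine_distance([0, 0, 0], [[1, 0, 0]],
                         [0, 0, 1], [[0, 1, 0]])

# Parallel planes z = 0 and z = 2 in R^3; their distance is 2.
d_par = affine_distance([0, 0, 0], [[1, 0, 0], [0, 1, 0]],
                        [0, 0, 2], [[1, 0, 0], [0, 1, 0]])
```

For the skew-line example, the special formula gives the same answer: with a = (0,0,0), b = (1,0,0), a' = (0,0,1), b' = (0,1,1), one gets Gram(a - a', a - b, a' - b') / Gram(a - b, a' - b') = 1/1 = 1.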