[math-fun] cubic quaternions
I've been wondering about doing a cubic analog of the quaternions. The idea is to have two generators Q and R with Q = cbrt(q) and R = cbrt(r), and the non-commutative multiplication rule R*Q = w Q*R (where w = cbrt(1) = (-1 + i sqrt3)/2 = e^(2 pi i/3)). For definiteness, I'm imagining q=2 and r=3.

The result seems to be a nine-dimensional space: linear combinations of 1, Q, Q^2, R, R^2, Q R, Q^2 R, Q R^2, Q^2 R^2, with coefficients of the shape a + b w. The type of construction seems to guarantee associativity, needing only to check things like R Q^3 = Q^3 R -- required, since Q^3 is in the ground field, which should commute with R, and true, since we pick up three factors of w (a total of w^3 = 1) when we move the R through the Qs while sorting the factors.

There are some nice properties like (Q+R)^3 = Q^3 + R^3, which happens because cross terms like QRR come in three orders and, when reordered and collected, have coefficient 1 + w + w^2 = 0.

I haven't proved "no zero divisors", which is an important theorem in the quaternion case. Nor have I worked out the norm formula, which is needed to compute reciprocals. One proof of the Four Square Theorem (every number is the sum of four squares) uses quaternion arithmetic. Perhaps there's a proof of the Nine Cube Theorem lurking somewhere (with q=r=1 ?).

Has anyone seen this stuff before?

Rich  rcs@cs.arizona.edu
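[A small computational sketch may make the bookkeeping above concrete. The Python code below is an editor's illustration, not part of the original post; the names, data layout, and helper functions are assumptions of this sketch. Elements are stored on the basis Q^a R^b (0 <= a, b < 3) with coefficients c0 + c1 w reduced via w^2 = -1 - w; products use the exchange rule R^b Q^c = w^(bc) Q^c R^b and the reductions Q^3 = q, R^3 = r. It checks (Q+R)^3 = Q^3 + R^3 for q=2, r=3.]

# Sketch: multiplication in the proposed nine-dimensional algebra with
# basis Q^a R^b (0 <= a, b < 3), where Q^3 = q, R^3 = r, R*Q = w*Q*R,
# and w is a primitive cube root of 1.  Coefficients are Eisenstein
# integers c0 + c1*w, reduced with w^2 = -1 - w.

Q_CUBE, R_CUBE = 2, 3          # the q = 2, r = 3 of the post

def emul(x, y):
    """Multiply two Eisenstein integers given as pairs (c0, c1)."""
    (a, b), (c, d) = x, y
    # (a + b w)(c + d w) = ac + (ad + bc) w + bd w^2,  with w^2 = -1 - w
    return (a*c - b*d, a*d + b*c - b*d)

def eadd(x, y):
    return (x[0] + y[0], x[1] + y[1])

def escale(x, n):
    """Multiply an Eisenstein integer by an ordinary integer n."""
    return (n * x[0], n * x[1])

W_POW = {0: (1, 0), 1: (0, 1), 2: (-1, -1)}   # w^k as Eisenstein integers

def mul(u, v):
    """Multiply two elements, each a dict {(a, b): Eisenstein coefficient}."""
    out = {}
    for (a, b), cu in u.items():
        for (c, d), cv in v.items():
            coeff = emul(emul(cu, cv), W_POW[(b * c) % 3])   # R^b Q^c = w^(bc) Q^c R^b
            coeff = escale(coeff, Q_CUBE ** ((a + c) // 3))  # reduce Q^3 -> q
            coeff = escale(coeff, R_CUBE ** ((b + d) // 3))  # reduce R^3 -> r
            key = ((a + c) % 3, (b + d) % 3)
            out[key] = eadd(out.get(key, (0, 0)), coeff)
    return {k: c for k, c in out.items() if c != (0, 0)}

# Check (Q + R)^3 = Q^3 + R^3 = q + r:
Q = {(1, 0): (1, 0)}
R = {(0, 1): (1, 0)}
S = {(1, 0): (1, 0), (0, 1): (1, 0)}          # Q + R
print(mul(mul(S, S), S))                      # {(0, 0): (5, 0)}, i.e. 2 + 3

[The same routine could be used to tabulate the full 9x9 multiplication table, which is what one would feed into a search for the norm form and for zero divisors.]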
Message quoted by: Richard Schroeppel <rcs@CS.Arizona.EDU>:
I've been wondering about doing a cubic analog of the quaternions. [...] Has anyone seen this stuff before?
Yes ... I wrote an article long ago, "On Matrices which Anticommute with a Hamiltonian," and later on there was another, "Symmetry Adapted Functions Belonging to the Dirac Groups." I won't attempt references because they were in relatively obscure journals, and there are both earlier and better references -- Cayley, for example, soon after Hamilton.

The interesting idea lying behind all this is the question of finding a basis for matrices, given a basis for their vector space. The obvious basis consists of matrices Eij whose (m,n) elements are delta(i,m) delta(j,n), and for which any matrix is sum[m(i,j) Eij]. After the Cayley-Hamilton theorem and the spectral theorem, it is natural to think of column-row products of left and right eigenvectors, which are transforms of the Eij in the coordinate system which diagonalizes some given matrix, allowance being made for degeneracy and incompleteness. All this is standard linear algebra lore. Significant aspects are that matrices which commute with the referent have the same eigenvectors, and that preserving eigenvalues is done by intertwining.

The Cayley-Hamilton theorem sets up half a basis for matrices, because the matrices which commute with the referent are polynomials in the referent modulo the characteristic polynomial. What would be a good choice for the other half of the basis? One choice would be to take a second referent together with its polynomials, giving a basis of products of powers. But noncommutativity means there are far too many mixed powers, although they can be reduced to canonical form with structure constants.

However, the structure constants are greatly simplified if the referent pair satisfy an exchange relation A B = omega B A with a scalar omega. For finite matrices, omega is a root of unity; for infinite matrices the slightly different form A B = B A + C leads to ladder operators, coherent states, and the like. For finite matrices, omega = -1 is a nice choice and leads to quaternions, although for high-dimensional spaces you get gamma matrices, and in general it is finding irreducible representations of the "Dirac groups" that has to be worked out. For omega = (primitive cube root of unity), you get an approach to Richard's question.

- hvm
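[For the omega = primitive-cube-root case, one concrete referent pair satisfying the exchange relation is the standard 3x3 "clock" and "shift" matrices, the degree-3 analog of a Pauli pair. The numpy check below is an editor's illustration, not taken from either message; it verifies Z X = omega X Z, that X^3 = Z^3 = I, and that the nine products X^a Z^b span all 3x3 matrices, i.e. they supply the "other half of the basis" asked about above.]

# A minimal sketch, assuming the clock/shift pair as the referents:
import numpy as np

omega = np.exp(2j * np.pi / 3)

X = np.roll(np.eye(3), 1, axis=0)             # shift: X e_j = e_{j+1 mod 3}
Z = np.diag([1, omega, omega**2])             # clock: Z e_j = omega^j e_j

assert np.allclose(Z @ X, omega * (X @ Z))    # the exchange relation A B = omega B A
assert np.allclose(np.linalg.matrix_power(X, 3), np.eye(3))
assert np.allclose(np.linalg.matrix_power(Z, 3), np.eye(3))

# The nine products X^a Z^b (0 <= a, b < 3) are linearly independent,
# hence a basis of the 3x3 matrices.
basis = np.array([(np.linalg.matrix_power(X, a) @ np.linalg.matrix_power(Z, b)).ravel()
                  for a in range(3) for b in range(3)])
print(np.linalg.matrix_rank(basis))           # 9

[Rescaling, Q = cbrt(2) X and R = cbrt(3) Z satisfy Q^3 = 2, R^3 = 3, and R Q = w Q R, so they give one matrix realization of Richard's nine-dimensional algebra over the complex numbers, where it of course has zero divisors; whether the version with coefficients a + b w does is exactly the question left open in the original post.]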