Before log tables were invented, there were a couple of other tricks that were useful for multiplication. One was to use trig identities such as cos(X) * cos(Y) = (cos(X+Y) + cos(X-Y))/2. Another was to use a table of triangular numbers, T(x) = x(x+1)/2, via x * y = T(x+y) - T(x) - T(y), or of quarter-squares, Q(N) = floor[N^2 / 4], via x * y = Q(x+y) - Q(x-y). The key idea in each case is to use a table that's a function of one variable in a clever way to compute a two-variable result. (I'm skeptical about these actually being used much, since the required tables are still pretty big. But I've never actually looked over Mr. Fibonacci's shoulder.)

I'm looking for an appropriate generalization for GF2 polynomial multiplication (multiplication of polynomials whose coefficients are reduced mod 2). Each of the methods above uses division by 2, so some minor variation is necessary. There are about six different ways to do GF2 polynomial products; I'm wondering if the scheme below is new.

Define F(X), a kind of T-surrogate, as follows. Represent X as a block of bits: X = t^3 + t^2 + 1 --> 1101. For each pair of 1 bits, accumulate the product of the corresponding powers of t (using xor) into a total. This total is F(X). In the example, the terms to be xored are 1000 * 1, 100 * 1, and 1000 * 100; the total is 101100 (representing t^5 + t^3 + t^2).

Then it appears that

  X * Y = (X&Y)^2 + F(XvY) + F(X&Y) + F(X&~Y) + F(Y&~X)

where + means xor and the logic operators & v ~ are applied to the right-justified bit patterns. The squaring operation is fast in GF2: simply spread the bits out, interleaving with 0s. (Squaring is a linear operator in GF2.)

Has anyone seen anything like this? A couple of quick verification sketches follow below.
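First, in case a concrete check of the classical identities is useful, here is a small Python sketch (just my own illustration for this post, nothing historical) that verifies the triangular-number and quarter-square tricks on small integers:

def T(x):
    # table of triangular numbers
    return x * (x + 1) // 2

def Q(n):
    # table of quarter-squares; works for negative n too, since n*n >= 0
    return n * n // 4

for x in range(50):
    for y in range(50):
        assert x * y == T(x + y) - T(x) - T(y)
        assert x * y == Q(x + y) - Q(x - y)

Note that each table is indexed by a single value (x+y, x-y, etc.), which is the whole point: one-variable tables doing a two-variable job.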
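Second, a brute-force Python check of the GF2 scheme itself, using plain integers as bit vectors for the polynomials. The helpers gf2_mul and gf2_square are just my own reference routines for the sketch, and ^ plays the role of + above:

def gf2_mul(x, y):
    # ordinary shift-and-xor (carry-less) GF2 polynomial product, for reference
    r = 0
    while y:
        if y & 1:
            r ^= x
        x <<= 1
        y >>= 1
    return r

def gf2_square(x):
    # squaring in GF2: spread the bits out, interleaving with 0s
    r, i = 0, 0
    while x:
        if x & 1:
            r |= 1 << (2 * i)
        x >>= 1
        i += 1
    return r

def F(x):
    # the T-surrogate: xor of t^(i+j) over all pairs i < j of 1 bits of x
    bits = [i for i in range(x.bit_length()) if (x >> i) & 1]
    r = 0
    for a in range(len(bits)):
        for b in range(a + 1, len(bits)):
            r ^= 1 << (bits[a] + bits[b])
    return r

for x in range(256):
    for y in range(256):
        lhs = gf2_mul(x, y)
        rhs = gf2_square(x & y) ^ F(x | y) ^ F(x & y) ^ F(x & ~y) ^ F(y & ~x)
        assert lhs == rhs

If the asserts hold for all pairs of 8-bit operands, that's at least reasonable evidence the identity isn't a fluke for small cases; and F, like T and Q, is a table in one variable only.

Rich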