[math-fun] multiplication/division over GF(2^k)
In this day & age of fast CPU's & GPU's, is there any hardware support for multiplication/division over GF(2^k)? Alternatively, can the instructions for integer multiplication/division be "hacked" (in the HAKMEM sense of the word) to do multiplication/division over GF(2^k) relatively efficiently? I'd be happy with multiplying arbitrary polynomials with coefficients in GF(2)=Z/2Z, and long division of such polynomials to produce a quotient & a remainder polynomial.
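A minimal illustrative sketch of the straightforward shift-and-XOR approach (not from either poster): a polynomial over GF(2) is packed into a machine word, bit i holding the coefficient of x^i, and products are assumed to fit in 64 bits.

#include <stdint.h>

/* carryless multiplication in GF(2)[x]: XOR together shifted copies of a */
static uint64_t gf2_mul(uint64_t a, uint64_t b)
{
    uint64_t r = 0;
    while (b) {
        if (b & 1) r ^= a;   /* "add" = XOR */
        a <<= 1;
        b >>= 1;
    }
    return r;
}

/* long division in GF(2)[x]: a = q*b + r with deg(r) < deg(b); b must be nonzero */
static uint64_t gf2_divmod(uint64_t a, uint64_t b, uint64_t *rem)
{
    int db = 63;
    while (db > 0 && !((b >> db) & 1)) db--;   /* db = deg(b) */
    uint64_t q = 0;
    for (int i = 63 - db; i >= 0; i--) {
        if ((a >> (i + db)) & 1) {   /* leading term x^(i+db) present */
            a ^= b << i;             /* "subtract" = XOR b * x^i */
            q |= (uint64_t)1 << i;
        }
    }
    *rem = a;
    return q;
}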
* Henry Baker <hbaker1@pipeline.com> [May 27. 2012 18:14]:
In this day & age of fast CPU's & GPU's, is there any hardware support for multiplication/division over GF(2^k)?
CPU: the new CLMUL instruction, see http://en.wikipedia.org/wiki/CLMUL_instruction_set
GPU: None that I know about and I'd be surprised if this would even be considered.
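For reference, a hedged sketch of how that CLMUL extension is reached from C: a 64x64 -> 128-bit carryless multiply through the _mm_clmulepi64_si128 intrinsic (PCLMULQDQ). It assumes a CPU with the pclmul feature and a compiler flag such as gcc/clang's -mpclmul.

#include <stdint.h>
#include <immintrin.h>   /* _mm_clmulepi64_si128 (PCLMULQDQ) */

/* out[0] = low 64 bits, out[1] = high 64 bits of the carryless product a*b */
static void clmul64(uint64_t a, uint64_t b, uint64_t out[2])
{
    __m128i va = _mm_set_epi64x(0, (long long)a);
    __m128i vb = _mm_set_epi64x(0, (long long)b);
    __m128i p  = _mm_clmulepi64_si128(va, vb, 0x00);  /* low qwords of both operands */
    _mm_storeu_si128((__m128i *)out, p);
}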
Alternatively, can the instructions for integer multiplication/division be "hacked" (in the HAKMEM sense of the word) to do multiplication/division over GF(2^k) relatively efficiently?
I'd be surprised if so (all I could come up with a few years ago was so bad that I happily forgot it). I came up only with the rather straightforward stuff now given in the fxtbook (but I have seen slightly better routines by Paul Zimmermann (et al.?); my slight improvement to one of these had no effect because the compiler's optimizer already grokked it).
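One folklore trick in this direction (not necessarily what either poster had in mind, and only a sketch): split the 32-bit operands into four interleaved bit lanes so that no column of the schoolbook product collects more than eight ones; the carries then die out before reaching the next bit of the same lane, and masking recovers the low half of the carryless product. Variants of this appear in constant-time GHASH implementations (e.g., BearSSL).

#include <stdint.h>

/* low 32 bits of the carryless product of x and y, via the integer multiplier */
static uint32_t clmul32_lo(uint32_t x, uint32_t y)
{
    uint32_t x0 = x & 0x11111111u, x1 = x & 0x22222222u;
    uint32_t x2 = x & 0x44444444u, x3 = x & 0x88888888u;
    uint32_t y0 = y & 0x11111111u, y1 = y & 0x22222222u;
    uint32_t y2 = y & 0x44444444u, y3 = y & 0x88888888u;
    /* each product in z_k puts its "useful" bits at positions = k (mod 4) */
    uint32_t z0 = (x0 * y0) ^ (x1 * y3) ^ (x2 * y2) ^ (x3 * y1);
    uint32_t z1 = (x0 * y1) ^ (x1 * y0) ^ (x2 * y3) ^ (x3 * y2);
    uint32_t z2 = (x0 * y2) ^ (x1 * y1) ^ (x2 * y0) ^ (x3 * y3);
    uint32_t z3 = (x0 * y3) ^ (x1 * y2) ^ (x2 * y1) ^ (x3 * y0);
    return (z0 & 0x11111111u) | (z1 & 0x22222222u)
         | (z2 & 0x44444444u) | (z3 & 0x88888888u);
}

For 64-bit words one would need more lanes (or CLMUL itself), since four lanes would let a column collect sixteen ones and the carry-confinement argument breaks down.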
I'd be happy with multiplying arbitrary polynomials with coefficients in GF(2)=Z/2Z, and long division of such polynomials to produce a quotient & a remainder polynomial.
I recently noticed that the current version of Sage seems to be _very_ efficient with operations in GF(2^n) (much less so in other small-characteristic fields). Pari/gp is fine as well. Both do efficient _modular_ ops, possibly using Montgomery multiplication.
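For concreteness, a simple illustration of such a modular op (plain shift-and-reduce, not the Montgomery form mentioned above, and not how Sage or Pari implement it): multiplication in GF(2^8) with the AES reduction polynomial x^8 + x^4 + x^3 + x + 1.

#include <stdint.h>

/* multiply in GF(2^8) = GF(2)[x] / (x^8 + x^4 + x^3 + x + 1) */
static uint8_t gf256_mul(uint8_t a, uint8_t b)
{
    uint8_t r = 0;
    for (int i = 0; i < 8; i++) {
        if (b & 1) r ^= a;            /* add a * x^i */
        uint8_t hi = a & 0x80;
        a <<= 1;                      /* a *= x */
        if (hi) a ^= 0x1B;            /* reduce: x^8 = x^4 + x^3 + x + 1 */
        b >>= 1;
    }
    return r;
}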
Re GPU's: With Bill Dally as Chief Scientist at nVidia, one might expect CLMUL to show up in future GPU's...
At 10:48 AM 5/27/2012, Joerg Arndt wrote:
* Henry Baker <hbaker1@pipeline.com> [May 27. 2012 18:14]:
In this day & age of fast CPU's & GPU's, is there any hardware support for multiplication/division over GF(2^k)?
CPU: the new CLMUL instruction, see http://en.wikipedia.org/wiki/CLMUL_instruction_set
GPU: None that I know about and I'd be surprised if this would even be considered.
participants (2)
- Henry Baker
- Joerg Arndt