From: Henry Baker <hbaker1@pipeline.com>
> One of my intuitions re numbers is that analysts worry about the high-order bits of numbers, while number theorists worry about the low-order bits. For example, some quick compositeness tests look only at the low-order digits of numbers.
Name two such tests, please? "Is the number even" comes to mind (assuming that when you said 'digits' after having previously said 'bits', you did in fact mean 'bits' again), but I'm not really getting a second test. Would you care to demonstrate using the numbers

0xFFFCA5000000000000000000000000000000000000000000000000000000000000000001
0xFFFC8B000000000000000000000000000000000000000000000000000000000000000001

? A "but I did actually mean digits" wiggle should be avoided.
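[For concreteness, the one low-order test Phil concedes (evenness, from the low bit) plus its decimal cousin (divisibility by 5, from the last digit) might be sketched as follows. This is a hypothetical illustration, not code from the thread; the function name is made up.]

```python
def low_order_composite_screen(n):
    """Cheap compositeness screens using only low-order digits.

    In binary, the lowest bit detects factors of 2; in decimal,
    the last digit detects factors of 2 and 5.  Returns True only
    when n is provably composite; False means "inconclusive".
    """
    if n > 2 and (n & 1) == 0:
        return True   # low bit clear: even, hence composite
    if n > 5 and n % 10 in (0, 5):
        return True   # last decimal digit 0 or 5: divisible by 5
    return False      # n may still be composite; this test can't tell
```

Note that both of Phil's hex numbers end in ...01, so a test of this kind says nothing about them, which is presumably the point of the challenge.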
> The usual GCD algorithm worries about reducing the length of the numbers, but the so-called "binary algorithm" (see Knuth if you need to know more about this) worries about reducing the number of non-zero bits in the numbers.
No it doesn't. Unless you'd like to explain to me how gcd(0b10010001, 0b100000101) = 0b11101 involves reducing the number of non-zero bits in the numbers. It simply removes one extremal set bit (in particular, the lowest set bit), not caring what happens to any other bits. The traditional method, depending on quite which version you use, ineffectually attempts to remove a top set bit. (The whole binary algorithm can be summarised as "get rid of 0s, then get rid of the lowest 1, lather, rinse, repeat".)

Phil

() ASCII ribbon campaign        () Hopeless ribbon campaign
/\ against HTML mail            /\ against gratuitous bloodshed
[stolen with permission from Daniel B. Cristofani]
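[Phil's summary of the binary GCD, "get rid of 0s, then get rid of the lowest 1, lather, rinse, repeat", can be sketched as below. This is an illustrative implementation, not code from the thread; it reproduces his example gcd(0b10010001, 0b100000101) = 0b11101.]

```python
def binary_gcd(a, b):
    """Binary GCD (Stein's algorithm), phrased Phil's way."""
    if a == 0:
        return b
    if b == 0:
        return a
    # Factor out the powers of 2 common to both operands.
    shift = 0
    while ((a | b) & 1) == 0:
        a >>= 1
        b >>= 1
        shift += 1
    # "Get rid of 0s": make a odd.
    while (a & 1) == 0:
        a >>= 1
    while b:
        # "Get rid of 0s" in b.
        while (b & 1) == 0:
            b >>= 1
        # Both odd now, so their difference is even: subtracting
        # the smaller clears the lowest set bit of the result,
        # whatever happens to the higher bits.  "Lather, rinse."
        if a > b:
            a, b = b, a
        b -= a
    return a << shift

binary_gcd(0b10010001, 0b100000101)  # 0b11101, i.e. 29
```

Each pass through the loop removes the lowest set bit and shifts away the zeros it exposes, exactly as described; nothing bounds the number of set bits elsewhere in the operands.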