From: Henry Baker <hbaker1@pipeline.com>
One of my intuitions re numbers is that analysts worry about the high-order bits of numbers, while number theorists worry about the low-order bits. For example, some quick compositeness tests look only at the low-order digits of numbers.
Name two such tests please? "Is the number even" comes to mind (assuming that when you say 'digits' after you've previously said 'bits' that you did in fact mean 'bits' again), but I'm not really getting a second test. Would you care to demonstrate using the numbers 0xFFFCA5000000000000000000000000000000000000000000000000000000000000000001 0xFFFC8B000000000000000000000000000000000000000000000000000000000000000001 ? A "but I did actually mean digits" wiggle should be avoided.
The usual GCD algorithm worries about reducing the length of the numbers, but the so-called "binary algorithm" (see Knuth if you need to know more about this) worries about reducing the number of non-zero bits in the numbers.
No it doesn't. Unless you'd like to explain to me how gcd(0b10010001, 0b100000101) = 0b11101 involves reducing the number of non-zero bits in the numbers. It simply removes an extremal set bit, not caring what happens to any other bits. In particular, the lowest set bit. The traditional method ineffectually (depending on quite which version you use) attempts to remove a top set bit. (The whole binary algorithm can be summarised as "get rid of 0s, then get rid of the lowest 1, lather, rinse, repeat".)

Phil

() ASCII ribbon campaign () Hopeless ribbon campaign
/\ against HTML mail /\ against gratuitous bloodshed
[stolen with permission from Daniel B. Cristofani]
At 04:01 PM 4/19/2006, Phil Carmody wrote:
From: Henry Baker <hbaker1@pipeline.com>
One of my intuitions re numbers is that analysts worry about the high-order bits of numbers, while number theorists worry about the low-order bits. For example, some quick compositeness tests look only at the low-order digits of numbers.
Name two such tests please? "Is the number even" comes to mind (assuming that when you say 'digits' after you've previously said 'bits' that you did in fact mean 'bits' again), but I'm not really getting a second test. Would you care to demonstrate using the numbers 0xFFFCA5000000000000000000000000000000000000000000000000000000000000000001 0xFFFC8B000000000000000000000000000000000000000000000000000000000000000001 ? A "but I did actually mean digits" wiggle should be avoided.
The operative word here is "intuition". Yes, I did mean low order thingy's -- bits, digits, whatevers. If the low-order decimal digit is 2, 4, 6, 8, 0, then the number is divisible by 2; if the low-order decimal digit is 5, 0, then the number is divisible by 5. If you take a larger number of low-order digits, you can exclude more composites. I got this intuition from the types of equations that linear Diophantine analysis can solve -- the things that seem to be important aren't the sizes of the integers, but properties that analysts tend to wash away when the numbers are approximated -- i.e., information stored in the low-order bits.
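The two last-digit tests Henry describes can be sketched in a few lines (a minimal illustration in Python; the function names are mine, not from the thread):

```python
def divisible_by_2(n: int) -> bool:
    # Only the low-order decimal digit matters: 0, 2, 4, 6, 8.
    return n % 10 in (0, 2, 4, 6, 8)

def divisible_by_5(n: int) -> bool:
    # Low-order decimal digit 0 or 5.
    return n % 10 in (0, 5)

# Taking more low-order digits excludes more composites:
# the last two decimal digits determine divisibility by 4 (and 25).
def divisible_by_4(n: int) -> bool:
    return (n % 100) % 4 == 0

print(divisible_by_2(38), divisible_by_5(38), divisible_by_4(38))  # True False False
```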
The usual GCD algorithm worries about reducing the length of the numbers, but the so-called "binary algorithm" (see Knuth if you need to know more about this) worries about reducing the number of non-zero bits in the numbers.
No it doesn't. Unless you'd like to explain to me how gcd(0b10010001,0b100000101) = 0b11101 involves reducing the number of non-zero bits in the numbers.
It simply removes an extremal set bit, not caring what happens to any other bits. In particular, the lowest set bit. The traditional method ineffectually (depending on quite which version you use) attempts to remove a top set bit. (The whole binary algorithm can be summarised as "get rid of 0s, then get rid of the lowest 1, lather, rinse, repeat".)
You're being too literal. I'm talking about intuition, here. The GCD algorithm is remarkably robust to a large number of variations, some of which converge faster than others. You can mash two numbers together, and as long as you make "progress" in some dimension -- one of the fastest is to reduce the "length" of the numbers, where "length" is defined to be the distance between the high-order bit/digit and the low-order bit/digit -- you will eventually terminate with the GCD. The so-called "binary" GCD in Knuth is merely an extreme along one of these dimensions; the "usual" (Euclidean) GCD being an extreme along another dimension.
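The binary GCD under discussion can be sketched directly from Phil's summary -- drop trailing zero bits, then subtract to clear the lowest set bit, and repeat (an illustrative Python sketch, not Knuth's exact presentation):

```python
def binary_gcd(a: int, b: int) -> int:
    if a == 0:
        return b
    if b == 0:
        return a
    # Factor out the power of 2 common to both numbers.
    shift = 0
    while (a | b) & 1 == 0:
        a >>= 1
        b >>= 1
        shift += 1
    while b:
        # "Get rid of 0s": drop trailing zero bits so both are odd.
        while a & 1 == 0:
            a >>= 1
        while b & 1 == 0:
            b >>= 1
        # "Get rid of the lowest 1": the difference of two odd
        # numbers is even, so subtraction clears the low set bit.
        if a > b:
            a, b = b, a
        b -= a
    return a << shift

# Phil's example: gcd(0b10010001, 0b100000101) = 0b11101 (= 29).
print(bin(binary_gcd(0b10010001, 0b100000101)))  # 0b11101
```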
The only composites you can eliminate by looking at the low order digits, in any base, are those divisible by the primes dividing the base; and if you supplement with checking that the number is not itself equal to that prime, only the single low order digit need be looked at. Looking at more low order digits lets you determine divisibility by higher powers of those primes, but nothing really new.

Franklin T. Adams-Watters

-----Original Message-----
From: Henry Baker hbaker1@pipeline.com

The operative word here is "intuition". Yes, I did mean low order thingy's -- bits, digits, whatevers. If the low-order decimal digit is 2, 4, 6, 8, 0, then the number is divisible by 2; if the low-order decimal digit is 5, 0, then the number is divisible by 5. If you take a larger number of low-order digits, you can exclude more composites.
participants (3)
- franktaw@netscape.net
- Henry Baker
- Phil Carmody