[math-fun] Space for Sieve of Eratosthenes ~ O(n^(1/3))
FYI -- Can anyone provide a 30-second explanation of this algorithm?

https://science.slashdot.org/story/16/09/26/2128254/researcher-modifies-sieve-of-eratosthenes-to-work-with-less-physical-memory-space

Researcher Modifies Sieve of Eratosthenes To Work With Less Physical Memory Space (scientificamerican.com)
Posted by BeauHD on Monday September 26, 2016 @07:30PM from the ones-and-zeros dept.

grcumb writes: Peruvian mathematician Harald Helfgott made his mark on the history of mathematics by solving Goldbach's weak conjecture, which states that every odd number greater than 7 can be expressed as the sum of three prime numbers. Now, according to Scientific American, he's found a better solution to the sieve of Eratosthenes:

"In order to determine with this sieve all primes between 1 and 100, for example, one has to write down the list of numbers in numerical order and start crossing them out in a certain order: first, the multiples of 2 (except the 2); then, the multiples of 3, except the 3; and so on, starting by the next number that had not been crossed out. The numbers that survive this procedure will be the primes. The method can be formulated as an algorithm."

But now, Helfgott has found a method to drastically reduce the amount of RAM required to run the algorithm:

"Now, inspired by combined approaches to the analytical 100-year-old technique called the circle method, Helfgott was able to modify the sieve of Eratosthenes to work with less physical memory space. In mathematical terms: instead of needing a space N, now it is enough to have the cube root of N."

So what will be the impact of this? Will we see cheaper, lower-power encryption devices? Or maybe quicker cracking times in brute force attacks?

Mathematician Jean Carlos Cortissoz Iriarte of Cornell University and Los Andes University offers an analogy: "Let's pretend that you are a computer and that to store data in your memory you use sheets of paper. If to calculate the primes between 1 and 1,000,000, you need 200 reams of paper (10,000 sheets), and with the algorithm proposed by Helfgott you will only need one fifth of a ream (about 100 sheets)," he says.

http://www.scientificamerican.com/article/new-take-on-an-ancient-method-improves-way-to-find-prime-numbers/
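As a quick illustration, the crossing-out procedure in the quoted description amounts to only a few lines of code. This is a sketch of the textbook sieve (for primes up to 100, as in the example), not Helfgott's modification:

```python
# The crossing-out procedure described above, as a short program:
# repeatedly take the smallest surviving number and cross out its
# multiples; whatever survives is prime.

def eratosthenes(limit=100):
    crossed = [False] * (limit + 1)
    for n in range(2, limit + 1):
        if not crossed[n]:                      # n survived, so it is prime
            for m in range(2 * n, limit + 1, n):
                crossed[m] = True               # cross out the multiples of n
    return [n for n in range(2, limit + 1) if not crossed[n]]

print(eratosthenes(100))   # [2, 3, 5, 7, 11, ..., 97]
```

The memory cost is the point at issue: this version keeps one flag per number up to N, i.e. space proportional to N.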
From the professor's comment on Slashdot, it appears to be an incremental sieve that works on the span [N, N+N^(1/3)]; rather than considering all potential prime factors < N^(1/2), it figures out which potential prime factors cannot possibly have a multiple in the range [N, N+N^(1/3)] (which is the great majority of them).
-tom
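For comparison, here is a sketch of the standard segmented sieve over a window [N, N+W). It still generates and loops over all base primes up to sqrt(N+W), so its working space is about N^(1/2); Helfgott's modification reportedly brings the space down to roughly N^(1/3). The function names are illustrative only:

```python
# Sketch: classical segmented Sieve of Eratosthenes over the window [N, N+W).
# Space used: W flags for the window plus the base primes up to sqrt(N+W).

from math import isqrt

def small_primes(limit):
    """Ordinary sieve for the base primes <= limit."""
    crossed = [False] * (limit + 1)
    primes = []
    for n in range(2, limit + 1):
        if not crossed[n]:
            primes.append(n)
            for m in range(n * n, limit + 1, n):
                crossed[m] = True
    return primes

def primes_in_window(N, W):
    """Primes in [N, N+W), using O(W + sqrt(N+W)) memory."""
    base = small_primes(isqrt(N + W - 1))
    is_prime = [True] * W                            # is_prime[i] stands for N + i
    for p in base:
        first = max(p * p, ((N + p - 1) // p) * p)   # first multiple of p in the window
        for m in range(first, N + W, p):
            is_prime[m - N] = False
    return [N + i for i in range(W) if is_prime[i] and N + i > 1]

# Example: primes in a short window just above 10^6.
print(primes_in_window(1_000_000, 100))
```

Note that for a window of length about N^(1/3), most base primes below N^(1/2) have no multiple in the window at all, which is the observation the comment above points to.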
That sounds a lot like one of the observations that we (Lagarias, Odlyzko and I) made in the combinatorial algorithm for counting primes: http://www.ams.org/mcom/1985-44-170/S0025-5718-1985-0777285-5/S0025-5718-198...

Victor
On Tue, Sep 27, 2016 at 11:51 AM, Henry Baker <hbaker1@pipeline.com> wrote:
> So what will be the impact of this?
I don't think it will have any practical or cryptographic effects, though it may lead to other faster algorithms that do, or other theoretically interesting algorithm improvements.
> Will we see cheaper, lower-power encryption devices?
While many encryption algorithms start with "find a couple of large primes", I don't think this is a significant part of the running time of encryption algorithms. You can find candidates quickly by dividing by a few small divisors, then checking that 2^(p-1) = 1 mod p, and then only doing a full primality test for those that aren't ruled out by these two fast tests; that leaves you doing at most a couple of full primality tests. The sieve of Eratosthenes and its improvements are good for finding all the primes in a range fast, but even with the improvements they are slow compared to testing a single number for primality.
> Or maybe quicker cracking times in brute force attacks?
As far as I know, attacks care about fast factoring, not fast primality testing or finding all of the primes in a range.

Andy.latto@pobox.com
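A rough sketch of the filtering order described above: cheap tests first, so the expensive full primality test only runs on numbers that survive them. The helper names are illustrative, and real cryptographic code should rely on a vetted library:

```python
# Cheap filters (trial division, then a base-2 Fermat check) before any
# full primality test.

import random

SMALL_PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]

def survives_cheap_tests(n):
    # 1. Trial division by a few small primes.
    for p in SMALL_PRIMES:
        if n % p == 0:
            return n == p
    # 2. Base-2 Fermat check: if n is prime, then 2^(n-1) = 1 (mod n).
    return pow(2, n - 1, n) == 1

def random_prime(bits, full_test):
    """Draw random odd candidates of the given bit length until one passes
    the cheap filters and then the full primality test."""
    while True:
        n = random.getrandbits(bits) | (1 << (bits - 1)) | 1   # set top bit, force odd
        if survives_cheap_tests(n) and full_test(n):
            return n
```

Here full_test would be a real primality test such as Miller-Rabin; composites that survive the base-2 Fermat check are rare, so the full test runs only a handful of times per prime found.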
For all practical purposes, using a probabilistic primality test, e.g., Miller-Rabin, is sufficient, perhaps after trying some small prime divisors for extra efficiency. You just need a good [non-deterministic] random number generator. You will quickly reduce the false-positive rate to below your computer's error probability.
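A minimal sketch of the Miller-Rabin test mentioned above: each round with a random base lets a composite slip through with probability at most 1/4, so k rounds push the false-positive rate below 4^(-k), quickly far below the error rate of the hardware running the test.

```python
import random

def miller_rabin(n, rounds=40):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):        # quick trial division
        if n % p == 0:
            return n == p
    d, s = n - 1, 0                       # write n - 1 = d * 2^s with d odd
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)    # random base
        x = pow(a, d, n)
        if x == 1 or x == n - 1:
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                  # this base witnesses that n is composite
    return True                           # probably prime

# Example: 2^61 - 1 is a (Mersenne) prime.
print(miller_rabin(2**61 - 1))
```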
participants (5)
- Andy Latto
- Henry Baker
- Mike Speciner
- Tomas Rokicki
- Victor Miller