For all practical purposes, a probabilistic primality test such as Miller-Rabin is sufficient, perhaps after trial division by some small primes for extra efficiency. All you need is a good [non-deterministic] random number generator; a modest number of rounds drives the false-positive rate below your computer's own hardware error probability.

On 27-Sep-16 15:41, Andy Latto wrote:
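[A minimal Python sketch of the test recommended above — trial division by a few small primes, then Miller-Rabin; the particular small-prime list and the 40-round count are illustrative choices, not from the original message:]

```python
import random

def is_probable_prime(n, rounds=40):
    """Miller-Rabin: a composite n survives one random round with
    probability < 1/4, so 40 rounds push the false-positive rate
    below 4**-40, far under any hardware error rate."""
    if n < 2:
        return False
    # Cheap pre-filter: trial division by a few small primes.
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)          # a^d mod n
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False          # a is a witness: n is composite
    return True
```

Python's three-argument pow does the modular exponentiation efficiently, so this is fast even for cryptographic sizes (hundreds of digits).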
On Tue, Sep 27, 2016 at 11:51 AM, Henry Baker <hbaker1@pipeline.com> wrote:
> So what will be the impact of this?

I don't think it will have any practical or cryptographic effects, though it may lead to other, faster algorithms that do, or to other theoretically interesting algorithmic improvements.

> Will we see cheaper, lower-power encryption devices?

While many encryption algorithms start with "find a couple of large primes", I don't think this is a significant part of the running time of encryption algorithms. You can find candidates quickly by trial division by a few small primes, then checking that 2^(p-1) = 1 mod p, and only doing a full primality test on candidates that aren't ruled out by these two fast tests; that leaves you doing at most a couple of full primality tests. Eratosthenes' sieve and its improvements are good for finding all the primes in a range fast, but even the improved versions are slow compared to testing a single number for primality.

> Or maybe quicker cracking times in brute force attacks?

As far as I know, such attacks depend on fast factoring, not on fast primality testing or on finding all of the primes in a range.
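[The two fast filters described above can be sketched as follows — a Python sketch; the particular small-prime list is an illustrative choice:]

```python
SMALL_PRIMES = (3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)

def survives_cheap_tests(n):
    """The two quick filters: trial division by a few small primes,
    then the base-2 Fermat check 2^(n-1) == 1 (mod n).  Only odd
    candidates n > 2 that survive both are worth a full test."""
    if any(n % p == 0 for p in SMALL_PRIMES):
        return False              # ruled out by a small divisor
    return pow(2, n - 1, n) == 1  # Fermat check, base 2
```

Almost every composite is rejected by one of the two filters, but base-2 Fermat pseudoprimes whose factors all exceed the trial-division bound slip through (e.g. 8321 = 53 x 157 passes both), which is why survivors still get a full primality test.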
Andy.latto@pobox.com
_______________________________________________
math-fun mailing list
math-fun@mailman.xmission.com
https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun