28 Jan 2012, 8:41 a.m.
Which made me curious: what is the smallest value a(d) such that every d-digit sequence occurs in the decimal expansion of 1/a(d) (after the decimal point)? For d >= 1 my program says 17, 109, 1019, 10007, 100019, ... but it seems stuck at 6 digits, so perhaps there is an overflow issue. All are primes slightly above 10^d, as one might expect, but not necessarily the smallest such primes.
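For what it's worth, here is a sketch of a search in Python (arbitrary-precision integers, so no overflow). The definition of a(d) is the one read from the post above; the long-division loop tracks remainders to find where the expansion starts repeating, and a terminating expansion (remainder 0) is skipped since its tail of zeros can never contain all d-digit strings.

```python
def a(d):
    """Smallest n >= 2 such that every d-digit string occurs somewhere in the
    decimal expansion of 1/n after the decimal point (the post's a(d))."""
    target = 10 ** d
    n = 1
    while True:
        n += 1
        # long division of 1 by n, stopping when a remainder repeats
        r = 1
        first = {}            # remainder -> index of the digit it produced
        digs = []
        while r and r not in first:
            first[r] = len(digs)
            r *= 10
            digs.append(r // n)
            r %= n
        if r == 0:
            continue          # 1/n terminates: the tail is all zeros
        period = digs[first[r]:]
        # the infinite expansion is digs followed by period repeated forever;
        # appending d-1 more digits of the cycle captures every window that
        # wraps around back into the period
        tail = (period * d)[:d - 1]
        text = ''.join(map(str, digs + tail))
        if len({text[i:i + d] for i in range(len(text) - d + 1)}) == target:
            return n
```

This reproduces a(1) = 17 and a(2) = 109 quickly; note that a(2) > 100 is forced, since for n <= 100 the expansion yields at most ~n distinct d-digit windows, fewer than the 100 required.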