It's worth noting that there is an interesting crypto problem buried in this discussion that is largely unappreciated. (A low-power random number generator would be a good start to a solution.) One vision of the future has sensors sprinkled around the environment sending their measurements back to various decision-making and actuating nodes. Since batteries are not getting better (~1 eV/atom at best), these sensors have to be small and highly efficient. How do you do crypto, authentication, validation, etc. on a 4-, 8-, or maybe 16-bit processor with a severely limited energy budget and a short word length? Already engineers have demonstrated that they can access the main network in a car by spoofing the unencrypted tire pressure sensors. This is just the start. There is room for some interesting algorithmic innovation here. --Rich On Sun, Mar 20, 2016 at 1:04 PM, Eugene Salamin via math-fun < math-fun@mailman.xmission.com> wrote:
If the entropy per bit is s, the total entropy of a sequence of N0 bits is S = N0 s. The number of distinct sequences, or the number of keys that need to be tried in a brute force attack, is approximately N = exp(S). Knowing that provides no hint about how to choose the trial keys; it's just saying what a very clever cryptanalyst could do in principle.
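That counting argument is easy to sketch numerically. The snippet below works in bits (log base 2) rather than nats, so the effective search space is 2^S instead of exp(S); the 128-bit sequence length is just an illustrative choice, not anything from the thread.

```python
import math

def effective_keyspace_bits(n_bits, p):
    """Effective brute-force search space, in bits, for a biased bit source.

    Each bit carries entropy s = -p*log2(p) - (1-p)*log2(1-p), so a
    sequence of n_bits has total entropy S = n_bits * s, and the number
    of distinct sequences a cryptanalyst must try is about 2**S.
    """
    s = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return n_bits * s

# An unbiased source: a 128-bit sequence gives the full 128-bit keyspace.
print(effective_keyspace_bits(128, 0.5))   # 128.0
# A 60/40 bias shrinks the same sequence to about 124 effective bits.
print(effective_keyspace_bits(128, 0.6))
```

As Gene notes, this only counts the keys; it gives no procedure for enumerating the likely ones.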
Similarly, if S is the thermodynamic entropy of a physical system, the number of distinct quantum states that the system can be in, consistent with its thermodynamic state, is N = exp(S/k). When heat energy dQ is added to a system at temperature T, the entropy increase is dS = dQ/T, so that the units of entropy are (energy/temperature). Boltzmann's constant is k = 1.38e-23 J/K.
Looking into tables of thermodynamic data, at 25 C and 1 atm pressure, the standard molar entropy of diamond is 2.377 J/(K mol). A 1 mole (12 g) diamond crystal, in equilibrium with its external environment, shuffles among 10^(7.5×10²²) quantum states. The standard molar entropy of uranium hexafluoride gas is 378 J/(K mol). A mole (352 g) of UF6 shuffles among 10^(1.2×10²⁵) quantum states.
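Those state counts can be checked directly from N = exp(S/k), i.e. log10 N = S/(k ln 10):

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann's constant, J/K

def log10_states(molar_entropy):
    """log10 of the number of quantum states for one mole of material,
    from N = exp(S/k), so log10 N = S / (k * ln 10)."""
    return molar_entropy / (K_BOLTZMANN * math.log(10))

print(f"diamond (S = 2.377 J/(K mol)): 10^{log10_states(2.377):.2e} states")
print(f"UF6 gas (S = 378 J/(K mol)):   10^{log10_states(378.0):.2e} states")
```

This reproduces the 10^(7.5×10²²) and 10^(1.2×10²⁵) figures above.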
-- Gene
From: Dan Asimov <dasimov@earthlink.net> To: Eugene Salamin <gene_salamin@yahoo.com>; math-fun < math-fun@mailman.xmission.com> Sent: Sunday, March 20, 2016 1:37 AM Subject: Re: [math-fun] true random generators
Gene, what does the entropy per bit tell us about the TRNG ?
(If the answer is contained in what you wrote below, I'm too ignorant about RNG's to get it.)
—Dan
On Mar 19, 2016, at 7:04 PM, Eugene Salamin via math-fun < math-fun@mailman.xmission.com> wrote:
I don't think accurate balancing is needed. The entropy per bit (natural logarithm, in nats) is S = -p log p - (1-p) log(1-p). So S(50,50) = 0.693, S(60,40) = 0.673 for a fractional loss of 3%. And S(75,25) = 0.562, a loss of 19%. That can be compensated by correspondingly increasing the key length.
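Those numbers are quick to verify (natural log, so a fair bit carries ln 2 = 0.693 nats):

```python
import math

def binary_entropy(p):
    """Entropy per bit in nats: S = -p ln p - (1-p) ln(1-p)."""
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

s_fair = binary_entropy(0.5)  # 0.693... = ln 2
for p in (0.6, 0.75):
    s = binary_entropy(p)
    loss = 1 - s / s_fair
    print(f"S({p}) = {s:.3f} nats, fractional loss = {loss:.0%}")
```

To keep the same total entropy, the key length scales by the reciprocal of (1 - loss), e.g. about 1/(1 - 0.19) ≈ 1.23× for the 75/25 case.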
_______________________________________________ math-fun mailing list math-fun@mailman.xmission.com https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun