On Jan 7, 2019, at 12:32 PM, Richard Howard <rich@richardehoward.com> wrote:
Some thoughts from the physics side of the house.
1) If the multiple levels are represented by physical quantities like charge, voltage, etc., the energy tends to go as the square of the signal, so stacking bits increases energy faster than it increases information. (The way to beat this is to use multiple independent degrees of freedom, but if you thought ternary was complicated, try building a computer with multiple independent degrees of freedom.)
Agree. This is the point I tried (but obviously failed) to make.
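To make point 1 concrete, here is a small sketch (my numbers, not Richard's): assume m equally spaced voltage levels 0, dV, ..., (m-1)*dV and a switching energy that scales as V^2 (e.g., charging a capacitance C). The energy per symbol grows roughly as the square of the number of levels, while the information only grows as log2(m), so energy per bit rises with m.

```python
import math

def energy_per_bit(m, dV=1.0, C=1.0):
    """Mean (1/2)*C*V^2 energy per symbol, divided by bits per symbol.

    Assumes m equally spaced levels 0, dV, ..., (m-1)*dV, each used
    equally often -- an illustrative model, not a circuit simulation.
    """
    mean_v2 = sum((k * dV) ** 2 for k in range(m)) / m  # average V^2 over levels
    bits = math.log2(m)                                 # information per symbol
    return 0.5 * C * mean_v2 / bits

for m in (2, 3, 4, 8):
    print(m, energy_per_bit(m))
```

With these assumptions the energy per bit climbs monotonically (0.25 for binary, about 0.53 for ternary, 0.875 for 4 levels), which is the "stacking bits costs more than it pays" effect.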
2) Low temperature sounds great, but the energy benefit falls with temperature much more slowly than the capacity to transfer heat does. Low-temperature computers are really hard to cool.
Agreed.
3) Practical implementation of adiabatic computation is limited by the fact that there are no perfect switches at finite temperature. To work, adiabatic schemes have to slosh energy from one reservoir to another with some kind of a switch. At finite temperature, these switches have "soft thresholds" that depend on T -- think of a diode, where I = I0*exp(qV/kT). The result is that the switches have an effective finite voltage drop when they are supposed to be in the "on" state. Practically, this is 0.3 to 0.6 V at room temperature -- tough to win big when CMOS can operate at voltages just above this with far less complexity.
I don’t agree with this. CMOS gates do not exhibit a source/drain voltage drop when on. They do require a gate voltage of the magnitude you suggest, but this is provided by the previous state of the computer, and the energy required to produce it is recoverable by reversing the computation.
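For readers who want to see where the 0.3 to 0.6 V figure in point 3 comes from: inverting the ideal-diode law I = I0*exp(qV/kT) gives the forward drop V = (kT/q)*ln(I/I0). The sketch below uses an illustrative saturation current I0 (my assumption, not a number from the thread) to show that ordinary operating currents land the drop squarely in that range at room temperature.

```python
import math

K_OVER_Q = 8.617e-5  # Boltzmann constant / electron charge, in V/K

def forward_drop(I, I0, T=300.0):
    """Forward voltage of an ideal diode carrying current I at temperature T.

    Inverts I = I0*exp(qV/kT) to get V = (kT/q)*ln(I/I0).
    """
    return K_OVER_Q * T * math.log(I / I0)

# Illustrative numbers: saturation current 1e-12 A, operating current 1 mA.
print(forward_drop(1e-3, 1e-12))  # about 0.54 V at 300 K
```

Note that kT/q is only about 26 mV at 300 K; the drop sits an order of magnitude higher because of the logarithm of the current ratio, which is why the soft threshold is so stubborn.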