Eugene Salamin writes:
Speaking as a professional physicist, I will say that the "physics is information" idea is as useless as the "physics is computation" thesis of Fredkin and Wolfram. It has never given me a single insight.
Speaking as a professional computer engineer, I will say it is one of the most important insights we have available for understanding the world. Different strokes...
There's a lot of confusion concerning entropy because two distinct concepts, the thermodynamic one and the information-theoretic one, go by the same name, and sometimes even share formulas.
This is not a conflation of two distinct ideas; it is the same idea. All the work on dissipationless computation and reversibility relies directly on this foundation, and it wouldn't work if these were two different ideas which "happen" to share the same formulas. I'd suggest you understand some of this work before making this strong assertion. In particular, I'd recommend Norm Margolus's Ph.D. thesis.
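To make the identity concrete (a standard textbook sketch, not drawn from Margolus's thesis specifically): the Gibbs entropy of statistical mechanics and the Shannon entropy of information theory are

  S = -k_B sum_i p_i ln p_i      (Gibbs, thermodynamic)
  H = -sum_i p_i log2 p_i        (Shannon, information-theoretic)

which differ only by the constant factor k_B ln 2, so S = (k_B ln 2) H when H is counted in bits. Landauer's principle is what cashes this out physically: erasing one bit of information dissipates at least k_B T ln 2 of heat, and that is exactly why reversible, dissipationless computation has to be built around avoiding erasure rather than around some second, unrelated notion of entropy.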
Physical information must be carried by physical objects, and these have energy. You cannot acquire information about a physical system if you insulate it from transfer of energy, particles, etc.
Absolutely.
But of course Maxwell's demon never does work, because the cost inequality is always the other way around.
Absolutely.
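Spelling out the direction of that inequality (standard Szilard-engine bookkeeping, my numbers rather than Salamin's): for a one-molecule gas at temperature T, the demon's measurement of which half of the box the molecule is in lets it extract at most

  W_extracted <= k_B T ln 2

of work per bit acquired, while resetting its one-bit memory for the next cycle costs, by Landauer's bound,

  W_erase >= k_B T ln 2.

So over a full cycle the demon nets zero at best; the apparently free sorting is paid for in the thermodynamic cost of clearing its own records.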
This is totally wrong. Life is completely consistent with the laws of thermodynamics. That is why you either eat food or lie in the sun, depending on what form of life you are.
Absolutely.
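In entropy-balance terms (a rough sketch of the standard argument, not a quote from either party): an organism's entropy changes as

  dS/dt = d_iS/dt + d_eS/dt,   with d_iS/dt >= 0,

where d_iS/dt is internal production and d_eS/dt is exchange with the surroundings. The exchange term can be strongly negative, because the organism takes in low-entropy free energy, food, or sunlight radiated from a ~6000 K source, and dumps high-entropy waste heat at ~300 K, so its own entropy can stay low while the total entropy of organism plus environment still increases.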
Gubbish. Entropy increases because we configure physical systems into states they would not evolve into of their own accord.
Well, there is something unusual about the initial state we find the universe in.