Re: [math-fun] triple star
Actually "entropy" doesn't have much meaning in a reversible system as a whole. It only starts taking on meaning when you look at _part_ of the system, such that you have only partial information. I actually have an enormous nit to pick with the way thermodynamics is taught. Physicists focus on "energy" and "entropy", whereas "information" is really much more important. And, _no_, "information" is much more than "negative entropy", because physicists never really get around to being precise about entropy in the first place. E.g., "temperature" is defined as being some sort of integrating function in some textbooks! For example, when physicists talk about a "closed" thermodynamic system, they talk about it being insulated from "heat" (whatever that is!) and insulated from the transfer of "energy" (at least _that_ seems to be reasonably well defined!). However, one can conceive of such systems as being insulated in the normal physical sense, but _not_ insulated from the transfer of information in/out. Due to this imprecision, most of the papers I've read about Maxwell's Daemon are nonsensical. Maxwell's Daemon works perfectly well so long as the energy cost of gathering the information is less than the energy gained. As a situation becomes more complex, the energy variations (hence the amount to be gained) becomes smaller, while the cost to gain the information becomes larger, and it eventually becomes not worth the effort. All "life" is a version of Maxwell's Daemon in action. Modern quantum physics makes it essentially impossible to have a "closed" subset of a quantum universe. What quantum physics is really telling us is that information about the evolution of the system seems to pervade all of the universe. An interesting question is whether even black holes can hide the information within -- I conjecture not even temporarily. (See Ahranov-Boehm (sp?) effect.) So it may be that "thermodynamics" needs even more of a reworking for quantum physics than it has gotten so far. At 06:32 PM 7/5/03 EDT, asimovd@aol.com wrote:
"Entropy increases" is not a law of physics, but rather a consequence of initial conditions. In our universe it can be traced back to our local stars' having "advanced" rather than "retarded" radiation, or more accurately to their radiation being emanated toward us in the same direction that we perceive ourselves to grow older. (So if there are stars with "retarded" radiation, i.e., being emanated in the time direction we call negative, their local life forms would see those stars just as we see ours, timewise.)
--Dan
--- Henry Baker <hbaker1@pipeline.com> wrote:
Actually "entropy" doesn't have much meaning in a reversible system as a whole. It only starts taking on meaning when you look at _part_ of the system, such that you have only partial information.
You never have anything but partial information when dealing with macroscopic physical systems. If you had complete information, you wouldn't need thermodynamics. Strictly speaking, entropy, being a state function (like pressure and temperature), is defined only for systems in thermodynamic equilibrium.
I actually have an enormous nit to pick with the way thermodynamics is taught. Physicists focus on "energy" and "entropy", whereas "information" is really much more important.
Speaking as a professional physicist, I will say that the "physics is information" idea is as useless as the "physics is computation" thesis of Fredkin and Wolfram. It has never given me a single insight.
And, _no_, "information" is much more than "negative entropy", because physicists never really get around to being precise about entropy in the first place. E.g., "temperature" is defined as being some sort of integrating function in some textbooks!
There's a lot of confusion concerning entropy because distinct concepts, the thermodynamic one and the information-theoretic one, go by the same name, and sometimes even share formulas. It would take more time than I can spare to properly discuss this. However, I will say that physics does precisely define temperature and entropy. I suggest Fermi's short book "Thermodynamics", published by Dover, for a clear account. The 1st and 2nd laws of thermodynamics are the first order of business, so you can read from the beginning and go as far as you wish.
For example, when physicists talk about a "closed" thermodynamic system, they talk about it being insulated from "heat" (whatever that is!)
Heat is the kinetic energy of motion of the constituents of a macroscopic physical system.
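To put a number on that kinetic picture, here is a rough Python sketch of my own (the temperature and the choice of molecule are illustrative assumptions, not anything from the thread); it uses equipartition, where the mean translational kinetic energy per molecule is (3/2) k_B T:

import math

k_B = 1.380649e-23                # Boltzmann constant, J/K
T = 300.0                         # room temperature, K (assumed)
m_N2 = 28.0 * 1.66053906660e-27   # mass of one N2 molecule, kg

ke_avg = 1.5 * k_B * T            # mean translational KE per molecule, J
v_rms = math.sqrt(3.0 * k_B * T / m_N2)  # RMS molecular speed

print(f"mean KE per molecule: {ke_avg:.3e} J")
print(f"RMS speed of N2 at {T:.0f} K: {v_rms:.0f} m/s")  # about 517 m/s

So the "heat" in a jar of room-temperature air is molecules rattling around at roughly the speed of sound.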
and insulated from the transfer of "energy" (at least _that_ seems to be reasonably well defined!). However, one can conceive of such systems as being insulated in the normal physical sense, but _not_ insulated from the transfer of information in/out.
Physical information must be carried by physical objects, and these have energy. You cannot acquire information about a physical system if you insulate it from transfer of energy, particles, etc.
Due to this imprecision, most of the papers I've read about Maxwell's Daemon are nonsensical.
Maxwell's Daemon works perfectly well so long as the energy cost of gathering the information is less than the energy gained.
But of course Maxwell's demon never does work, because the cost inequality is always the other way around.
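The standard Szilard/Landauer bookkeeping shows why the ledger never goes positive. A minimal Python sketch, with the temperature an illustrative assumption of mine and the two k T ln 2 figures being the usual textbook bounds:

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # operating temperature, K (assumed)

# Szilard engine: one bit of information about a one-molecule gas
# lets the demon extract at most k_B * T * ln 2 of work per cycle.
work_gained_per_bit = k_B * T * math.log(2)

# Landauer bound: resetting the demon's one-bit memory for the next
# cycle dissipates at least k_B * T * ln 2.
erasure_cost_per_bit = k_B * T * math.log(2)

net = work_gained_per_bit - erasure_cost_per_bit
print(f"max work gained per bit:  {work_gained_per_bit:.3e} J")
print(f"min erasure cost per bit: {erasure_cost_per_bit:.3e} J")
print(f"best-case net per cycle:  {net:.3e} J")  # zero at best; real devices do worse

Break-even is the theoretical optimum; any real measurement or memory reset costs more, so the demon loses.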
As a situation becomes more complex, the energy variations (hence the amount to be gained) become smaller, while the cost to gain the information becomes larger, and it eventually becomes not worth the effort.
It's always this way.
All "life" is a version of Maxwell's Daemon in action.
This is totally wrong. Life is completely consistent with the laws of thermodynamics. That is why you either eat food or lie in the sun, depending on what form of life you are.
Modern quantum physics makes it essentially impossible to have a "closed" subset of a quantum universe. What quantum physics is really telling us is that information about the evolution of the system seems to pervade all of the universe. An interesting question is whether even black holes can hide the information within -- I conjecture not even temporarily. (See the Aharonov-Bohm effect.)
The Aharonov-Bohm effect is the phenomenon that, in an electron-wave interferometer, if a magnetic flux threads between the two electron paths, there is a fringe shift as the flux is varied, even though the electrons travel entirely through regions free of magnetic field.
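For concreteness, a small Python sketch of that fringe shift (the constants are CODATA values; the flux values scanned are my own illustrative choices). The relative phase between the paths is delta_phi = (e/hbar) * Phi, so the pattern shifts by one full fringe per flux quantum h/e:

import math

e = 1.602176634e-19   # elementary charge, C
h = 6.62607015e-34    # Planck constant, J*s
hbar = h / (2 * math.pi)

flux_quantum = h / e  # ~4.136e-15 Wb

for n_quanta in [0.0, 0.25, 0.5, 1.0]:
    flux = n_quanta * flux_quantum             # enclosed magnetic flux, Wb
    delta_phi = e * flux / hbar                # phase difference, radians
    fringe_shift = delta_phi / (2 * math.pi)   # in units of one fringe
    print(f"flux = {n_quanta:4.2f} h/e -> phase {delta_phi:5.3f} rad, "
          f"shift {fringe_shift:4.2f} fringe")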
So it may be that "thermodynamics" needs even more of a reworking for quantum physics than it has gotten so far.
At 06:32 PM 7/5/03 EDT, asimovd@aol.com wrote:
"Entropy increases" is not a law of physics, but rather a consequence of initial conditions. In our universe it can be traced back to our local stars' having "advanced" rather than "retarded" radiation, or more accurately to their radiation being emanated toward us in the same direction that we perceive ourselves to grow older. (So if there are stars with "retarded" radiation, i.e., being emanated in the time direction we call negative, their local life forms would see those stars just as we see ours, timewise.)
--Dan
Gubbish. Entropy increases because we configure physical systems into states they would not evolve into of their own accord.
Eugene Salamin writes:
Speaking as a professional physicist, I will say that the "physics is information" idea is as useless as the "physics is computation" thesis of Fredkin and Wolfram. It has never given me a single insight.
Speaking as a professional computer engineer, I find it one of the most important insights we have available for understanding the world. Different strokes...
There's a lot of confusion concerning entropy because distinct concepts, the thermodynamic one and the information-theoretic one, go by the same name, and sometimes even share formulas.
This is not conflation of two distinct ideas, but the same idea. All the work on dissipationless computation and reversibility relies directly on this foundation, and it wouldn't work if these were two different ideas which "happen" to share the same formulas. I'd suggest you understand some of this work before making this strong assertion. In particular, I'd recommend Norm Margolus's Ph.D. thesis.
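The shared formula is easy to exhibit. A small Python sketch of my own (the count of microstates is a toy choice): Gibbs' S = -k_B * sum(p ln p) is exactly Shannon's H = -sum(p ln p) scaled by k_B (plus a choice of log base), and for a uniform distribution over W microstates it collapses to Boltzmann's S = k_B ln W:

import math

k_B = 1.380649e-23  # J/K

def shannon_entropy(probs):
    """Shannon entropy in nats: H = -sum p ln p (0 ln 0 taken as 0)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Thermodynamic (Gibbs) entropy in J/K: the same sum, scaled by k_B."""
    return k_B * shannon_entropy(probs)

W = 1024                       # toy number of equally likely microstates
uniform = [1.0 / W] * W
assert abs(shannon_entropy(uniform) - math.log(W)) < 1e-9  # H = ln W

print(f"H (nats): {shannon_entropy(uniform):.6f}")
print(f"S (J/K):  {gibbs_entropy(uniform):.3e}")   # equals k_B ln W
print(f"k_B ln W: {k_B * math.log(W):.3e}")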
Physical information must be carried by physical objects, and these have energy. You cannot acquire information about a physical system if you insulate it from transfer of energy, particles, etc.
Absolutely.
But of course Maxwell's demon never does work, because the cost inequality is always the other way around.
Absolutely.
This is totally wrong. Life is completely consistent with the laws of thermodynamics. That is why you either eat food or lie in the sun, depending on what form of life you are.
Absolutely.
Gubbish. Entropy increases because we configure physical systems into states they would not evolve into of their own accord.
Well, there is something unusual about the initial state we find the universe in.
Classical thermodynamics, including a basic concept of entropy satisfying dS = dQ/T, where dQ is not an exact differential but dS is, is a well-developed theory not involving any concept of information or of statistics. It is enormously useful in physics, chemistry, and many branches of engineering. It isn't much used in computer science, so most computer scientists manage to remain ignorant of it. Classical thermodynamics won't go away any more than Newtonian mechanics will go away. [Remark: While entropy applies to equilibrium states, the concept has been stretched to usefully fit some rather rapid processes, e.g. in aerodynamics, where equilibria changing over microseconds are useful concepts.]

Already in the middle of the 19th century, Maxwell and Boltzmann undertook to base thermodynamics on statistical mechanics and the kinetic theory of gases. They did not succeed in getting everything they wanted about classical thermodynamics from statistical mechanics, and the relation between the two remains a subject of research and controversy even today.

None of this is in opposition to most of the postings on the subject, and I rather like the notion of entropy being connected with initial conditions.
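A worked example of the exact/inexact distinction may help. This Python sketch (toy states and a monatomic ideal gas, all my own choices) carries one mole of gas from state A = (T1, V1) to state B = (T2, V2) along two different quasi-static paths; the heat absorbed is path-dependent, while the integral of dQ/T is not:

import math

R = 8.314462618      # gas constant, J/(mol K)
Cv = 1.5 * R         # monatomic ideal gas, constant-volume heat capacity
T1, V1 = 300.0, 1.0  # state A (K, arbitrary volume units)
T2, V2 = 600.0, 2.0  # state B

# Path 1: isothermal expansion at T1 (V1 -> V2), then isochoric heating (T1 -> T2).
Q_iso1 = R * T1 * math.log(V2 / V1)   # heat absorbed on the isothermal leg
Q_isoch = Cv * (T2 - T1)              # heat absorbed on the isochoric leg
S_isoch = Cv * math.log(T2 / T1)      # integral of Cv dT / T on that leg
Q_path1 = Q_iso1 + Q_isoch
S_path1 = Q_iso1 / T1 + S_isoch       # dS = dQ/T, with T constant on leg 1

# Path 2: isochoric heating at V1 first (T1 -> T2), then isothermal expansion at T2.
Q_iso2 = R * T2 * math.log(V2 / V1)
Q_path2 = Q_isoch + Q_iso2
S_path2 = S_isoch + Q_iso2 / T2

print(f"heat absorbed, path 1:  {Q_path1:8.1f} J")
print(f"heat absorbed, path 2:  {Q_path2:8.1f} J    (path-dependent)")
print(f"entropy change, path 1: {S_path1:8.4f} J/K")
print(f"entropy change, path 2: {S_path2:8.4f} J/K  (a state function)")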
participants (4)
- Eugene Salamin
- Henry Baker
- John McCarthy
- Tom Knight