I asked a physicist precisely the same question back in the early 1990s; so far as I could tell, he didn't understand the question. Now that computer scientists & physicists are at least speaking the same language (at least when talking about quantum computers), the question needs to be asked again.

My best understanding of physics & information is the following. Since the evolution of physical reality is "unitary", the volumes of "volume elements" of phase space are conserved. This means that time-reversal can be contemplated, at least in theory. Even if one believes in "collapse of the wave function" due to "measurement", physics still has to preserve some of these properties. In particular, in order to conserve phase-space volume, if N bits of information are created, then an equal # of bits of information has to be destroyed; otherwise thermodynamics/entropy/heat engines/Maxwell's Daemons don't work right. So even if information can be created/destroyed in our universe, the _amount_ of such information must remain constant.

In particular, Hawking/Bekenstein tell us that the _amount_ of information that can sit within a physical volume is bounded by the surface area surrounding that volume, measured in Planck areas (a black hole saturates the bound). So for the universe to "inflate", a gigantic amount of information has to be quickly "pumped into" the universe. Try convincing your local creationist about the absence of a deity after you tell him/her about this little problem...
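To get a feel for "gigantic", here is a back-of-the-envelope sketch (mine, not anything Hawking or Bekenstein wrote down for cosmology). It just plugs the standard area law, N ~ A / (4 * l_P^2 * ln 2) bits, into a nominal comoving radius of ~46.5 billion light-years for today's observable universe; both numbers are ballpark assumptions:

    import math

    # Holographic estimate of the information capacity of the observable
    # universe: N_bits ~ A / (4 * l_P^2 * ln 2), A = area of bounding sphere.
    # The radius below is an assumed round number, not precise cosmology.
    l_P = 1.616e-35            # Planck length, meters
    ly  = 9.461e15             # one light-year, meters
    R   = 46.5e9 * ly          # assumed comoving radius of observable universe
    A   = 4 * math.pi * R**2   # bounding surface area, m^2

    bits = A / (4 * l_P**2 * math.log(2))
    print("%.1e bits" % bits)  # prints a few times 10^123 bits

Compare that ~10^123 bits with the near-Planck-sized patch the whole thing supposedly inflated from, whose bounding area allows only a handful of bits, and the bookkeeping problem is obvious.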
At 05:00 PM 5/27/2012, Warren Smith wrote:

> Now here's a little thought experiment that suggests Hawking is wrong.
> Consider the big bang that created our universe. Did it generate lots of
> information from {little or none}? If so, we would seem to have a
> counterexample to Hawking's "physics is information-preserving (unitary)" claim.