[math-fun] The Paradoxes That Threaten To Tear Modern Cosmology Apart
I'll have to read the paper itself, but there's a lot overlooked in this summary of it. See below. On 1/21/2015 6:10 PM, Henry Baker wrote:
FYI --
https://medium.com/the-physics-arxiv-blog/the-paradoxes-that-threaten-to-tea...
The Paradoxes That Threaten To Tear Modern Cosmology Apart
Some simple observations about the universe seem to contradict basic physics. Solving these paradoxes could change the way we think about the cosmos
Revolutions in science often come from the study of seemingly unresolvable paradoxes. An intense focus on these paradoxes, and their eventual resolution, is a process that has led to many important breakthroughs.
So an interesting exercise is to list the paradoxes associated with current ideas in science. It’s just possible that these paradoxes will lead to the next generation of ideas about the universe.
Today, Yurij Baryshev at St Petersburg State University in Russia does just this with modern cosmology. The result is a list of paradoxes associated with well-established ideas and observations about the structure and origin of the universe.
Perhaps the most dramatic, and potentially most important, of these paradoxes comes from the idea that the universe is expanding, one of the great successes of modern cosmology. It is based on a number of different observations.
The first is that other galaxies are all moving away from us. The evidence for this is that light from these galaxies is red-shifted. And the greater the distance, the bigger this red-shift.
Astrophysicists interpret this as evidence that more distant galaxies are travelling away from us more quickly. Indeed, the most recent evidence is that the expansion is accelerating.
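As a rough numerical illustration of the linear redshift-distance relation the article relies on (my sketch, not from the text; the H0 value and the example redshift are assumed round numbers):

```python
# Hubble law at low redshift: v ~ c*z ~ H0*d (valid only for z << 1).
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # assumed Hubble constant, km/s/Mpc

def recession_velocity(z):
    """Recession velocity in km/s from the low-z approximation v = c*z."""
    return C_KM_S * z

def hubble_distance(z):
    """Distance in Mpc implied by the linear Hubble law d = v / H0."""
    return recession_velocity(z) / H0

z = 0.023  # roughly the redshift of the Coma cluster
print(recession_velocity(z))  # ~6,895 km/s
print(hubble_distance(z))     # ~98.5 Mpc
```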
What’s curious about this expansion is that space, and the vacuum associated with it, must somehow be created in this process. And yet how this can occur is not at all clear. “The creation of space is a new cosmological phenomenon, which has not been tested yet in physical laboratory,” says Baryshev.
de Sitter showed there were expanding cosmological solutions in GR shortly after Einstein published, and Hubble observed the expansion. The effect is far too small for laboratory-scale detection.
What’s more, there is an energy associated with any given volume of the universe. If that volume increases, the inescapable conclusion is that this energy must increase as well. And yet physicists generally think that energy creation is forbidden.
Most physicists who know GR are aware that total stress-energy can't be defined in a curved spacetime: there's no canonical way to transport the tensors so as to add them together. So they don't think energy creation is forbidden; it's just ill-defined.
Baryshev quotes the British cosmologist, Ted Harrison, on this topic: “The conclusion, whether we like it or not, is obvious: energy in the universe is not conserved,” says Harrison.
This is a problem that cosmologists are well aware of. And yet ask them about it and they shuffle their feet and stare at the ground. Clearly, any theorist who can solve this paradox will have a bright future in cosmology.
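A concrete instance of this bookkeeping problem (my illustration, not from the article): a photon's energy falls as 1/a as space expands, and GR offers no global account of where that energy goes. A sketch, with an assumed emission wavelength:

```python
# A photon's wavelength stretches with the scale factor a, so its energy
# E = h*c/lambda falls as 1/a. Between recombination (z ~ 1100) and today,
# each CMB photon has "lost" ~99.9% of its energy with no ledger for it.
H_PLANCK = 6.626e-34   # Planck constant, J*s
C = 2.998e8            # speed of light, m/s

def photon_energy(wavelength_m):
    """Photon energy in joules."""
    return H_PLANCK * C / wavelength_m

lam_emit = 1.0e-6             # assumed emission wavelength, ~1 micron
z = 1100                      # approximate redshift of the CMB
lam_now = lam_emit * (1 + z)  # stretched by the expansion factor 1+z

lost_fraction = 1 - photon_energy(lam_now) / photon_energy(lam_emit)
print(lost_fraction)          # ~0.999
```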
The nature of the energy associated with the vacuum is another puzzle. This is variously called the zero point energy or the energy of the Planck vacuum and quantum physicists have spent some time attempting to calculate it.
These calculations suggest that the energy density of the vacuum is huge, of the order of 10^94 g/cm^3. This energy, being equivalent to mass, ought to have a gravitational effect on the universe.
Cosmologists have looked for this gravitational effect and calculated its value from their observations (they call it the cosmological constant). These calculations suggest that the energy density of the vacuum is about 10^-29 g/cm3.
Those numbers are difficult to reconcile. Indeed, they differ by 120 orders of magnitude. How and why this discrepancy arises is not known and is the cause of much bemused embarrassment among cosmologists.
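The arithmetic behind the mismatch is quick to check (my illustration; the 10^94 g/cm^3 figure is essentially the Planck density, reconstructed here from standard constants):

```python
import math

# Check the headline mismatch: ~10^94 vs ~10^-29 g/cm^3.
rho_qft = 1e94        # naive Planck-scale estimate, g/cm^3
rho_obs = 1e-29       # value inferred from the cosmological constant, g/cm^3
orders = math.log10(rho_qft / rho_obs)
print(orders)         # 123.0 -- "about 120 orders of magnitude"

# The 10^94 figure is roughly the Planck density, c^5 / (hbar * G^2):
hbar, G, c = 1.055e-34, 6.674e-11, 2.998e8   # SI units
rho_planck = c**5 / (hbar * G**2)            # kg/m^3
print(rho_planck / 1e3)                      # ~5e93 g/cm^3, i.e. order 10^94
```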
This is a long-standing problem, but recently a solution has suggested itself. The enormous number (which no one ever believed) comes from assuming that every Planck-sized volume represents one degree of freedom for each elementary particle field. Because of Heisenberg uncertainty, the lowest energy possible for each degree of freedom is not zero; rather, it is one Planck energy (in natural units), and the energy density is (1 Planck)^4 for each different elementary field, of which there are some relatively small number (18?). But Bekenstein and Hawking showed that a black hole has degrees of freedom equal to the area of its horizon in Planck units, and that this is maximal for the given mass-energy. Since then it has been suspected that counting a degree of freedom for each Planck volume is over-counting. The analog of the BH horizon is the Hubble sphere, the two-surface that's receding from us at c. If you assume that the degrees of freedom are one per Planck area on the Hubble sphere (proportional to area instead of volume), then the calculated energy density for the sphere is of the right order of magnitude. So this is a possible solution that people are trying to make more precise.
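The area-scaling argument above can be turned into a back-of-the-envelope number (my sketch; the choice of energy per degree of freedom and the O(1) coefficients are guesses): one degree of freedom per Planck area on the Hubble sphere, each carrying roughly the lowest energy that fits in the Hubble radius, spread over the Hubble volume.

```python
import math

hbar = 1.055e-34   # J*s
G    = 6.674e-11   # m^3 kg^-1 s^-2
c    = 2.998e8     # m/s
H0   = 2.27e-18    # Hubble constant in s^-1 (~70 km/s/Mpc, assumed)

R_H  = c / H0               # Hubble radius, m
l_p2 = hbar * G / c**3      # Planck length squared, m^2

N       = 4 * math.pi * R_H**2 / l_p2  # one dof per Planck area on the sphere
E_per   = hbar * c / R_H               # ~lowest energy fitting in R_H (guess)
E_total = N * E_per
volume  = (4 / 3) * math.pi * R_H**3

rho = E_total / (c**2 * volume)  # equivalent mass density, kg/m^3
print(rho / 1e3)                 # ~2e-28 g/cm^3 -- vs observed ~1e-29
```

The volume-based count misses by ~123 orders of magnitude; this area-based count lands within roughly one, which is why the idea looks promising.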
Then there is the cosmological red-shift itself, which is another mystery. Physicists often talk about the red-shift as a kind of Doppler effect, like the change in frequency of a police siren as it passes by.
The Doppler effect arises from the relative movement of different objects. But the cosmological red-shift is different because galaxies are stationary in space. Instead, it is space itself that cosmologists think is expanding.
The mathematics that describes these effects is correspondingly different as well, not least because any relative velocity must always be less than the speed of light in conventional physics. And yet the velocity of expanding space can take any value.
Interestingly, the nature of the cosmological red-shift leads to the possibility of observational tests in the next few years. One interesting idea is that the red-shifts of distant objects must increase as they get further away. For a distant quasar, this change may be as much as one centimetre per second per year, something that may be observable with the next generation of extremely large telescopes.
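The drift mentioned above (often called the Sandage-Loeb test) can be estimated in flat Lambda-CDM (my sketch; the H0 and matter-fraction values are assumed): the observed redshift drift is dz/dt = (1+z)H0 - H(z), giving an apparent velocity drift of order a fraction of a centimetre per second per year for a high-redshift source.

```python
import math

H0_SI   = 2.27e-18   # ~70 km/s/Mpc in s^-1 (assumed)
OMEGA_M = 0.3        # assumed matter fraction, flat Lambda-CDM
C_CM    = 2.998e10   # speed of light, cm/s
YEAR    = 3.156e7    # seconds per year

def H(z):
    """Hubble rate at redshift z in flat Lambda-CDM, s^-1."""
    return H0_SI * math.sqrt(OMEGA_M * (1 + z)**3 + (1 - OMEGA_M))

def velocity_drift_cm_per_yr(z):
    """Apparent velocity drift c*(dz/dt)/(1+z), in cm/s per year."""
    dz_dt = (1 + z) * H0_SI - H(z)  # redshift drift per second
    return C_CM * dz_dt / (1 + z) * YEAR

print(velocity_drift_cm_per_yr(3.0))  # ~ -0.25 cm/s per year
```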
One final paradox is also worth mentioning. This comes from one of the fundamental assumptions behind Einstein’s theory of general relativity: that if you look at the universe on a large enough scale, it must be the same in all directions.
It seems clear that this assumption of homogeneity does not hold on the local scale. Our galaxy is part of a cluster known as the Local Group which is itself part of a bigger supercluster.
This suggests a kind of fractal structure to the universe. In other words, the universe is made up of clusters regardless of the scale at which you look at it.
To say "it seems clear" is a lot stronger than "this suggests". It may be that the universe is homogeneous at the level of superclusters. If not, then I think you need a specific model of its fractal structure on which calculations can be based to compare with observations. The uniformity of the CMB would likely be upset by most fractal models.
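One simple way to make the fractal claim quantitative (my illustration; the counts below are hypothetical numbers, not survey data): estimate a dimension D from galaxy counts N(<r) proportional to r^D. Homogeneity predicts D = 3; fractal clustering gives D < 3.

```python
import math

def fractal_dimension(r1, n1, r2, n2):
    """Estimate D from counts N(<r) ~ r^D measured at two radii."""
    return math.log(n2 / n1) / math.log(r2 / r1)

# Hypothetical homogeneous case: doubling the radius gives 8x the counts.
print(fractal_dimension(10, 1000, 20, 8000))  # 3.0
# Hypothetical clustered case: doubling the radius only quadruples counts.
print(fractal_dimension(10, 1000, 20, 4000))  # 2.0
```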
The problem with this is that it contradicts one of the basic ideas of modern cosmology: the Hubble law. This is the observation that the cosmological red-shift of an object is linearly proportional to its distance from Earth.
It is so profoundly embedded in modern cosmology that most currently accepted theories of universal expansion depend on its linear nature. That’s all okay if the universe is homogeneous (and therefore linear) on the largest scales.
But the evidence is paradoxical. Astrophysicists have measured the linear nature of the Hubble law at distances of a few hundred megaparsecs. And yet the clusters visible on those scales indicate the universe is not homogeneous there.
And so the argument that the Hubble law’s linearity is a result of the homogeneity of the universe (or vice versa) does not stand up to scrutiny. Once again this is an embarrassing failure for modern cosmology.
It is sometimes tempting to think that astrophysicists have cosmology more or less sewn up, that the Big Bang model, and all that it implies, accounts for everything we see in the cosmos.
Not even close. Cosmologists may have successfully papered over the cracks in their theories in a way that keeps scientists happy for the time being. This sense of success is surely an illusion.
And that is how it should be. If scientists really think they are coming close to a final and complete description of reality, then a simple list of paradoxes can do a remarkable job of putting feet firmly back on the ground.
I think a much more immediate problem is the nature of dark matter. On the more purely theoretical front, there seems to be a direct contradiction (not just a paradox) between the equivalence principle of general relativity and quantum mechanics. The former says that falling through the horizon of a large black hole would not even be noticeable; the latter says something drastic must happen there to maintain unitary evolution of the quantum state.
Ref: arxiv.org/abs/1501.01919 : Paradoxes Of Cosmological Physics In The Beginning Of The 21-St Century
Thanks for posting it. Brent Meeker
Very interesting! Do you know of any papers talking about this as a solution for the vacuum energy density problem? I'm familiar with Bekenstein-Hawking entropy but not with this application.
Charles Greathouse
Analyst/Programmer
Case Western Reserve University
On Thu, Jan 22, 2015 at 1:30 AM, meekerdb <meekerdb@verizon.net> wrote:
I'll have to read the paper itself, but there's a lot overlooked in this summary of it. See below.
I know of it from helping edit one of Vic Stenger's books. It was suggested to him by our mutual friend Bob Zanelli. https://groups.google.com/forum/#!topic/atvoid/_wcFErCy0og
But the earliest paper I've found is from Mok: http://arxiv.org/ftp/physics/papers/0408/0408060.pdf
Here's what Vic wrote: http://www.colorado.edu/philosophy/vstenger/Fallacy/FTCosmo.pdf (see page 14).
So, as I wrote, it's not really worked out; it's more of an idea for a solution.
Brent
On 1/22/2015 6:26 AM, Charles Greathouse wrote:
Very interesting! Do you know of any papers talking about this as a solution for the vacuum energy density problem? I'm familiar with Bekenstein-Hawking entropy but not with this application.
Charles Greathouse Analyst/Programmer Case Western Reserve University
On Thu, Jan 22, 2015 at 1:30 AM, meekerdb <meekerdb@verizon.net> wrote:
I'll have to read the paper itself, but there's a lot ovelooked in this summary of it. See below.
On 1/21/2015 6:10 PM, Henry Baker wrote:
FYI --
https://medium.com/the-physics-arxiv-blog/the-paradoxes-that-threaten-to- tear-modern-cosmology-apart-d334a7fcfdb6
The Paradoxes That Threaten To Tear Modern Cosmology Apart
Some simple observations about the universe seem to contradict basic physics. Solving these paradoxes could change the way we think about the cosmos
Revolutions in science often come from the study of seemingly unresolvable paradoxes. An intense focus on these paradoxes, and their eventual resolution, is a process that has leads to many important breakthroughs.
So an interesting exercise is to list the paradoxes associated with current ideas in science. It’s just possible that these paradoxes will lead to the next generation of ideas about the universe.
Today, Yurij Baryshev at St Petersburg State University in Russia does just this with modern cosmology. The result is a list of paradoxes associated with well-established ideas and observations about the structure and origin of the universe.
Perhaps the most dramatic, and potentially most important, of these paradoxes comes from the idea that the universe is expanding, one of the great successes of modern cosmology. It is based on a number of different observations.
The first is that other galaxies are all moving away from us. The evidence for this is that light from these galaxies is red-shifted. And the greater the distance, the bigger this red-shift.
Astrophysicists interpret this as evidence that more distant galaxies are travelling away from us more quickly. Indeed, the most recent evidence is that the expansion is accelerating.
What’s curious about this expansion is that space, and the vacuum associated with it, must somehow be created in this process. And yet how this can occur is not at all clear. The creation of space is a new cosmological phenomenon, which has not been tested yet in physical laboratory,” says Baryshev.
de Sitter showed there were expanding cosmological solutions in GR shortly after Einstein published, and Hubble observed the expansion. It is extremely small for laboratory scale detection.
What’s more, there is an energy associated with any given volume of the universe. If that volume increases, the inescapable conclusion is that this energy must increase as well. And yet physicists generally think that energy creation is forbidden.
Most physicists who know GR are aware that total stress-energy can't be defined in a curved spacetime. There's no canonical way to transport the tensors so as to add them together. So they don't think energy creation is forbidden, it's just ill defined.
Baryshev quotes the British cosmologist, Ted Harrison, on this topic: “The conclusion, whether we like it or not, is obvious: energy in the universe is not conserved,” says Harrison.
This is a problem that cosmologists are well aware of. And yet ask them about it and they shuffle their feet and stare at the ground. Clearly, any theorist who can solve this paradox will have a bright future in cosmology.
The nature of the energy associated with the vacuum is another puzzle. This is variously called the zero-point energy or the energy of the Planck vacuum, and quantum physicists have spent some time attempting to calculate it.
These calculations suggest that the energy density of the vacuum is huge, of the order of 10^94 g/cm^3. This energy, being equivalent to mass, ought to have a gravitational effect on the universe.
Cosmologists have looked for this gravitational effect and calculated its value from their observations (they call it the cosmological constant). These calculations suggest that the energy density of the vacuum is about 10^-29 g/cm^3.
Those numbers are difficult to reconcile. Indeed, they differ by 120 orders of magnitude. How and why this discrepancy arises is not known and is the cause of much bemused embarrassment among cosmologists.
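As a sanity check on these figures, both densities can be recomputed from physical constants. A minimal sketch, assuming H0 of about 70 km/s/Mpc and taking the observed vacuum density to be of order the critical density (with these round inputs the literal ratio comes out near 10^123; the discrepancy is conventionally summarized as "about 120 orders of magnitude"):

```python
import math

# Physical constants (SI)
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J s

# Planck density: rho_P = c^5 / (hbar * G^2)
rho_planck_cgs = c**5 / (hbar * G**2) / 1000.0   # g/cm^3 (1 kg/m^3 = 1e-3 g/cm^3)

# Observed vacuum (dark-energy) density, roughly the critical density
H0 = 70e3 / 3.086e22                             # ~70 km/s/Mpc, in 1/s
rho_crit_cgs = 3 * H0**2 / (8 * math.pi * G) / 1000.0   # g/cm^3

print(f"Planck density   ~ 10^{math.log10(rho_planck_cgs):.0f} g/cm^3")   # ~10^94
print(f"observed density ~ 10^{math.log10(rho_crit_cgs):.0f} g/cm^3")     # ~10^-29
print(f"discrepancy      ~ 10^{math.log10(rho_planck_cgs / rho_crit_cgs):.0f}")
```

The exact exponent of the discrepancy depends on the ultraviolet cutoff chosen and on the value of H0; the headline point is the absurd size of the gap, not its last digit.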
This is a long-standing problem, but recently a solution has suggested itself. The enormous number (which no one ever believed) comes from assuming that every Planck-sized volume contributes one degree of freedom for each elementary particle field. Because of Heisenberg's uncertainty principle the lowest energy possible for each degree of freedom is not zero; rather it is of order one Planck energy, so the energy density is (1 Planck)^4 in natural units for each different elementary field, of which there are relatively few (18?). But Bekenstein and Hawking showed that a black hole has degrees of freedom equal to the area of its horizon in Planck units, and that this is maximal for the given mass-energy. Since then it has been suspected that counting a degree of freedom for each Planck volume is over-counting. The analog of the black-hole horizon is the Hubble sphere, the two-surface that's receding from us at c. If you assume that the degrees of freedom are one per Planck area on the Hubble sphere (proportional to area instead of volume), then the calculated energy density for the sphere is the right order of magnitude. So this is a possible solution that people are trying to make more precise.
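That area-scaling count can be turned into a back-of-the-envelope estimate. The sketch below assumes one degree of freedom per Planck area on the Hubble sphere, each carrying the minimal energy hbar*c/R_H allowed by the uncertainty principle at the Hubble scale, and H0 of about 70 km/s/Mpc; these are illustrative assumptions, not a precise derivation:

```python
import math

G = 6.674e-11       # m^3 kg^-1 s^-2
c = 2.998e8         # m/s
hbar = 1.055e-34    # J s
H0 = 70e3 / 3.086e22        # ~70 km/s/Mpc in 1/s

R_H = c / H0                # Hubble radius, m
l_P2 = hbar * G / c**3      # Planck length squared, m^2

N = 4 * math.pi * R_H**2 / l_P2     # one degree of freedom per Planck area
E = N * hbar * c / R_H              # minimal (IR-cutoff) energy per degree of freedom
V = (4 / 3) * math.pi * R_H**3      # Hubble volume
rho = E / (V * c**2)                # equivalent mass density, kg/m^3

# Comes out within a factor of ~10 of the observed ~10^-29 g/cm^3,
# instead of the ~10^94 g/cm^3 from volume counting.
print(f"holographic estimate ~ 10^{math.log10(rho / 1000.0):.0f} g/cm^3")
```

Algebraically this collapses to rho ~ 3*H0^2/G, which is 8*pi times the critical density, so landing in the right ballpark is built into the area scaling.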
Then there is the cosmological red-shift itself, which is another mystery. Physicists often talk about the red-shift as a kind of Doppler effect, like the change in frequency of a police siren as it passes by.
The Doppler effect arises from the relative movement of different objects. But the cosmological red-shift is different because galaxies are stationary in space. Instead, it is space itself that cosmologists think is expanding.
The mathematics that describes these effects is correspondingly different as well, not least because any relative velocity must always be less than the speed of light in conventional physics. And yet the velocity of expanding space can take any value.
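The lack of a velocity cap is easy to make concrete: Hubble's law v = H0*d grows without bound, so beyond the Hubble distance c/H0 the recession velocity formally exceeds c. A quick sketch, assuming H0 of about 70 km/s/Mpc:

```python
c = 2.998e8                 # speed of light, m/s
H0 = 70e3 / 3.086e22        # ~70 km/s/Mpc, in 1/s
Mpc = 3.086e22              # metres per megaparsec

d_hubble = c / H0           # distance at which v = H0 * d reaches c
print(f"Hubble distance ~ {d_hubble / Mpc:.0f} Mpc")   # ~4.3 Gpc

# Any galaxy farther than this recedes faster than light in these coordinates;
# no relative velocity of objects passing each other ever exceeds c.
```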
Interestingly, the nature of the cosmological red-shift leads to the possibility of observational tests in the next few years. One interesting idea is that the red-shifts of distant objects must increase as they get further away. For a distant quasar, this change may be as much as one centimetre per second per year, something that may be observable with the next generation of extremely large telescopes.
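The quoted drift rate can be sanity-checked at the order-of-magnitude level: over one year the recession velocity of a comoving object changes by roughly c*H0*(1 year). A rough sketch (the exact rate depends on the redshift and on the cosmological model):

```python
c = 2.998e8                 # m/s
H0 = 70e3 / 3.086e22        # ~70 km/s/Mpc, in 1/s
year = 3.156e7              # seconds per year

dv_per_year = c * H0 * year             # velocity drift per year, m/s
print(f"~{dv_per_year * 100:.1f} cm/s per year")

# A couple of cm/s per year at the naive level; the model-dependent rate for
# real quasars is of order 1 cm/s per year, consistent with the text.
```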
One final paradox is also worth mentioning. This comes from one of the fundamental assumptions behind Einstein’s theory of general relativity: that if you look at the universe on a large enough scale, it must be the same in all directions.
It seems clear that this assumption of homogeneity does not hold on the local scale. Our galaxy is part of a cluster known as the Local Group which is itself part of a bigger supercluster.
This suggests a kind of fractal structure to the universe. In other words, the universe is made up of clusters regardless of the scale at which you look at it.
To say "it seems clear" is a lot stronger than "this suggests". It may be that the universe is homogeneous at the level of superclusters. If not, then I think you need a specific model of its fractal structure on which calculations can be based to compare with observations. The uniformity of the CMB would likely be upset by most fractal models.
The problem with this is that it contradicts one of the basic ideas of modern cosmology: the Hubble law. This is the observation that the cosmological red-shift of an object is linearly proportional to its distance from Earth.
It is so profoundly embedded in modern cosmology that most currently accepted theories of universal expansion depend on its linear nature. That’s all okay if the universe is homogeneous (and therefore linear) on the largest scales.
But the evidence is paradoxical. Astrophysicists have measured the linear nature of the Hubble law at distances of a few hundred megaparsecs. And yet the clusters visible at those distances indicate the universe is not homogeneous on those scales.
And so the argument that the Hubble law’s linearity is a result of the homogeneity of the universe (or vice versa) does not stand up to scrutiny. Once again this is an embarrassing failure for modern cosmology.
It is sometimes tempting to think that astrophysicists have cosmology more or less sewn up, that the Big Bang model, and all that it implies, accounts for everything we see in the cosmos.
Not even close. Cosmologists may have successfully papered over the cracks in their theories in a way that keeps scientists happy for the time being. This sense of success is surely an illusion.
And that is how it should be. If scientists really think they are coming close to a final and complete description of reality, then a simple list of paradoxes can do a remarkable job of putting their feet firmly back on the ground.
I think a much more immediate problem is the nature of dark matter. On the more purely theoretical front there seems to be a direct contradiction (not just a paradox) between the equivalence principle of general relativity and quantum mechanics. The former says that falling through the horizon of a large black hole would not even be noticeable. The latter says something drastic must happen there to maintain unitary evolution of the quantum state.
Ref: arxiv.org/abs/1501.01919 : Paradoxes Of Cosmological Physics In The Beginning Of The 21-St Century
Thanks for posting it.
Brent Meeker
_______________________________________________ math-fun mailing list math-fun@mailman.xmission.com https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun
<<What’s more, there is an energy associated with any given volume of the universe. If that volume increases, the inescapable conclusion is that this energy must increase as well. And yet physicists generally think that energy creation is forbidden.>>

This one is easy. The cosmological constant term, Λ g[μν], in Einstein's equation, if interpreted as a stress-energy tensor, implies an energy density u and a pressure P satisfying P = -u. The work done on a body at pressure P when its volume increases by ΔV is -P ΔV. This work increases the energy content of the body by ΔE = -P ΔV, which equals the increase u ΔV in the vacuum energy.

<<One final paradox is also worth mentioning. This comes from one of the fundamental assumptions behind Einstein’s theory of general relativity: that if you look at the universe on a large enough scale, it must be the same in all directions.>>

This is not true. The homogeneity of the universe is not a fundamental assumption of general relativity. It is merely a mathematical assumption that leads to simpler solutions of cosmological models.

<<The nature of the energy associated with the vacuum is another puzzle. This is variously called the zero-point energy or the energy of the Planck vacuum, and quantum physicists have spent some time attempting to calculate it. These calculations suggest that the energy density of the vacuum is huge, of the order of 10^94 g/cm^3. This energy, being equivalent to mass, ought to have a gravitational effect on the universe. Cosmologists have looked for this gravitational effect and calculated its value from their observations (they call it the cosmological constant). These calculations suggest that the energy density of the vacuum is about 10^-29 g/cm^3. Those numbers are difficult to reconcile. Indeed, they differ by 120 orders of magnitude. How and why this discrepancy arises is not known and is the cause of much bemused embarrassment among cosmologists.>>

The universe is what it is.
Somebody made an incorrect assumption as to the nature of this dark energy. -- Gene
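Gene's bookkeeping can be verified numerically. A minimal sketch in arbitrary units, assuming a constant vacuum energy density u with equation of state P = -u:

```python
u = 1.0          # vacuum energy density (arbitrary units), constant in time
P = -u           # dark-energy equation of state: pressure equals minus density

dV = 2.5                         # increase in the volume of some region
work_on_region = -P * dV         # work done on the region: -P dV
vacuum_energy_gain = u * dV      # vacuum energy contained in the new volume

# The work done by the negative pressure exactly accounts for the
# newly created vacuum energy, so the books balance:
assert abs(work_on_region - vacuum_energy_gain) < 1e-12
print(work_on_region, vacuum_energy_gain)   # 2.5 2.5
```

In other words, with P = -u there is no local accounting anomaly: the energy that appears in the expanded volume is paid for by the pressure-volume work.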
participants (4): Charles Greathouse, Eugene Salamin, Henry Baker, meekerdb