'Standard model' of cosmology called into question by new measurements

New measurements have allowed scientists to refine the Hubble Constant, sharpening our picture of the expanding universe. (Image credit: Sophia Dagnello, NRAO/AUI/NSF)

Do scientists need to re-examine the basic model of the universe? 

Using new measurements of cosmic distances, astronomers have refined their calculation of the Hubble Constant, a number that describes how fast the universe is expanding: the farther away a galaxy is, the faster it recedes from us. 

The new measurements have prompted scientists to consider revising this important figure and provided support for the idea that the "standard model of cosmology," the theory that describes the fundamental nature of the universe, may need to be revamped.

These new measurements, made using a variety of telescopes around the world, emphasize a discrepancy between previous measurements of the Hubble Constant and the value of this constant predicted by the "standard model," astronomers said. According to the "standard model of cosmology," the universe is made up of about 5% ordinary matter, 27% dark matter and 68% dark energy.

Related: The biggest unsolved mysteries in physics

In this new work, researchers refined distance measurements to four different galaxies (at distances ranging from 168 million to 431 million light-years from Earth) and combined them with previous distance measurements to two other galaxies. They found the Hubble Constant, the rate at which the universe is expanding, to be 73.9 kilometers (45.9 miles) per second per megaparsec.

This is notably different from 67.4 kilometers (41.9 miles) per second per megaparsec, the value predicted by the standard model.
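The meaning of those numbers can be sketched with Hubble's law, v = H0 × D: the constant converts a galaxy's distance into its recession speed. A minimal illustration (the unit conversion and the example distance are assumptions for the sketch, not figures from the paper):

```python
# Illustrative sketch: recession velocity v = H0 * D under the two
# H0 values quoted above (measured vs. standard-model prediction).

MLY_PER_MPC = 3.262  # million light-years per megaparsec (assumed conversion)

def recession_velocity(distance_mly, h0):
    """Recession velocity in km/s for a distance in millions of light-years."""
    distance_mpc = distance_mly / MLY_PER_MPC
    return h0 * distance_mpc

for h0 in (73.9, 67.4):  # km/s/Mpc
    v = recession_velocity(431, h0)  # ~431 Mly, the farthest galaxy in the study
    print(f"H0 = {h0}: v = {v:.0f} km/s")
```

At a given distance, the two H0 values imply recession speeds that differ by nearly 10%, which is the discrepancy the measurements are probing.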

"Testing the standard model of cosmology is a really challenging problem that requires the best-ever measurements of the Hubble Constant," Dom Pesce, a researcher at the Center for Astrophysics at Harvard and Smithsonian and lead author on the new paper, said in a statement.

"The discrepancy between the predicted and measured values of the Hubble Constant points to one of the most fundamental problems in all of physics, so we would like to have multiple, independent measurements that corroborate the problem and test the model," Pesce said. "Our method is geometric and completely independent of all others, and it reinforces the discrepancy."

These new measurements were made by telescopes including the National Science Foundation's Very Long Baseline Array (VLBA), Karl G. Jansky Very Large Array (VLA) and Robert C. Byrd Green Bank Telescope (GBT), along with the Effelsberg telescope in Germany. 

“We find that galaxies are nearer than predicted by the standard model of cosmology, corroborating a problem identified in other types of distance measurements," James Braatz of the National Radio Astronomy Observatory (NRAO), who leads the Megamaser Cosmology Project, which aims to measure the Hubble Constant, said in the same statement.

"There has been debate over whether this problem lies in the model itself or in the measurements used to test it," Braatz added. "Our work uses a distance measurement technique completely independent of all others, and we reinforce the disparity between measured and predicted values. It is likely that the basic cosmological model involved in the predictions is the problem."

This work was published in The Astrophysical Journal Letters. 


Chelsea Gohd
Senior Writer

Chelsea “Foxanne” Gohd joined Space.com in 2018 and is now a Senior Writer, writing about everything from climate change to planetary science and human spaceflight in both articles and on-camera in videos. With a degree in Public Health and biological sciences, Chelsea has written and worked for institutions including the American Museum of Natural History, Scientific American, Discover Magazine Blog, Astronomy Magazine and Live Science. When not writing, editing or filming something space-y, Chelsea "Foxanne" Gohd is writing music and performing as Foxanne, even launching a song to space in 2021 with Inspiration4. You can follow her on Twitter @chelsea_gohd and @foxannemusic.

  • rod
    The space.com report calls attention to an important point here.

    "There has been debate over whether this problem lies in the model itself or in the measurements used to test it," Braatz added. "Our work uses a distance measurement technique completely independent of all others, and we reinforce the disparity between measured and predicted values. It is likely that the basic cosmological model involved in the predictions is the problem."

    My note: plugging H0 = 74 into the cosmology calculator, http://www.astro.ucla.edu/~wright/CosmoCalc.html, gives a flat-universe age of 12.905E+9 years and an open-model age of 10.746E+9 years. Much is at stake with these different H0 values now. The Hubble time for the age of the universe can be reduced quite a bit from the conventional 13.8E+9-year-old universe commonly reported.

    "Our measurement of the Hubble Constant is very close to other recent measurements, and statistically very different from the predictions based on the CMB and the standard cosmological model. All indications are that the standard model needs revision," said Braatz (ref: https://phys.org/news/2020-06-distance-bolster-basic-universe.html).
    Looks like the LCDM model used in BB cosmology needs more constraints now, including the age of the universe, the Hubble time.
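    The arithmetic behind the Hubble time 1/H0 that rod's note refers to can be sketched directly (a back-of-envelope check, not the calculator's full model, which also folds in the matter/dark-energy mix):

```python
# Back-of-envelope Hubble time t_H = 1/H0. This only approximates the age
# of the universe; the calculator's flat/open models weight it by the
# assumed matter and dark-energy densities.

KM_PER_MPC = 3.0857e19   # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16   # seconds in a billion years

def hubble_time_gyr(h0_km_s_mpc):
    """1/H0 expressed in billions of years."""
    return KM_PER_MPC / h0_km_s_mpc / SEC_PER_GYR

print(hubble_time_gyr(67.4))  # ~14.5 Gyr for the standard-model value
print(hubble_time_gyr(73.9))  # ~13.2 Gyr for the new maser measurement
```

    A higher H0 means a shorter Hubble time, which is why the measured value puts pressure on the conventional ~13.8-billion-year age.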
    Reply
  • Helio
    The word "acceleration" wasn't used in the article. With an accelerating universe, wouldn't the redshifts indicate today that we are expanding faster than we were in the past, so that an effective expansion rate would be less after all, right?
    Reply
  • scvblwxq
    Maybe the problem is in the gravitational portion that their "standard model" doesn't include.
    Reply
  • scvblwxq
    Is the "standard model" of cosmology the same as the "standard model" of particle physics?
    Reply
  • Mm70rj
    Helio said:
    The word "acceleration" wasn't used in the article. With an accelerating universe, wouldn't the redshifts indicate today that we are expanding faster than we were in the past, so that an effective expansion rate would be less after all, right?
    I agree!
    But that's the key point that supports the "Dark Matter" cult...
    I find it funny that it is easier to believe in Dark Matter than that the universe (or part of it) is retracting into the original singularity (Big Bang or multiple Big Bangs)...
    Reply
  • Torbjorn Larsson
    This comes over to the naive reader as a sort of ridiculous paper of the Riess collaboration, the collaboration that pushes the "the standard model needs revision" line based on thin gruel. For one, the work has low statistical power from few galaxies. For another, it is a really complex modeling they improve slightly on.

    But foremost these galaxies are really close and not telling us much.

    Meanwhile, as recently mentioned here, the "Most Precise Tests of Dark Energy and Cosmic Expansion Yet Confirm the Model of a Spatially Flat Universe" also promise to explain what is going on (https://scitechdaily.com/most-precise-tests-of-dark-energy-and-cosmic-expansion-yet-confirm-the-model-of-a-spatially-flat-universe/):

    “The study makes use of data from over a million galaxies and quasars gathered over more than a decade of operations by the Sloan Digital Sky Survey.

    The results confirm the model of a cosmological constant dark energy and spatially flat Universe to unprecedented accuracy, and strongly disfavour recent suggestions of positive spatial curvature inferred from measurements of the cosmic microwave background (CMB) by the Planck satellite.”

    "We see tentative evidence that data from relatively nearby voids and BAO favour the high Hubble rate seen from other low-redshift methods, but including data from more distant quasar absorption lines brings it in better agreement with the value inferred from Planck CMB data."

    That is statistical support of 6 galaxies within z = 0.03 vs 1,000,000+ galaxies within z = 2.3 (or half the universe age).

    The new void measurement gives the current expansion rate as 69 +/- 1.2 km s^-1 Mpc^-1, a precision of better than 2%. It is using BAO as a ladder-independent distance ruler: "BAO matter clustering provides a 'standard ruler' for length scale in cosmology." (https://en.wikipedia.org/wiki/Baryon_acoustic_oscillations) That also means the rate likely – though not assuredly – lies below 70 km s^-1 Mpc^-1, which IIRC is the limit where the Planck collaboration suspected that we needed new physics.

    The new maser measurement is twice as imprecise, adding to the low statistical support. And it is still using local distance-ladder-type measurements, which we know have 5-15% errors for nearby galaxies (https://en.wikipedia.org/wiki/Cosmic_distance_ladder). That seems to be the problem.

    We'll see how further measurements and reception among involved scientists turn out. But the void measurement gives me hope that we may not have to wait for long until the question is more satisfactorily resolved.
    Reply
  • Torbjorn Larsson
    rod said:
    "Our measurement of the Hubble Constant is very close to other recent measurements, and statistically very different from the predictions based on the CMB and the standard cosmological model. All indications are that the standard model needs revision," said Braatz., ref, https://phys.org/news/2020-06-distance-bolster-basic-universe.html
    Looks like the LCDM model used in BB cosmology needs more constraints now, including the age of the universe, the Hubble time.

    The current open question is a tension between the mostly low z local universe observations of current expansion rate and the mostly high z distant universe observations. The integrative observations over several measurement types also scatter over that tension, but there is also here a tendency that the global integrative observations prefer a lower expansion rate. The local universe measurements often rely on the cosmic distance ladder, which is an iffy construction - see more in my longish comment here.

    This happens in a context where - before the supernova distance ladder measurements, and later the cosmic background spectra observations - there was an open question about which type of cosmology applied, and the universe age estimates differed by a factor of 2. For the younger ages there were stars that, from astrophysicist modeling, appeared to be older than the universe, which is a much more serious tension. As I often note, the LCDM cosmology solved these problems and is self-consistent - apart from this single remaining open tension, which did not appear until the last few years at that. Historically, as in similar cases such as measuring the universal speed limit, early observations are iffy and have a lot of bias. (C.f. how, famously, the modern value of the universal speed limit lies outside the initial range.) See the image here: https://sci.esa.int/web/planck/-/60504-measurements-of-the-hubble-constant .

    The simplest explanation is that something similar is going on here, though we should of course mind that there are other explanations. The current most precise and statistically well supported expansion rate measurement - see my longish comment - indicates that the expansion rate "likely – though not assuredly – lies within 70 km s^-1 Mpc^-1, which IIRC is the limit where the Planck collaboration suspected that we needed new physics." Only if we get a higher rate, would LCDM need to be modified and, yes, becoming less simple. The age of the universe would not move so much that stars would seem older than the universe, I think - LCDM is "morally" right (simple but explaining so much; having no real contenders), even if details are under scrutiny.
    Reply
  • Torbjorn Larsson
    Helio said:

    The word "acceleration" wasn't used in the article. With an accelerating universe, wouldn't the redshifts indicate today that we are expanding faster than we were in the past, so that an effective expansion rate would be less after all, right?

    Technically the H0 Hubble constant stands for the expansion rate at the current epoch. It applies in the equation v = H_0*D, where D is the "proper distance" to a sufficiently distant galaxy you look at and v is the separation speed between the Milky Way and that galaxy (https://en.wikipedia.org/wiki/Hubble%27s_law).

    When we apply that to the universe we find that the expansion of a general relativistic universe depends on the inner state of the universe, so the Hubble "constant" is actually a function H(a), where a is the scale factor of the expansion (https://en.wikipedia.org/wiki/Lambda-CDM_model#Cosmic_expansion_history). The current expansion behavior is very simple since dark energy, a constant energy density, dominates the inner state, and we get an exponential expansion since H(a) = H_0.

    I think that exponential is what you call the "effective expansion rate". Astronomers take great care in extracting H_0 out of the earlier expansion history. But of course, if they don't understand the differing behaviors sufficiently they will run into ambiguities.
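    The H(a) relation described above can be sketched for a flat LCDM universe as H(a) = H0*sqrt(Om/a^3 + OL); the parameter values below are illustrative Planck-like assumptions, not numbers from the article:

```python
import math

# Flat-LCDM Friedmann relation: H(a) = H0 * sqrt(Om/a^3 + OL).
# Illustrative Planck-like parameters (assumed, not from the article).
H0 = 67.4              # km/s/Mpc, expansion rate today
OM, OL = 0.315, 0.685  # matter and dark-energy density fractions

def hubble_rate(a):
    """Expansion rate H(a) in km/s/Mpc at scale factor a (a = 1 today)."""
    return H0 * math.sqrt(OM / a**3 + OL)

print(hubble_rate(1.0))  # today: recovers H0
print(hubble_rate(0.5))  # when the universe was half its present scale
```

    In the far future (a → ∞) the matter term dies away and H(a) tends to H0*sqrt(OL), the constant rate of the pure dark-energy exponential regime mentioned above.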

    Reply
  • Torbjorn Larsson
    scvblwxq said:

    Maybe the problem is in the gravitational portion that their "standard model" doesn't include.

    scvblwxq said:

    Is the "standard model" of cosmology the same as the "standard model" of particle physics?

    The first question is hard to respond to, because if new physics is needed (and we don't know that) it affects the general relativistic physics that our cosmology is built on. The gravitational portion is built in, and changes in that are more difficult to envision than changes in the current components. For example, different expansion rates may depend on new, temporary matter particles in the early universe (and see my response to Helio that describes some of the physics behind expansion rates).

    The terminology is appealing to cosmologists, who cover both, since high energy physics implies particle physics to various extents. There are other names, but essentially they describe theories that are well tested today but which have some obvious or threatening remaining tensions between observations and/or theory - expansion rate for the LCDM "standard model" (statistically significant) and lepton flavor universality for the Higgs "standard model" (https://atlas.cern/updates/physics-briefing/addressing-long-standing-tension-standard-model) (not sufficiently significant, just long standing).
    Reply
  • Torbjorn Larsson
    Mm70rj said:

    But that's the key point that supports the "Dark Matter" cult...

    I find it funny that it is easier to believe in Dark Matter than that the universe (or part of it) is retracting into the original singularity (Big Bang or multiple Big Bangs)...

    Key point? "Cult"?

    "... offers a comprehensive explanation for a broad range of observed phenomena, including the abundance of light elements, the cosmic microwave background (CMB) radiation, large-scale structure, and Hubble's law ... now universally accepted." (https://en.wikipedia.org/wiki/Big_Bang)
    Singularity?

    View: https://www.youtube.com/watch?v=P1Q8tS-9hYo
    Reply