How do we know the fundamental constants are constant? We don't.

This illustration shows the three basic steps astronomers use to calculate how fast the universe expands over time, a value called the Hubble constant. All the steps involve building a strong "cosmic distance ladder," by starting with measuring accurate distances to nearby galaxies and then moving to galaxies farther and farther away. This ladder is a series of measurements of different kinds of astronomical objects with an intrinsic brightness that researchers can use to calculate distances. (Image credit: NASA, ESA, and A. Feild (STScI))

Through a variety of tests on Earth and throughout the universe, physicists have measured no changes in time or space for any of the fundamental constants of nature. 

All of modern physics rests on two main pillars. One is Einstein's theory of general relativity, which we use to explain the force of gravity. The other is the Standard Model, which we use to describe the other three forces of nature: electromagnetism, the strong nuclear force and the weak nuclear force. Wielding these theories, physicists can explain vast swaths of interactions throughout the universe.

But those theories do not fully explain themselves. Appearing within the equations are fundamental constants, which are numbers that we must measure independently and plug in by hand. Only with these numbers in place can we use the theories to make new predictions. General relativity depends on only two constants: the strength of gravity (commonly called G) and the cosmological constant (usually denoted by Λ, which measures the amount of energy in the vacuum of space-time).

Related: The problems with modern physics

The Standard Model requires 19 constants to plug into the equations. These include parameters such as the masses of nine fermions (like the electron and the up quark), the strengths of the nuclear forces, and constants that control how the Higgs boson interacts with other particles. Because the Standard Model does not automatically predict the masses of the neutrinos, to include all their dynamics we have to add seven more constants.

That's 28 numbers that completely determine all the physics of the known universe.
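One common way to itemize that tally is sketched below. The grouping follows the article's own count of 2 + 19 + 7; the specific parameter names are one conventional choice of how to parameterize the Standard Model, not the only one.

```python
# Tally of the fundamental constants described above.
general_relativity = ["G (strength of gravity)", "Lambda (cosmological constant)"]

standard_model = (
    ["mass of " + f for f in [
        "electron", "muon", "tau",                          # charged leptons
        "up", "down", "charm", "strange", "top", "bottom",  # quarks
    ]]  # 9 fermion masses
    + ["strong coupling", "electromagnetic coupling", "weak coupling"]  # 3 force strengths
    + ["Higgs vacuum expectation value", "Higgs self-coupling"]         # Higgs sector
    + ["CKM angle 1", "CKM angle 2", "CKM angle 3", "CKM CP phase"]     # quark mixing
    + ["QCD theta angle"]
)  # 19 parameters in the usual counting

neutrino_sector = (
    ["neutrino mass 1", "neutrino mass 2", "neutrino mass 3"]
    + ["PMNS angle 1", "PMNS angle 2", "PMNS angle 3", "PMNS CP phase"]
)  # 7 extra parameters once neutrinos have mass

total = len(general_relativity) + len(standard_model) + len(neutrino_sector)
print(total)  # 28
```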

Not so constant

Many physicists argue that having all these constants seems a little artificial. Our job as scientists is to explain as many varied phenomena as possible with as few starting assumptions as we can get away with. Physicists believe that general relativity and the Standard Model are not the end of the story, however, especially since these two theories are not compatible with each other. They suspect that there is some deeper, more fundamental theory that unites these two branches.

That more fundamental theory could have any number of fundamental constants associated with it. It could have the same set of 28 we see today. It could have its own, independent constants, with the 28 appearing as dynamic expressions of some underlying physics. It could even have no constants at all, with the fundamental theory able to explain itself in its entirety with nothing having to be added by hand.

No matter what, if our fundamental constants aren't really constant — if they happen to vary across time or space — then that would be a sign of physics beyond what we currently know. And by measuring those variations, we could get some clues as to a more fundamental theory.

And physicists have devised a number of experiments to test the constancy of those constants.

Constants to the test

One test involves ultraprecise atomic clocks. The operation of an atomic clock depends on the strength of the electromagnetic interaction, the mass of the electron, and the spin of the proton. Comparing clocks at different locations or observing the same clock for long periods of time can reveal if any of those constants change.
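As a toy illustration of the clock-comparison idea: if a constant such as the fine-structure constant drifted, the frequency ratio between two different atomic transitions would drift too, so a linear fit to repeated ratio measurements bounds the fractional change per year. The "measurements" below are fabricated for the sketch, and real analyses are far more sophisticated than a simple least-squares fit.

```python
# Toy sketch: bound a fractional drift in a clock frequency ratio.
# The measurement values below are invented for illustration only.

def fit_fractional_drift(times_yr, ratios):
    """Least-squares slope of ratio vs. time, divided by the mean ratio,
    giving a fractional drift per year."""
    n = len(times_yr)
    t_mean = sum(times_yr) / n
    r_mean = sum(ratios) / n
    num = sum((t - t_mean) * (r - r_mean) for t, r in zip(times_yr, ratios))
    den = sum((t - t_mean) ** 2 for t in times_yr)
    slope = num / den
    return slope / r_mean  # fractional change per year

# Simulated yearly comparisons of two clock frequencies (ratio near 2.5)
times = [0, 1, 2, 3, 4]
ratios = [2.500000000, 2.500000001, 2.499999999, 2.500000002, 2.500000000]
print(f"fractional drift: {fit_fractional_drift(times, ratios):.2e} per year")
```

With scatter at the ninth decimal place, the fitted drift here comes out around a few parts in 10^11 per year; real optical clocks constrain drifts orders of magnitude more tightly.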

Another ingenious test involves the Oklo uranium mine in Gabon. Two billion years ago, the site acted as a natural nuclear reactor that operated intermittently for a few hundred thousand years. If any of the fundamental constants were different back then, the products of that radioactive process, which survive to the present day, would differ from what we expect.

Looking at larger scales, astronomers have studied the light emitted by quasars, which are ultraluminous objects powered by black holes sitting billions of light-years away from us. The light from those quasars had to travel enormous distances to reach us, passing through innumerable gas clouds that absorbed some of it. If the fundamental constants varied across the universe, that absorption would be altered, and quasars in one direction would look subtly different from quasars in another.
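A hedged sketch of the underlying comparison: each absorption transition shifts by an amount proportional to how sensitive it is to the fine-structure constant α (the "q coefficient" in analyses like the many-multiplet method), so comparing observed shifts against laboratory wavenumbers yields an estimate of Δα/α. All numbers below are fabricated for illustration; real studies use carefully calibrated coefficients and account for redshift and many systematics.

```python
# Toy version of the alpha-variation test on quasar absorption lines.
# omega_lab: laboratory wavenumber; omega_obs: wavenumber measured in the
# absorber (redshift already removed); q: sensitivity coefficient.
# Standard parameterization: omega_obs ≈ omega_lab + 2 * q * (dalpha/alpha).
lines = [
    # (omega_lab, omega_obs, q) -- wavenumbers and q in cm^-1, all invented
    (38458.98, 38458.99, 1030.0),
    (35669.30, 35669.31, 1490.0),
    (62171.63, 62171.62, -1300.0),
]

# Least-squares estimate: shift_i = 2*q_i * x  =>
# x = sum(shift_i * 2*q_i) / sum((2*q_i)^2)
num = sum((obs - lab) * 2 * q for lab, obs, q in lines)
den = sum((2 * q) ** 2 for _, _, q in lines)
dalpha_over_alpha = num / den
print(f"estimated dalpha/alpha = {dalpha_over_alpha:.1e}")
```

The key design point is that different transitions have q coefficients of different signs and sizes, so a genuine change in α produces a correlated pattern of shifts that ordinary redshift cannot mimic.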

At the very largest scales, physicists can use the Big Bang itself as a laboratory. They can use our knowledge of nuclear physics to predict the abundance of hydrogen and helium produced in the first dozen minutes of the Big Bang. And they can use plasma physics to predict the properties of the light emitted when our universe cooled from a plasma to a neutral gas when it was 380,000 years old. If the fundamental constants were different long ago, then it would show up as a mismatch between theory and observation.
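To give a flavor of the kind of prediction involved, here is the textbook back-of-the-envelope estimate of the primordial helium abundance (not the article's own calculation, and far cruder than real nucleosynthesis codes): essentially all neutrons that survive to the end of the first few minutes end up bound into helium-4, so the helium mass fraction follows from the neutron-to-proton ratio.

```python
import math

# Back-of-the-envelope primordial helium estimate (standard textbook numbers).
delta_m_c2 = 1.293       # neutron-proton mass difference, MeV
T_freeze = 0.72          # approximate weak-interaction freeze-out temperature, MeV
n_over_p_freeze = math.exp(-delta_m_c2 / T_freeze)   # ~1/6 at freeze-out

tau_n = 880.0            # neutron lifetime, seconds
t_nuc = 200.0            # rough time until the deuterium bottleneck breaks, s
n_over_p = n_over_p_freeze * math.exp(-t_nuc / tau_n)  # decays toward ~1/7

# Remaining neutrons pair into He-4: mass fraction Y = 2(n/p) / (1 + n/p)
Y = 2 * n_over_p / (1 + n_over_p)
print(f"predicted helium mass fraction Y ≈ {Y:.2f}")
```

This crude estimate lands near the observed value of roughly 0.25, and the point of the test is that the answer depends sensitively on the nuclear constants: nudge the neutron-proton mass difference or the neutron lifetime and the predicted abundance moves away from what we actually see.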

In these experiments and more, nobody has ever observed any variation in the fundamental constants. We can't completely rule it out, but we can place incredibly stringent limits on their possible changes. For example, we know that the fine structure constant, which measures the strength of the electromagnetic interaction, is the same throughout the universe to 1 part per billion. 

While physicists continue to search for a new theory to replace the Standard Model and general relativity, it appears that the constants we know and love are here to stay.



Paul Sutter
Space.com Contributor

Paul M. Sutter is an astrophysicist at SUNY Stony Brook and the Flatiron Institute in New York City. Paul received his PhD in Physics from the University of Illinois at Urbana-Champaign in 2011, and spent three years at the Paris Institute of Astrophysics, followed by a research fellowship in Trieste, Italy. His research focuses on many diverse topics, from the emptiest regions of the universe to the earliest moments of the Big Bang to the hunt for the first stars. As an "Agent to the Stars," Paul has passionately engaged the public in science outreach for several years. He is the host of the popular "Ask a Spaceman!" podcast, author of "Your Place in the Universe" and "How to Die in Space," and he frequently appears on TV, including on The Weather Channel, for which he serves as Official Space Specialist.

  • rod
    "That's 28 numbers that completely determine all the physics of the known universe."

    My note. Where did these constants in nature come from and how and when did the Big Bang create the constants seen in nature today? The Big Bang model has the universe we see today evolving from an area the size of an electron or smaller - in the beginning.

    "But those theories do not fully explain themselves. Appearing within the equations are fundamental constants, which are numbers that we must measure independently and plug in by hand. Only with these numbers in place can we use the theories to make new predictions. General relativity depends on only two constants: the strength of gravity (commonly called G) and the cosmological constant (usually denoted by Λ, which measures the amount of energy in the vacuum of space-time)."

    My note. What? How did the Big Bang create these two constants and avoid two fatal values that destroys the universe - in the beginning?
  • Pentcho Valev
    The speed of light is not constant. It is VARIABLE, as posited by Newton's theory and proved by the Michelson-Morley experiment in 1887 (prior to the introduction of the length-contraction fudge factor):

    Wikipedia: "Emission theory, also called emitter theory or ballistic theory of light, was a competing theory for the special theory of relativity, explaining the results of the Michelson–Morley experiment of 1887...The name most often associated with emission theory is Isaac Newton. In his corpuscular theory Newton visualized light "corpuscles" being thrown off from hot bodies at a nominal speed of c with respect to the emitting object, and obeying the usual laws of Newtonian mechanics, and we then expect light to be moving towards us with a speed that is offset by the speed of the distant emitter (c ± v)."

    Banesh Hoffmann, Einstein's co-author, admits that, originally ("without recourse to contracting lengths, local time, or Lorentz transformations"), the Michelson-Morley experiment was compatible with Newton's variable speed of light, c'=c±v, and incompatible with the constant speed of light, c'=c:

    "Moreover, if light consists of particles, as Einstein had suggested in his paper submitted just thirteen weeks before this one, the second principle seems absurd: A stone thrown from a speeding train can do far more damage than one thrown from a train at rest; the speed of the particle is not independent of the motion of the object emitting it. And if we take light to consist of particles and assume that these particles obey Newton's laws, they will conform to Newtonian relativity and thus automatically account for the null result of the Michelson-Morley experiment without recourse to contracting lengths, local time, or Lorentz transformations. Yet, as we have seen, Einstein resisted the temptation to account for the null result in terms of particles of light and simple, familiar Newtonian ideas, and introduced as his second postulate something that was more or less obvious when thought of in terms of waves in an ether." Banesh Hoffmann, Relativity and Its Roots, p.92
  • Lara
    I can all but guarantee the big bang will be thrown out entirely in the next couple years.

    Variable Mass theory is as robust as the standard model, and its shortcoming is one the standard model also has.

    We already know that matter decays or strongly suspect that it does.

    The theory explains so many of the "how can it be so big" "why so complex so early" and other proof alarms that should have been telling people for decades that they were just padding a theory that itself was MASSIVELY padded from the start with renormalization.
  • MikeCollinsJr
    Paul the Writer, Sir, you have left out modern society's most valuable constant, that which identifies and differentiates for us Magnetism! Let's band together and launch a new television network for all, the mw. The Moonwatchers' Network! Get tuned!
  • MikeCollinsJr
    Admin said:
    Physicists have measured no changes in time or space for any of the fundamental constants of nature.

    How do we know the fundamental constants are constant? We don't.
    As for "why stars look spiky" in the JWST images, I am happy to report here that the unique phenom of JWST is not only about "brightness", but about identifying for image-viewers and astronomy-learners where H2 is most present within the frame; happy because it allows me to prove, with evidence, one utility of the Magnetism constant: the resonance of JWST's 18 Be-mirrors matches unequivocally the order of H2. To be within reach of image analysts and processors, even if in post, could someday offer a toggle control where image-viewers can allow the spikes to be pulled down and converted to real-image resolution for revealing areas such as H2O H2O resolution]. In other words, with such a filter, the spikes may rearrange to further spatial perception; can JWST be calibrated to "think" this way, too?] I think the beautifully elegant and simple resolution (and spikes) of the Triton+Neptune image, where Triton is described as "icy", clearly relates this for astro-learners: where H2 is way up, so too is the possibility of H2O, thanks to the Be-design of the quantity and orientation of the mirrors.
  • billslugg
    The image spikes are an unavoidable consequence of diffraction around the mirror edges. It is not possible for any algorithm to convert them into usable data. Basic optical theory.
  • Helio
    rod said:
    "That's 28 numbers that completely determine all the physics of the known universe."

    My note. Where did these constants in nature come from and how and when did the Big Bang create the constants seen in nature today? The Big Bang model has the universe we see today evolving from an area the size of an electron or smaller - in the beginning.

    "But those theories do not fully explain themselves. Appearing within the equations are fundamental constants, which are numbers that we must measure independently and plug in by hand. Only with these numbers in place can we use the theories to make new predictions. General relativity depends on only two constants: the strength of gravity (commonly called G) and the cosmological constant (usually denoted by Λ, which measures the amount of energy in the vacuum of space-time)."

    My note. What? How did the Big Bang create these two constants and avoid two fatal values that destroys the universe - in the beginning?
    Yes. You're addressing the fine-tuning view of the universe, where all 28, or more, dials were amazingly set just right at the birth of the Universe.

    The rising alternative is the Multiverse. Mathematics shows a possible range of 10^600 other universes, hence a few could be found like ours. But math is not physics. Is this view testable objectively? Well, one author says there are 6 tests, but only mentions two — two predicted void sizes in the CMBR. But these voids may, IMO, have alternate explanations since they are harmonics of the size of the universe (monopole & dipole, IIRC).
  • MikeCollinsJr
    billslugg said:
    The image spikes are an unavoidable consequence of diffraction around the mirror edges. It is not possible for any algorithm to convert them into usable data. Basic optical theory.
    Hi Billslugg, I have just provided a conversion scenario. Where did you learn basic optical theory? If the diffraction is caused by the resonance of the beryllium, where motion control is part of the image capture process, these spikes can be "directed" to where they should be, i.e. resolving the natural mass and light at the focal length.
  • billslugg
    I studied optical theory, image processing and communications theory at Pennsylvania State University 1970-1974 while taking a Bachelor's Degree in Electrical Engineering.

    There is an iron clad, immutable, 100% true all of the time theorem in communications theory which applies to images, data sets, recordings: "No amount of processing can ever add information." You can alter an image and make it look better, you could even remove the spikes if you wanted to, but it does not add information.

    Diffraction is not caused by "the resonance of the beryllium", it is caused by the shape of the mirrors. It is an unavoidable consequence of the passage of any wave function through any system. If the mirrors were round then the diffraction "spikes" would be concentric circles. This is what limits the performance of any optical system. There is no way around it except by making the mirrors larger or going to shorter wavelengths.
  • rod
    Post #5 jarred my old brain :) Google says, "The magnetic constant μ0 (also known as vacuum permeability or permeability of free space) is a universal physical constant, relating mechanical and electromagnetic units of measurement."

    I looked over my copy of Allen's Astrophysical Quantities, Fourth Edition, 2000. This text has numerous constants used in science today. General Constants and Units, page 7. Is there a published system of record disclosing all of them? :) It seems the more constants used in science to describe nature and How the Universe Works, the more fine-tuning problems appear, or, as Helio indicates in post #8, the Multiverse must be embraced to avoid fine-tuning problems (that could be used to point to special creation, not random acts of nature, as the 1st cause).