A glance at the night sky above Earth shows that some stars are much brighter than others. However, how bright a star appears depends both on its intrinsic luminosity and on how far it is from the planet.
Astronomers define star brightness in terms of apparent magnitude — how bright the star appears from Earth — and absolute magnitude — how bright the star would appear at a standard distance of 32.6 light-years, or 10 parsecs. (A light-year is the distance light travels in one year — about 6 trillion miles, or 10 trillion kilometers.) Astronomers also measure luminosity — the amount of energy (light) that a star emits from its surface.
Measuring star brightness is an ancient idea, but today astronomers use more precise tools to make the measurement. Specifically, they analyze a star's light across the electromagnetic spectrum, focusing on the wavelengths visible to the human eye. Every star has a distinctive spectrum of light, which our eyes perceive as a color.
"It turns out that a star's color spectrum is a good indication of its actual brightness," HowStuffWorks explains. "Astronomers can therefore look at a distant star and determine its color spectrum. From the color, they can determine the star's actual brightness. By knowing the actual brightness and comparing it to the apparent brightness seen from Earth — that is, by looking at how dim the star has become once it reaches Earth — they can determine the distance to the star."
In the 2nd century BCE, more than 2,000 years ago, the Greek astronomer Hipparchus made the first known catalog of stars ranked by brightness. Hipparchus was also interested in the large-scale evolution of the sky, and worked to understand how stars fit into his framework.
Our current-day magnitude scale is based upon one established by the Roman astronomer Claudius Ptolemy, who created a star catalog in the 2nd century CE. (Ptolemy is more famous for creating an Earth-centered model of the universe, based on his interpretation of planetary movements with the naked eye; it was largely accepted by the astronomical community up to the invention of the telescope.)
Ptolemy's catalog listed stars from brightest (which he deemed first magnitude) to dimmest (sixth magnitude). Ptolemy, however, was limited in that he could only look at the brightness of stars with the naked eye. The telescope, introduced in the 17th century, revealed far more stars than what the human eye can see.
Writing in 1610, Galileo Galilei was one of the first to record the power of telescopic observations; he used the telescope to observe craters on the moon and moons circling Jupiter, among other things.
"Indeed, with the glass you will detect below stars of the sixth magnitude such a crowd of others that escape natural sight that it is hardly believable," he said in Sidereus Nuncius (The Starry Messenger). "The largest of these . . . we may designate as of the seventh magnitude."
Rapidly evolving telescopes quickly allowed astronomers to see dimmer and dimmer objects, and today professional telescopes can be quite sensitive indeed. "Today a pair of 50-millimeter binoculars will show stars of about 9th magnitude, a 6-inch amateur telescope will reach to 13th magnitude, and the Hubble Space Telescope has seen objects as faint as 31st magnitude," Sky & Telescope said.
Because it's hard to judge the difference between magnitudes using the naked eye, in the 19th century astronomers began to discuss tools to more precisely measure magnitude.
"They had already determined that a 1st-magnitude star shines with about 100 times the light of a 6th-magnitude star," Sky & Telescope continued. "Accordingly, in 1856 the Oxford astronomer Norman R. Pogson proposed that a difference of five magnitudes be exactly defined as a brightness ratio of 100 to 1. This convenient rule was quickly adopted."
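Pogson's rule gives the magnitude scale a simple closed form: a difference of delta_m magnitudes corresponds to a brightness ratio of 100 raised to the power delta_m / 5, so each single magnitude step is a factor of about 2.512. A minimal sketch:

```python
def brightness_ratio(delta_mag):
    """Brightness ratio corresponding to a magnitude difference.

    By Pogson's definition, 5 magnitudes = a ratio of exactly 100,
    so one magnitude step is a factor of 100 ** (1/5), about 2.512.
    """
    return 100 ** (delta_mag / 5)

print(brightness_ratio(5))             # 100.0
print(round(brightness_ratio(1), 3))   # 2.512
```

Note that the scale is logarithmic, which is why adding magnitudes multiplies brightness ratios.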
Today we can measure this difference using a metric called "flux," which defines the rate at which energy reaches us from the star per unit area. Our modern-day tool of choice is the light-sensitive charge-coupled device (CCD) inside digital cameras. Notably, a European Space Agency mission called Gaia is on a long-term quest to measure the brightness of stars with extreme precision from space, which will help keep our catalogs updated.
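In terms of flux, the magnitude difference between two sources follows the standard relation m1 - m2 = -2.5 * log10(F1 / F2); a short sketch (the function name and example flux values are illustrative):

```python
import math

def magnitude_difference(flux, flux_ref):
    """Magnitude of a source relative to a reference, from measured fluxes.

    A source delivering less flux than the reference gets a larger
    (i.e. fainter) magnitude, matching the inverted magnitude scale.
    """
    return -2.5 * math.log10(flux / flux_ref)

# A star delivering 1% of the reference flux is 5 magnitudes fainter:
print(magnitude_difference(0.01, 1.0))  # 5.0
```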
So now we can understand how the magnitude scale works in practice when looking at the sky. By long-standing convention, astronomers use Vega as the reference star for measuring magnitude; it was originally defined as magnitude 0, but more precise measurements now place it at about magnitude 0.03.
Most people with average vision can see objects as dim as sixth magnitude in dark locations. Stars can get as bright as nearly -1.5 magnitude, the International Space Station appears as bright as -6 magnitude, and the moon as bright as almost -13 magnitude. The sun, which is too bright to view safely with your bare eyes, is at nearly -27 magnitude.
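Plugging these values into the standard 100 ** (delta_m / 5) relation shows just how wide this range is. For example, the sun (about magnitude -27) appears roughly 16 trillion times brighter than the faintest naked-eye star (about +6):

```python
def apparent_ratio(m_faint, m_bright):
    """How many times brighter the brighter object appears, from the
    difference of the two apparent magnitudes (Pogson's rule)."""
    return 100 ** ((m_faint - m_bright) / 5)

# Sun (about -27) versus the faintest naked-eye star (about +6):
print(f"{apparent_ratio(6, -27):.2e}")   # ~1.6e13
```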
When taking Earth as a reference point, the scale of magnitude fails to account for the true differences in brightness between stars. Apparent magnitude (or brightness) depends on the location of the observer.
Different observers will come up with a different measurement, depending on their locations and distance from the star. Stars or objects that are closer to Earth, but fainter, could appear brighter than far more luminous ones that are far away.
"The apparent magnitude of an object only tells us how bright an object appears from Earth. It does not tell us how bright the object is compared to other objects in the universe," Las Cumbres Observatory stated.
"For example, from Earth the planet Venus appears brighter than any star in the sky. However, Venus is really much less bright than stars; it is just very close to us. Conversely, an object that appears very faint from Earth may actually be very bright, but very far away."
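The effect described here is the inverse-square law: the flux we receive is the object's luminosity divided by 4 * pi * d squared, so brightness falls off rapidly with distance. With toy, illustrative numbers, a low-luminosity nearby object can out-shine a far more luminous distant one:

```python
import math

def apparent_flux(luminosity, distance):
    """Inverse-square law: received flux falls off as 1 / distance**2."""
    return luminosity / (4 * math.pi * distance ** 2)

# Toy numbers in arbitrary units: a dim nearby object versus an object
# 10,000 times more luminous but 1,000 times farther away.
nearby_dim = apparent_flux(luminosity=1.0, distance=1.0)
distant_bright = apparent_flux(luminosity=10_000.0, distance=1_000.0)
print(nearby_dim > distant_bright)  # True
```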
To be sure, it's okay to talk about apparent magnitude when speaking about astronomy on an amateur level. Most of us are limited to using binoculars or small telescopes and do not need to perform professional calculations to enjoy our observations. But professionals, studying stars in their context, use another metric called absolute magnitude.
Absolute magnitude, unlike apparent magnitude, gives us a common reference for comparing stars. It expresses the brightness a star would have if it were at a standard distance of 32.6 light-years, or 10 parsecs, from Earth.
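The conversion between the two magnitudes uses the standard distance-modulus relation, M = m - 5 * log10(d / 10 pc). A minimal sketch, using the sun's well-known values as a check (apparent magnitude about -26.74 at 1 astronomical unit, which is roughly 4.848e-6 parsecs):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Magnitude the object would have at the standard 10-parsec distance,
    via the distance modulus: M = m - 5 * log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# The sun: apparent magnitude ~ -26.74 at ~4.848e-6 pc (1 AU).
print(round(absolute_magnitude(-26.74, 4.848e-6), 2))  # ~4.83
```

Seen from 10 parsecs, the sun would be an unremarkable fifth-magnitude star, barely visible to the naked eye.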
While the absolute magnitude scale is astronomers' best effort to compare the brightness of stars, there are a couple of main limitations that have to do with the instruments that are used to measure it.
First, astronomers must define which wavelength of light they are using to make the measurement. Stars can emit radiation in forms ranging from high-energy X-rays to low-energy infrared radiation. Depending on the type of star, they could be bright in some of these wavelengths and dimmer in others.
To address this, scientists must specify which wavelength they are using to make the absolute magnitude measurements.
Another key limitation is the sensitivity of the instrument used to make the measurement. In general, as computers have advanced and telescope mirror technology has improved over the years, measurements made in recent years carry more weight among scientists than those made long ago.
Paradoxically, the brightest stars are among the least studied by astronomers, but there is at least one recent effort to catalog their luminosity. A constellation of nanosatellites called BRITE (BRight Target Explorer) is measuring the variability of brightness between stars. Participants in the six-satellite project include Austria, Canada and Poland. As of 2022, five of the satellites remain operational.
While many stars have a consistent brightness, there are more than 100,000 known and cataloged variable stars. (Even our own sun is variable, varying its energy output by about 0.1 percent, or about one-thousandth of a magnitude, during its 11-year solar cycle.)
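That parenthetical arithmetic follows from the same -2.5 * log10 relation that defines the magnitude scale; a quick check:

```python
import math

# Magnitude change corresponding to a 0.1% increase in flux:
delta_m = -2.5 * math.log10(1.001)
print(round(abs(delta_m), 4))  # ~0.0011
```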
Variable stars can change over the short term or over the long term. Astronomers tend to talk much more about short-term variable stars, unless they are interested in learning about stellar evolution or cosmology (the study of the universe's history).
Short-term variable stars come in two flavors. One is intrinsic, meaning their luminosity changes due to features such as expansion, contraction, eruption or pulsation. The second is extrinsic, meaning that a star or planet passes in front of the star and blocks the light, or that the change is due to stellar rotation.
There are lots of kinds of short-term variable stars. The most commonly cited one is Cepheid variables, which are extremely luminous stars that have short pulsation periods.
The variations in the luminosity allow astronomers to calculate how far away these Cepheids are, making them useful "measuring sticks" if the stars are embedded in galaxies or nebulae. Ultimately, this allows us to estimate the expansion of the universe by tracking the Hubble Constant, although scientists disagree on how fast the expansion is proceeding.
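A sketch of the idea, assuming one commonly quoted V-band period-luminosity calibration (the coefficients below are illustrative; published calibrations differ, and real measurements also need corrections for interstellar extinction):

```python
import math

def cepheid_absolute_mag(period_days):
    """Absolute V magnitude from pulsation period, using one commonly
    quoted period-luminosity calibration (coefficients illustrative;
    published calibrations differ)."""
    return -2.43 * (math.log10(period_days) - 1.0) - 4.05

def distance_pc(apparent_mag, absolute_mag):
    """Distance from the distance modulus, ignoring extinction:
    d = 10 ** ((m - M + 5) / 5) parsecs."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A hypothetical Cepheid with a 10-day period observed at magnitude 15:
M = cepheid_absolute_mag(10.0)       # -4.05 under this calibration
print(distance_pc(15.0, M))          # roughly 65,000 parsecs
```

The longer the pulsation period, the more luminous the Cepheid, which is what turns an easily timed quantity into a distance.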
Other types of short-term variable stars include cataclysmic variables, which brighten due to outbursts such as supernova explosions, and eruptive variables, whose brightness varies during eruptions on their surfaces.
As for long-term variable stars, we know of at least a few stars that have changed brightness over many centuries. The North Star, or Polaris, for example, may have been as much as 4.6 times brighter in ancient times than it is today. A 2014 study noted that the star had dimmed over the preceding decades, but then drastically brightened again.
Finally, there are stars that are variable for extrinsic reasons. Examples include eclipsing binary stars, where one star passes in front of another and temporarily blocks some of the farther star's light from reaching Earth. Another example is a rotating star such as a pulsar. Pulsars are the rapidly rotating cores of old stars that exploded as supernovas; their electromagnetic radiation is visible only when the beam is directed at Earth.
Top 26 brightest stars
Below are the top 26 brightest stars as seen from Earth, with both their apparent magnitude and absolute magnitude listed.
Additional reading and resources
To learn more about the brightest stars that are currently visible in the night sky, check out EarthSky.org. NASA also has a helpful guide on how to find good places to stargaze.
- "What Is Absolute Magnitude?" Las Cumbres Observatory. (n.d., accessed Jan. 23, 2022). https://lco.global/spacebook/distance/what-absolute-magnitude
- "How Do Astronomers Measure the Brightness of Stars?" Jagadheep D. Pandian. Ask An Astronomer: Cornell University. (June 17, 2005). http://curious.astro.cornell.edu/about-us/82-the-universe/stars-and-star-clusters/measuring-the-stars/391-how-do-astronomers-measure-the-brightness-of-stars-intermediate
- "The Stellar Magnitude System." Alan MacRobert. Sky & Telescope. (Aug. 1, 2006). https://skyandtelescope.org/astronomy-resources/the-stellar-magnitude-system
- "How are astronomers able to measure how far away a star is?" HowStuffWorks. (April 1, 2000). https://science.howstuffworks.com/question224.htm