(Image credit: Starry Night Software)
A glance at the night sky above Earth shows that some stars are much brighter than others. However, the brightness of a star depends both on its composition and on how far it is from Earth.
Astronomers define star brightness in terms of apparent magnitude (how bright the star appears from Earth) and absolute magnitude (how bright the star appears at a standard distance of 32.6 light years, or 10 parsecs). Astronomers also measure luminosity — the amount of energy (light) that a star emits from its surface.
Measuring star brightness is an ancient idea, but today astronomers use far more precise tools to make the measurement.
From Greek to modern times
More than 2,000 years ago, the Greek astronomer Hipparchus was the first to catalog stars according to their brightness, noted Dave Rothstein, who participated in Cornell University's "Ask An Astronomer" website in 2003.
"Basically, he looked at the stars in the sky and classified them by how bright they appear — the brightest stars were 'magnitude 1', the next brightest were 'magnitude 2', etc., down to 'magnitude 6', which were the faintest stars he could see," Rothstein wrote.
Human eyes, however, are not very discerning. Large differences in brightness actually appear much smaller using this scale, Rothstein said. Light-sensitive charge-coupled devices (CCDs) inside digital cameras measure the amount of light coming from stars, and can provide a more precise definition of brightness.
On the modern scale, astronomers define a difference of five magnitudes as a brightness ratio of exactly 100. Vega was used as the reference star for the scale; it was initially assigned a magnitude of 0, but more precise instrumentation later shifted that value to 0.03.
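Put as a formula, each step of one magnitude corresponds to a brightness factor of the fifth root of 100, or about 2.512. Here is a minimal Python sketch of that relationship (the function name and the sample comparison are illustrative, not drawn from a particular catalog):

```python
def brightness_ratio(mag_a, mag_b):
    """How many times brighter object A appears than object B.

    On the modern magnitude scale, a difference of five magnitudes is
    defined as a brightness ratio of exactly 100, so each magnitude
    step is a factor of 100 ** (1 / 5), roughly 2.512.
    """
    return 100 ** ((mag_b - mag_a) / 5)


# Example: Sirius (apparent magnitude -1.46) compared with a magnitude-6
# star, roughly the faintest star visible to the naked eye.
print(brightness_ratio(-1.46, 6.0))  # about 960
```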
Apparent magnitude vs. absolute magnitude
When taking Earth as a reference point, however, the scale of magnitude fails to account for the true differences in brightness between stars. The apparent brightness, or apparent magnitude, depends on the location of the observer. Different observers will come up with a different measurement, depending on their locations and distance from the star. Stars that are closer to Earth, but fainter, could appear brighter than far more luminous ones that are far away.
"It is the 'true' brightness — with the distance dependence factored out — that is of most interest to us as astronomers," stated an online course on astronomy from the University of Tennessee.
"Therefore, it is useful to establish a convention whereby we can compare two stars on the same footing, without variations in brightness due to differing distances complicating the issue."
(Image credit: ESO/Digitized Sky Survey 2)
The solution was to implement an absolute magnitude scale to provide a reference between stars. To do so, astronomers calculate the brightness of stars as they would appear if they were 32.6 light-years, or 10 parsecs, from Earth.
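In practice, the conversion uses the distance modulus relation, M = m - 5 log10(d / 10 pc). Below is a brief sketch, assuming the star's distance is known in parsecs (the function name and example figures are illustrative):

```python
import math


def absolute_magnitude(apparent_mag, distance_parsecs):
    """Convert an apparent magnitude to an absolute magnitude.

    Applies the distance modulus M = m - 5 * log10(d / 10): the magnitude
    the star would have if it sat 10 parsecs (32.6 light-years) from Earth.
    """
    return apparent_mag - 5 * math.log10(distance_parsecs / 10)


# Example: Sirius, apparent magnitude -1.46 at roughly 2.64 parsecs
# (8.6 light-years), comes out near +1.4, matching the table below.
print(round(absolute_magnitude(-1.46, 2.64), 1))
```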
Another measure of brightness is luminosity, which is the power of a star — the amount of energy (light) that a star emits from its surface. It can be expressed in watts, but astronomers more often state it in terms of the luminosity of the sun. For example, the sun's luminosity is 400 trillion trillion watts. One of the closest stars to Earth, Alpha Centauri A, is about 1.3 times as luminous as the sun.
To figure out luminosity from absolute magnitude, one uses the rule that a difference of five on the absolute magnitude scale corresponds to a factor of 100 on the luminosity scale — for instance, a star with an absolute magnitude of 1 is 100 times as luminous as a star with an absolute magnitude of 6.
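A short sketch of that rule, using the same convention as above (the helper name and the Rigel-versus-sun comparison, taken from the table below, are illustrative):

```python
def luminosity_ratio(abs_mag_a, abs_mag_b):
    """How many times more luminous star A is than star B.

    A difference of five absolute magnitudes corresponds to a factor of
    100 in luminosity, so the ratio is 100 ** ((M_b - M_a) / 5).
    """
    return 100 ** ((abs_mag_b - abs_mag_a) / 5)


# Example from the text: a star of absolute magnitude 1 versus one of 6.
print(luminosity_ratio(1, 6))  # 100.0

# Using the table below: Rigel (-8.1) versus the sun (4.2) works out to
# roughly 80,000 times the sun's luminosity.
print(round(luminosity_ratio(-8.1, 4.2)))
```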
Limitations of absolute magnitude
While the absolute magnitude scale is astronomers' best effort to compare the brightness of stars, there are a couple of main limitations that have to do with the instruments used to measure it.
First, astronomers must define which wavelength of light they are using to make the measurement. Stars can emit radiation in forms ranging from high-energy X-rays to low-energy infrared radiation. Depending on the type of star, they could be bright in some of these wavelengths and dimmer in others.
To address this, scientists must specify which wavelength they are using to make the absolute magnitude measurements.
Another key limitation is the sensitivity of the instrument used to make the measurement. In general, as computers have advanced and telescope mirror technology has improved, measurements made in recent years carry more weight among scientists than those made long ago.
Paradoxically, the brightest stars are among the least studied by astronomers, but there is at least one recent effort to catalog their luminosity. A constellation of satellites called BRITE (BRight Target Explorer) will measure the brightness variations of the brightest stars. Participants in the six-satellite project include Austria, Canada and Poland. The first two satellites launched successfully in 2013.
Top 10 brightest stars, as seen from Earth
| Common name | Constellation | Apparent Magnitude | Absolute Magnitude | Distance from Earth |
| --- | --- | --- | --- | --- |
| Sun | n/a | -26.72 | 4.2 | 93 million miles |
| Sirius | Canis Major | -1.46 | 1.4 | 8.6 light-years |
| Canopus | Carina | -0.72 | -2.5 | 74 light-years |
| Rigil Kentaurus | Centaurus | -0.27 | 4.4 | 4.3 light-years |
| Arcturus | Boötes | -0.04 | 0.2 | 34 light-years |
| Vega | Lyra | 0.03 | 0.6 | 25 light-years |
| Capella | Auriga | 0.08 | 0.4 | 41 light-years |
| Rigel | Orion | 0.12 | -8.1 | 1,400 light-years |
| Procyon | Canis Minor | 0.38 | 2.6 | 11.4 light-years |
| Achernar | Eridanus | 0.46 | -1.3 | 69 light-years |
(Source: Chris Dolan, University of Wisconsin-Madison Department of Astronomy. He adapted it from Norton's 2000.0, 18th edition (1989), along with Bill Baity's Sky Pages.)
— Elizabeth Howell, SPACE.com Contributor



