This wide-field view of the sky around the bright star Alpha Centauri was created from photographic images forming part of the Digitized Sky Survey 2. The star appears so large only because of the scattering of light by the telescope's optics as well as in the photographic emulsion. Alpha Centauri is the closest star system to the Solar System. Image released Oct. 17, 2012.
Credit: ESO/Digitized Sky Survey 2
A glance at the night sky above Earth shows that some stars are much brighter than others. However, how bright a star appears depends both on its composition and on its distance from Earth.
Astronomers define star brightness in terms of apparent magnitude (how bright the star appears from Earth) and absolute magnitude (how bright the star appears at a standard distance of 32.6 light years, or 10 parsecs). Astronomers also measure luminosity — the amount of energy (light) that a star emits from its surface.
Measuring star brightness is an ancient idea, but today astronomers use far more precise tools to make the measurement.
From Greek to modern times
More than 2,000 years ago, the Greek astronomer Hipparchus was the first to make a catalog of stars according to their brightness, according to Dave Rothstein, who participated in Cornell University's "Ask An Astronomer" website in 2003.
"Basically, he looked at the stars in the sky and classified them by how bright they appear — the brightest stars were 'magnitude 1', the next brightest were 'magnitude 2', etc., down to 'magnitude 6', which were the faintest stars he could see," Rothstein wrote.
Human eyes, however, are not very discerning. Large differences in brightness actually appear much smaller using this scale, Rothstein said. Light-sensitive charge-coupled devices (CCDs) inside digital cameras measure the amount of light coming from stars, and can provide a more precise definition of brightness.
Using this more precise scale, astronomers now define a difference of five magnitudes as a brightness ratio of exactly 100. Vega was chosen as the reference star for the scale; it was initially assigned a magnitude of 0, but more precise instrumentation later revised that to about 0.03.
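The modern scale described above can be written as a simple formula: a difference of five magnitudes equals a brightness ratio of 100, so each single magnitude step is a factor of 100^(1/5), about 2.512. A minimal sketch (the function name is for illustration only):

```python
# Sketch of the modern magnitude scale: 5 magnitudes = a flux ratio
# of exactly 100, so one magnitude step is 100 ** (1/5) ≈ 2.512.

def flux_ratio(delta_mag):
    """Brightness (flux) ratio between two stars differing by delta_mag magnitudes."""
    return 100 ** (delta_mag / 5)

print(round(flux_ratio(5)))     # a 5-magnitude gap: factor of 100
print(round(flux_ratio(1), 3))  # a 1-magnitude gap: factor of ~2.512
```

Note that because lower magnitudes mean brighter stars, a magnitude-1 star is about 2.5 times as bright as a magnitude-2 star, and 100 times as bright as a magnitude-6 star.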
Apparent magnitude vs. absolute magnitude
When taking Earth as a reference point, however, the scale of magnitude fails to account for the true differences in brightness between stars. The apparent brightness, or apparent magnitude, depends on the location of the observer. Different observers will come up with a different measurement, depending on their locations and distance from the star. Stars that are closer to Earth, but fainter, could appear brighter than far more luminous ones that are far away.
"It is the 'true' brightness — with the distance dependence factored out — that is of most interest to us as astronomers," stated an online course on astronomy from the University of Tennessee.
"Therefore, it is useful to establish a convention whereby we can compare two stars on the same footing, without variations in brightness due to differing distances complicating the issue."
The solution was to implement an absolute magnitude scale to provide a reference between stars. To do so, astronomers calculate the brightness of stars as they would appear if they were 32.6 light-years, or 10 parsecs, from Earth.
Another measure of brightness is luminosity, which is the power of a star — the amount of energy (light) that a star emits from its surface. It can be expressed in watts, but astronomers often measure it in terms of the luminosity of the sun. For example, the sun's luminosity is about 400 trillion trillion watts. One of the closest stars to Earth, Alpha Centauri A, is about 1.3 times as luminous as the sun.
To figure out luminosity from absolute magnitude, one must calculate that a difference of five on the absolute magnitude scale is equivalent to a factor of 100 on the luminosity scale — for instance, a star with an absolute magnitude of 1 is 100 times as luminous as a star with an absolute magnitude of 6.
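The conversion described above follows directly from the magnitude scale: each five-magnitude difference in absolute magnitude corresponds to a factor of 100 in luminosity. A short sketch (function name is illustrative):

```python
# Convert a difference in absolute magnitudes to a luminosity ratio:
# 5 magnitudes of difference = a factor of 100 in luminosity.

def luminosity_ratio(abs_mag_a, abs_mag_b):
    """How many times more luminous star A is than star B."""
    return 100 ** ((abs_mag_b - abs_mag_a) / 5)

# The example from the text: absolute magnitude 1 vs. absolute magnitude 6
print(luminosity_ratio(1, 6))  # 100.0
```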
Limitations of absolute magnitude
While the absolute magnitude scale is astronomers' best effort to compare the brightness of stars, there are a couple of main limitations that have to do with the instruments used to measure it.
First, astronomers must define which wavelength of light they are using to make the measurement. Stars can emit radiation in forms ranging from high-energy X-rays to low-energy infrared radiation. Depending on the type of star, they could be bright in some of these wavelengths and dimmer in others.
To address this, scientists must specify which wavelength they are using to make the absolute magnitude measurements.
Another key limitation is the sensitivity of the instrument used to make the measurement. In general, as computers and telescope mirror technology have improved over the years, recent measurements carry more weight among scientists than those made long ago.
Paradoxically, the brightest stars are among the least studied by astronomers, but there is at least one recent effort to catalog their luminosity. A constellation of satellites called BRITE (BRight Target Explorer) will measure the variability of brightness between stars. Participants in the six-satellite project include Austria, Canada and Poland. The first two satellites launched successfully in 2013.
Top 26 brightest stars, as seen from Earth
| Star | Constellation | Apparent magnitude | Absolute magnitude | Distance |
| --- | --- | --- | --- | --- |
| Sun | n/a | -26.72 | 4.2 | 93 million miles |
| Sirius | Canis Major | -1.46 | 1.4 | 8.6 light-years |
| Procyon | Canis Minor | 0.38 | 2.6 | 11.4 light-years |
| Fomalhaut | Piscis Austrinus | 1.16 | 2.0 | 22 light-years |
| Adhara | Canis Major | 1.50 | -4.8 | 570 light-years |

(Source: Chris Dolan, University of Wisconsin-Madison Department of Astronomy. He adapted it from Norton's 2000.0, 18th edition (1989) along with Bill Baity's Sky Pages.)