Apparent Magnitude and Absolute Magnitude

The apparent magnitude of a celestial body is a measure of its brightness as seen by an observer on Earth, normalized to the value it would have in the absence of the atmosphere.

The apparent magnitudes of some familiar objects:
The Sun: -26.73
Full Moon: -12.6
Venus at maximum brightness: -4.6
Mars at maximum brightness: -2.9
Jupiter at maximum brightness: -2.9
Sirius, the brightest star in the night sky: -1.47
The bright star Vega: 0.0
Approximate naked-eye limit under ideal conditions: +6.5
Faintest stars visible in 9x50 binoculars: +9.5
Faintest stars visible in my 12" telescope from an urban site under an average moonless sky: +15.7
Faintest stars visible to the HST at visual wavelengths: +30
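
A note on reading those numbers: the magnitude scale is logarithmic and runs backwards, so lower (more negative) values mean brighter, and a difference of 5 magnitudes corresponds to a factor of 100 in brightness. Here is a short, illustrative Python sketch of that conversion, comparing the Sun and Sirius from the list above:

    def brightness_ratio(m1, m2):
        # Pogson's relation: a 5-magnitude difference is a factor of 100,
        # so the flux ratio is 10^(0.4 * (m2 - m1)).
        return 10 ** (0.4 * (m2 - m1))

    # Sun (-26.73) versus Sirius (-1.47): roughly 1.3e10, i.e. the Sun
    # appears about 13 billion times brighter than Sirius.
    print(brightness_ratio(-26.73, -1.47))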

Naturally, if all stars had the same intrinsic brightness, the closer ones would appear brighter and the more distant ones fainter. In fact, you could measure how far away a star is just by measuring how bright it appears.
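
That follows from the inverse-square law: a star's light spreads over a sphere, so the brightness we receive falls off as 1/distance². A minimal sketch of that relation (the luminosity and distances below are just illustrative values, not from the post):

    import math

    def apparent_flux(luminosity_watts, distance_m):
        # Inverse-square law: the star's light spreads over a sphere
        # of area 4 * pi * d^2, so received flux falls as 1/d^2.
        return luminosity_watts / (4 * math.pi * distance_m ** 2)

    # Two identical Sun-like stars (about 3.8e26 W), one twice as far away
    # as the other, differ in apparent brightness by a factor of four.
    near = apparent_flux(3.8e26, 1.0e17)
    far = apparent_flux(3.8e26, 2.0e17)
    print(near / far)  # 4.0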

But things in Nature are never that simple. In our Universe there are small, average stars very close to Earth that appear quite bright, and humongous, brilliant stars very far away that appear just as bright, or brighter, as seen from Earth. In order to compare apples to apples, as it were, we need a system that levels the playing field. That is where absolute magnitude comes into play.

In astronomy, absolute magnitude measures an object's intrinsic brightness. It equals the apparent magnitude the object would have if viewed from a standard distance: 10 parsecs for stars, or 1 Astronomical Unit for Solar System bodies such as planets, asteroids, and comets. This allows the true brightnesses of objects to be compared without regard to their distances from us.
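
For stars, the link between the two is the distance modulus, m - M = 5 log10(d / 10 pc). A small illustrative Python sketch, checked against Sirius (apparent magnitude -1.47 at about 2.64 parsecs) and the Sun:

    import math

    def absolute_magnitude(apparent_mag, distance_pc):
        # Distance modulus: M = m - 5 * log10(d / 10 pc)
        return apparent_mag - 5 * math.log10(distance_pc / 10)

    # Sirius: m = -1.47 at about 2.64 pc  ->  roughly +1.4
    print(round(absolute_magnitude(-1.47, 2.64), 2))

    # The Sun: m = -26.73 at 1 AU (about 4.848e-6 pc)  ->  roughly +4.8
    print(round(absolute_magnitude(-26.73, 4.848e-6), 2))

Both results land near the published values (about +1.4 for Sirius and +4.8 for the Sun), which is the whole point of absolute magnitude: the Sun only looks overwhelming because it is close.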
