
# Apparent magnitude

The apparent magnitude (m) of a star, planet or other celestial body is a measure of its apparent brightness; that is, the amount of light received from the object. An object 100 times less bright (for example, the same object ten times as far away) has an apparent magnitude greater by five; an object 2.512 times less bright (the same object about 1.585 times as far away) has an apparent magnitude greater by one, 2.512 being the fifth root of 100 ($100^{0.2}$).

As the amount of light received actually depends on the thickness of the atmosphere in the line of sight to the object, the apparent magnitude is normalized to the value it would have outside the atmosphere. The dimmer an object appears, the higher its apparent magnitude. Note that apparent brightness is not equal to actual brightness — an extremely bright object may appear quite dim if it is far away. The rate at which apparent brightness falls as the distance to an object increases is given by the inverse-square law (at cosmological distance scales this is no longer quite true, because of the curvature of space).

The absolute magnitude, M, of a star or galaxy is the apparent magnitude it would have if it were 10 parsecs away; that of a planet (or other solar system body) is the apparent magnitude it would have if it were 1 astronomical unit away from both the Sun and Earth. The absolute magnitude of the Sun is +4.83 in the V band (yellow) and +5.48 in the B band (blue).
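The relation above — a difference of five magnitudes corresponding to a factor of 100 in brightness — can be sketched numerically. This is an illustrative helper, not part of the original article:

```python
import math

def brightness_ratio(m1, m2):
    """Ratio of the brightness of object 1 to object 2, given their
    apparent magnitudes. A difference of 5 magnitudes is a factor of 100."""
    return 100 ** ((m2 - m1) / 5)

# A first-magnitude star is exactly 100 times brighter than a sixth-magnitude star.
r_1_to_6 = brightness_ratio(1, 6)      # -> 100.0

# Sun (m = -26.8) versus full Moon (m = -12.6): roughly 480,000 times brighter.
r_sun_moon = brightness_ratio(-26.8, -12.6)
```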

Scale of apparent magnitudes

| App. mag. | Celestial object |
|---|---|
| −26.8 | Sun |
| −12.6 | Full Moon |
| −4.4 | Maximum brightness of Venus |
| −2.8 | Maximum brightness of Mars |
| −1.5 | Brightest star at visible wavelengths: Sirius |
| −0.7 | Second-brightest star: Canopus |
| 0 | The zero point by definition: historically Vega (see references for the modern zero point) |
| +3.0 | Faintest stars visible in an urban neighborhood |
| +6.0 | Faintest stars observable with the naked eye |
| +12.6 | Brightest quasar |
| +27 | Faintest objects observable in visible light with 8 m ground-based telescopes |
| +30 | Faintest objects observable in visible light with the Hubble Space Telescope |
| +38 | Faintest objects observable in visible light with the planned OWL (2020) |

The scale upon which magnitude is measured has its origin in the Hellenistic practice of dividing those stars visible to the naked eye into six magnitudes. The brightest stars were said to be of first magnitude (m = +1), while the faintest were of sixth magnitude (m = +6), the limit of human visual perception (without the aid of a telescope). Each grade of magnitude was considered to be twice the brightness of the following grade. This somewhat crude method of indicating the brightness of stars was popularized by Ptolemy in his Almagest, and is generally believed to have originated with Hipparchus. This original system did not measure the magnitude of the Sun. Because the response of the eye to light is logarithmic, the resulting scale is also logarithmic.

In 1856, Pogson formalized the system by defining a typical first-magnitude star as one 100 times brighter than a typical sixth-magnitude star; thus, a first-magnitude star is about 2.512 times brighter than a second-magnitude star. The fifth root of 100, an irrational number approximately equal to 2.512, is known as Pogson's ratio. Pogson's scale was originally fixed by assigning Polaris a magnitude of 2. Astronomers later discovered that Polaris is slightly variable, so they first switched to Vega as the standard reference star, and then to tabulated zero points for the measured fluxes (see the second reference below). The magnitude depends on the wavelength band (see below).

The modern system is no longer limited to six magnitudes. Very bright objects have negative magnitudes. For example, Sirius, the brightest star of the celestial sphere, has an apparent magnitude of −1.44 to −1.46. The modern scale includes the Moon and the Sun: the Moon has an apparent magnitude of −12.6 and the Sun an apparent magnitude of −26.8. The Hubble and Keck telescopes have located stars with magnitudes of +30.

The apparent magnitude in the band x can be defined as

$m_{x}= -2.5 \log_{10} (F_x) + C\!\,$

where $F_x\!\,$ is the observed flux in the band x, and $C\!\,$ is a constant that depends on the units of the flux and on the band. The constant $C\!\,$ is defined in Aller et al. 1982 for the most commonly used system.
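The definition can be sketched in code. Folding the constant $C$ into a reference (zero-point) flux $F_0$ for the band is a common convenience, so that $m_x = -2.5 \log_{10}(F_x/F_0)$; the zero-point value here is a placeholder, not a tabulated constant from the reference:

```python
import math

def apparent_magnitude(flux, zero_point_flux):
    """m_x = -2.5 * log10(F_x / F_0), where F_0 is the band's zero-point flux.
    This is equivalent to m_x = -2.5*log10(F_x) + C with C = 2.5*log10(F_0)."""
    return -2.5 * math.log10(flux / zero_point_flux)

F0 = 1.0                                  # hypothetical zero-point flux for the band
m_ref = apparent_magnitude(F0, F0)        # an object at the zero-point flux: m = 0
m_dim = apparent_magnitude(F0 / 100, F0)  # 100 times fainter: m = +5
```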

Note that the scale is logarithmic: the relative brightness of two objects is determined by the difference of their magnitudes. For example, a difference of 3.2 magnitudes means that one object is about 19 times as bright as the other, since Pogson's ratio raised to the power 3.2 is approximately 19.05. The logarithmic nature of the scale reflects the fact that the human eye itself has a roughly logarithmic response; see the Weber–Fechner law.
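The worked example above runs in the other direction too: given a brightness ratio, the magnitude difference follows directly. A small sketch:

```python
import math

def magnitude_difference(flux_ratio):
    """Magnitude difference corresponding to a given brightness ratio:
    delta_m = 2.5 * log10(F1 / F2)."""
    return 2.5 * math.log10(flux_ratio)

dm_100 = magnitude_difference(100)    # a factor of 100 is exactly 5 magnitudes
dm_19 = magnitude_difference(19.05)   # a factor of ~19 is about 3.2 magnitudes
```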

Magnitude is complicated by the fact that light is not monochromatic. The sensitivity of a light detector varies according to the wavelength of the light, and the way in which it varies depends on the type of light detector. For this reason, it is necessary to specify how the magnitude is measured in order for the value to be meaningful. For this purpose the UBV system is widely used, in which the magnitude is measured in three different wavelength bands: U (centred at about 350 nm, in the near ultraviolet), B (about 435 nm, in the blue region) and V (about 555 nm, in the middle of the human visual range in daylight). The V band was chosen for spectral purposes and gives magnitudes closely corresponding to those seen by the light-adapted human eye, and when an apparent magnitude is given without any further qualification, it is usually the V magnitude that is meant, more or less the same as visual magnitude.

Since cooler stars, such as red giants and red dwarfs, emit little energy in the blue and UV regions of the spectrum their power is often under-represented by the UBV scale. Indeed, some L and T class stars have an estimated magnitude of well over 100, since they emit extremely little visible light, but are strongest in infrared.

Measures of magnitude need cautious treatment, and it is extremely important to measure like with like. On early 20th-century and older orthochromatic (blue-sensitive) photographic film, the relative brightnesses of the blue supergiant Rigel and the red supergiant Betelgeuse (an irregular variable, at maximum brightness) are reversed compared to what our eyes perceive, since this archaic film is more sensitive to blue light than to red light.

For an object with a given absolute magnitude, 5 is added to the apparent magnitude for every tenfold increase in the distance to the object.
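This distance relation (the distance modulus, $m = M + 5 \log_{10}(d / 10\,\mathrm{pc})$) can be sketched as follows; the helper name is illustrative:

```python
import math

def apparent_from_absolute(M, distance_pc):
    """Apparent magnitude of an object with absolute magnitude M at the
    given distance in parsecs: m = M + 5*log10(d / 10 pc).
    Each tenfold increase in distance adds 5 to the apparent magnitude."""
    return M + 5 * math.log10(distance_pc / 10)

# The Sun's absolute V magnitude is +4.83, so at 10 pc it would appear at m = 4.83,
# and at 100 pc (ten times farther) at m = 9.83.
m_10pc = apparent_from_absolute(4.83, 10)
m_100pc = apparent_from_absolute(4.83, 100)
```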