Sunspots and Common Sense

Like a freckle on the face of our friendly neighborhood star, a sunspot shows up as a dark area on the sun. Just as one might suspect, sunspots are cooler than their surroundings. They can show so much contrast to the hotter and brighter regions of the sun's surface that observers with unaided eyes on Earth have been able to pick out large ones when the sun is very low on the horizon.

Logically, the more cool dark sunspots there are, the dimmer and cooler the sun should be, and so scientists (and everyone else) believed for a long time. But studies reported recently in the magazine Science indicate that the sun doesn't work that way.

The first real challenge to the common-sense assumption that many big sunspots would mean cool weather on Earth came in 1976. John Eddy of the National Center for Atmospheric Research reported then that the Maunder Minimum, the period between 1645 and 1715 when sunspots were nearly absent, correlated with a period of extremely cold weather in Europe - the coldest years in a generally chilly time known as the Little Ice Age.

This annoying result required more investigating. Fortunately for solar scientists, the sun normally runs a long-term natural experiment: the number of sunspots peaks, declines, and rises again, with the trough-to-trough cycle taking 11 years. Also fortunately, technology keeps providing scientists with better tools.

As they reported at the American Geophysical Union meeting in May this year, researchers Peter Foukal and Judith Lean put these opportunities together. Using two different satellite-borne radiometers, highly sensitive instruments that measure the radiant power per unit area arriving from the sun, they followed the sun's changes as the sunspot cycle headed toward its lowest point last fall.

What they found was a slowly (and temporarily) dimming sun. The numbers do not seem overwhelming: the solar irradiance, for example, declined by seven hundredths of a percent (0.07 percent) from 1981 to 1984. But given that a decrease of only a tenth of a percent sustained for a decade might change global climate perceptibly, that much dimming is not trivial.
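To get a feel for the scale of these percentages, one can convert them into absolute terms. The sketch below assumes a mean total solar irradiance of roughly 1366 watts per square meter (the approximate "solar constant"); that value is an assumption for illustration, not a figure from the article.

```python
# Rough scale of the reported dimming, in absolute terms.
# ASSUMPTION: mean total solar irradiance of ~1366 W/m^2 (approximate
# "solar constant"); this number is illustrative, not from the article.

MEAN_IRRADIANCE = 1366.0   # W/m^2, assumed mean value

decline_fraction = 0.0007  # 0.07 percent decline, 1981 to 1984
climate_threshold = 0.001  # 0.1 percent sustained for a decade

decline_watts = MEAN_IRRADIANCE * decline_fraction
print(f"Observed decline: {decline_watts:.2f} W/m^2")  # about 0.96 W/m^2
print(f"Fraction of the climate-relevant threshold: "
      f"{decline_fraction / climate_threshold:.0%}")   # 70%
```

The point of the arithmetic is simply that the observed three-year decline already amounts to most of the tenth-of-a-percent change thought capable of perceptibly shifting global climate.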

Foukal and Lean think they may even understand why the sun fades as sunspots become less numerous and prominent. Properly filtered photographs of the sun's surface reveal that the dark spots are accompanied by intensely bright blotches, called plage or faculae, and by finer features known as network radiation. The network radiation, which looks a bit like the road map of a badly planned city drawn in incandescent ink, is faint compared with the dark sunspots and brilliant plage and is dispersed away from their vicinity.

As the sunspots grow sparse, the bright features diminish as well. Their loss more than offsets the gain in irradiance a spotless sun would otherwise show, and the net result is a dip in brightness.

Thus, no matter how much it outrages common sense, a pox of sunspots apparently means that the sun is brighter than it would be without them. The radiometers will be watching carefully while the sunspots cycle through the next 11 years; by 1997, we should know for sure.