In classrooms and everyday conversation, explanations of global warming hinge on the greenhouse effect. In short, climate depends on the balance between two kinds of radiation: The Earth absorbs incoming visible light from the sun, called “shortwave radiation,” and emits infrared light, or “longwave radiation,” into space. Upsetting that energy balance are rising levels of greenhouse gases, such as carbon dioxide (CO2), which absorb more and more of the outgoing longwave radiation and trap it in the atmosphere. Energy accumulates in the climate system, and warming occurs.

But in a paper out this week in the Proceedings of the National Academy of Sciences, MIT researchers show that this canonical view of global warming is only half the story. In computer modeling of Earth’s climate under rising CO2 concentrations, the greenhouse effect does indeed lead to global warming. Yet something puzzling happens: While one would expect the longwave radiation that escapes into space to decline with increasing CO2, the amount actually begins to rise. At the same time, the atmosphere absorbs more and more incoming solar radiation; it’s this enhanced shortwave absorption that ultimately sustains global warming.

“The finding was a curiosity, conflicting with the basic understanding of global warming,” says lead author Aaron Donohoe, a former MIT postdoc who is now a research associate at the University of Washington’s Applied Physics Laboratory. “It made us think that there must be something really weird going on in the models in the years after CO2 was added. We wanted to resolve the paradox that climate models show warming via enhanced shortwave radiation, not decreased longwave radiation.”

Donohoe, along with MIT postdoc Kyle Armour and others at Washington, spent many a late night throwing out guesses as to why climate models generate this counterintuitive result, before realizing that it makes perfect sense — but for reasons no one had clarified and laid down in the literature.

They found the answer by drawing on both computer simulations and a simple energy-balance model. As longwave radiation gets trapped by CO2, the Earth starts to warm, affecting various parts of the climate system. Sea ice and snow cover melt, turning brilliant white reflectors of sunlight into darker spots. The atmosphere grows moister because warmer air can hold more water vapor, which absorbs more shortwave radiation. Both of these feedbacks lessen the amount of shortwave radiation that bounces back into space, and the planet warms rapidly at the surface. Meanwhile, like any physical body experiencing warming, Earth sheds longwave radiation more effectively, canceling out the longwave-trapping effects of CO2. However, a darker Earth now absorbs more sunlight, tipping the scales to net warming from shortwave radiation.

“So there are two types of radiation important to climate, and one of them gets affected by CO2, but it’s the other one that’s directly driving global warming — that’s the surprising thing,” says Armour, who is a postdoc in MIT’s Department of Earth, Atmospheric and Planetary Sciences.

Out in the real world, aerosols in air pollution reflect a lot of sunlight, and so Earth has not experienced as much warming from shortwave solar radiation as it otherwise might have. But the authors calculate that enough warming will have occurred by midcentury to switch the main driver of global warming to increased solar radiation absorption.
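The intuition can be captured with a toy zero-dimensional energy-balance sketch in Python (a minimal illustration, not the model used in the study; apart from the solar and Stefan-Boltzmann constants, every number below is an assumed, hypothetical value). Lowering an effective longwave emissivity stands in for adding CO2, and a small assumed decrease in planetary albedo with warming stands in for the melting-ice and water-vapor feedbacks; at the new, warmer equilibrium the planet both absorbs more sunlight and emits more longwave radiation than before.

```python
# Minimal zero-dimensional energy-balance sketch (illustrative only, not the
# authors' model); all parameter values below are assumptions chosen for clarity.

S0 = 1361.0       # solar constant, W/m^2
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4

def absorbed_shortwave(T, alpha0=0.30, dalpha_dT=-0.002, T_ref=288.0):
    """Planetary albedo is assumed to fall linearly as warming melts ice and snow
    and moistens the air, so absorbed sunlight rises with temperature."""
    alpha = alpha0 + dalpha_dT * (T - T_ref)
    return 0.25 * S0 * (1.0 - alpha)

def outgoing_longwave(T, eps):
    """An effective emissivity eps < 1 stands in for greenhouse trapping; adding
    CO2 lowers eps, while a warmer surface emits more (the Planck response)."""
    return eps * SIGMA * T**4

def equilibrium_temperature(eps, T=288.0, dt=0.3, steps=5000):
    """Relax a single heat reservoir until absorbed shortwave balances outgoing
    longwave radiation (heat capacity folded into the arbitrary time step)."""
    for _ in range(steps):
        T += dt * (absorbed_shortwave(T) - outgoing_longwave(T, eps))
    return T

T_control = equilibrium_temperature(eps=0.610)  # control climate
T_warmed = equilibrium_temperature(eps=0.600)   # "CO2 added": emissivity reduced

print(f"warming: {T_warmed - T_control:.2f} K")
print(f"outgoing longwave before/after: {outgoing_longwave(T_control, 0.610):.1f} / "
      f"{outgoing_longwave(T_warmed, 0.600):.1f} W/m^2")
print(f"absorbed shortwave before/after: {absorbed_shortwave(T_control):.1f} / "
      f"{absorbed_shortwave(T_warmed):.1f} W/m^2")
```

In this toy setting the two fluxes balance again once equilibrium is reached; the point is simply that both end up higher than in the control climate, so the longwave trapping from CO2 never shows up as a lasting deficit in the radiation leaving Earth.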
The paper is not challenging the physics of climate models; its value lies in helping the community interpret their output. “While this study does not change our understanding of the fundamentals of global warming, it is always useful to have simpler models that help us understand why our more comprehensive climate models sometimes behave in superficially counterintuitive ways,” says Isaac Held, a senior scientist at NOAA’s Geophysical Fluid Dynamics Laboratory who was not involved in this research.

One way the study can be useful is in guiding what researchers look for in satellite observations of Earth’s radiation budget as they track anthropogenic climate change in the decades to come. “I think the default assumption would be to see the outgoing longwave radiation decrease as greenhouse gases rise, but that’s probably not going to happen,” Donohoe says. “We would actually see the absorption of shortwave radiation increase. Will we actually ever see the longwave trapping effects of CO2 in future observations? I think the answer is probably no.”

The study sorts out another tricky climate-modeling issue — namely, the substantial disagreement among models over when shortwave radiation takes over the heavy lifting in global warming. The authors demonstrate that the source of the differences lies in how each model represents changes in cloud cover with global warming, another big factor in how well Earth can reflect shortwave solar energy.

Reference: “Shortwave and longwave radiative contributions to global warming under increasing CO2” by Aaron Donohoe, Kyle C. Armour, Angeline G. Pendergrass, and David S. Battisti, 10 November 2014, Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.1412190111

The work was supported by the National Oceanic and Atmospheric Administration, the James S. McDonnell Foundation, and the National Science Foundation.