Alexander Turner

Harvard University

Atmospheric scientist Alexander Turner first skied at age 3. As he grew and developed his skills, he traded resort slopes for backcountry trails. In warmer seasons, he backpacked and rock-climbed.

Turner stayed active outside as an engineering student at the University of Colorado at Boulder, in his hometown. There, climate change cast its shadow on the Rocky Mountain foothills. Pine beetles, thriving in the warming temperatures, were killing trees.

Since his sophomore year, Turner has worked with atmospheric and climate scientists to bring computational power to their research. Today, as a Department of Energy Computational Science Graduate Fellowship (DOE CSGF) recipient at Harvard University, he has followed his own trail in inverse modeling, which works backward from current observations to the conditions that produced them. Using statistical methods, the atmospheric models identify the emission sources most likely to have led to the observations. “It’s like measuring the depth of a river and trying to figure out where it rained,” Turner says.
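The statistical step at the heart of such an inversion can be sketched in a few lines of Python. The footprints, prior emissions and error values below are invented for illustration; this is a generic linear Bayesian update, not code or data from Turner's models:

```python
# Toy linear Bayesian inversion: infer emissions x from observations y = K x + noise.
# All values here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

n_sources, n_obs = 4, 10                       # hypothetical emission regions and measurements
K = rng.uniform(0.0, 1.0, (n_obs, n_sources))  # "footprints": how each source affects each measurement
x_true = np.array([5.0, 1.0, 3.0, 0.5])        # emissions we pretend generated the data
y = K @ x_true + rng.normal(0.0, 0.1, n_obs)   # noisy observed concentrations

x_prior = np.ones(n_sources)                   # bottom-up first guess at the emissions
S_prior = 4.0 * np.eye(n_sources)              # prior error covariance
S_obs = 0.01 * np.eye(n_obs)                   # observation error covariance

# Analytical posterior: the emissions most likely to have produced the observations,
# given the prior guess and the two error estimates.
G = S_prior @ K.T @ np.linalg.inv(K @ S_prior @ K.T + S_obs)
x_hat = x_prior + G @ (y - K @ x_prior)
print("posterior emissions estimate:", np.round(x_hat, 2))
```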

With atmospheric chemist Daniel Jacob, Turner has focused mainly on methane, the most abundant greenhouse gas after water vapor and carbon dioxide. Jacob wants to understand the main sources of atmospheric methane.

Turner's code superimposes patchy data onto a complete grid of prior data derived bottom-up from emissions sources and best-available assumptions. His method exploits the fact that similar regions host similar underlying processes. A smattering of observational data points in a large wetland, for example, can carry great weight and represent a large portion of the wetland.
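One way to picture that weighting is to give the prior an error covariance that decays with distance between grid cells, so a single measurement also nudges nearby, unsampled cells. The one-dimensional toy below is an assumed illustration of the idea, not the covariance model the group actually used:

```python
# A single observation over one grid cell updates its neighbors when prior errors
# are spatially correlated. Grid, length scale and values are invented.
import numpy as np

n = 20                                       # a 1-D strip of grid cells (say, across a wetland)
coords = np.arange(n, dtype=float)
L = 3.0                                      # assumed correlation length, in cells
S_prior = 4.0 * np.exp(-np.abs(coords[:, None] - coords[None, :]) / L)  # exponential covariance

K = np.zeros((1, n))                         # one instrument that sees only cell 10
K[0, 10] = 1.0
S_obs = np.array([[0.01]])                   # its error variance

x_prior = np.ones(n)                         # flat bottom-up prior
y = np.array([3.0])                          # the single measurement

G = S_prior @ K.T @ np.linalg.inv(K @ S_prior @ K.T + S_obs)
x_hat = x_prior + G @ (y - K @ x_prior)
# Cells near index 10 move toward the observation even though they were never sampled.
print(np.round(x_hat, 2))
```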

Turner and Jacob worked with collaborators to develop a high-resolution estimate of global methane emissions. For a 31-month period from mid-2009 through 2011, they found that sources worldwide had released methane at a rate of 593 million tons per year.

Image credit: Turner/DEIXIS

The team's results add to growing evidence that enormous amounts of methane escape notice in the Environmental Protection Agency (EPA) estimate of continental U.S. emissions. That bottom-up inventory pegs human-caused releases at 27 million tons per year. Turner's model indicates emissions were at least 60 percent higher, between 44 million and 47 million tons per year.

Turner's model also confirmed that U.S. methane emissions mainly come from south-central states active in raising livestock and extracting and processing fossil fuels. The group concluded that livestock accounts for most of the methane emissions attributable to human-directed activity, followed by oil and gas, landfills and wastewater, and finally coal.

Turner and Jacob also worked with six other researchers to estimate U.S. methane emissions from 2002 through 2014. They combined data from surface, aircraft and satellite instruments and subtracted background methane levels (measured over oceans and upwind) from totals measured over land to find overall emission levels. They found a 30 percent increase over the 12-year period. The result contradicts EPA’s bottom-up inventory, which holds that U.S. emissions were essentially flat.
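In spirit, that calculation is a subtraction followed by a trend comparison. The numbers below are fabricated stand-ins, chosen only so the toy example lands near a 30 percent change; they are not the study's measurements:

```python
# Toy background subtraction: the enhancement over land is what emissions add on top
# of the background, and its trend tracks the trend in emissions. Numbers are invented.
import numpy as np

years = np.arange(2002, 2015)                    # 2002 through 2014
background = 1780.0 + 5.0 * (years - 2002)       # hypothetical background methane (ppb)
over_land = background + 60.0 * (1 + 0.025 * (years - 2002))  # hypothetical values over land

enhancement = over_land - background             # the part attributable to land emissions
change = (enhancement[-1] - enhancement[0]) / enhancement[0]
print(f"relative change in land enhancement, 2002-2014: {change:.0%}")  # 30% in this constructed case
```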

Through practicums at Lawrence Berkeley National Laboratory, Turner also worked with Ronald Cohen, a lab affiliate and chemistry and earth and planetary sciences professor at the University of California, Berkeley, on research into carbon dioxide (CO2). Cohen leads a team that’s building a CO2 sensor network in the San Francisco Bay Area, measuring emissions at a resolution high enough to attribute them to specific neighborhoods and to sources like factories and highways.

But how many sensors will such a network need to be reliable, yet affordable? And how precise, and therefore costly, do the sensors have to be?

Turner worked with Cohen to develop an inverse model that balances the trade-off between the number of sensors and their precision.

Turner's code coupled a weather forecasting model with one designed to simulate the dispersion of gases and airborne particles. He wasn't the first to wed the two, but his version yields data at a one-square-kilometer scale. “We are using it at a much higher resolution than a lot of groups typically do because we’re considering urban sources,” Turner says. “There’s a lot of variability on small scales.”
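As a deliberately crude stand-in for that coupling, imagine taking a wind speed from a weather model and spreading a point source's emissions across one-square-kilometer cells with a simple Gaussian plume. The wind, source strength and spreading rate below are invented; the real code couples full weather and dispersion models:

```python
# Toy dispersion on a 1 km grid: a steady wind carries a point source downwind while
# the plume spreads crosswind. All numbers are invented for illustration.
import numpy as np

nx, ny, dx = 50, 50, 1000.0          # 50 km x 50 km domain at 1 km resolution (meters)
u = 4.0                              # assumed along-wind speed (m/s), as if taken from a weather model
Q = 10.0                             # assumed source strength at the origin (g/s)
spread = 0.1                         # crude growth of plume width with downwind distance

x = np.arange(1, nx + 1) * dx        # downwind distance of each column (m)
y = (np.arange(ny) - ny // 2) * dx   # crosswind distance of each row (m)
X, Y = np.meshgrid(x, y)

sigma_y = spread * X                 # plume widens as it travels downwind
conc = Q / (np.sqrt(2.0 * np.pi) * sigma_y * u) * np.exp(-0.5 * (Y / sigma_y) ** 2)
print("peak relative concentration:", conc.max())
```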

Turner wrote the coupled code from scratch in three months. Then, using Cori, a Cray XC40 supercomputer at the National Energy Research Scientific Computing Center, and university computing clusters, he simulated the performance of various hypothetical sensor networks. A web comprising many moderate-precision sensors worked best.
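A simplified version of that comparison might score each hypothetical network by how much a linear inversion would shrink the prior uncertainty, then sweep over sensor counts and noise levels. The footprints, grid size and noise values below are invented and are not Turner's simulation setup:

```python
# Compare hypothetical sensor networks by the uncertainty left after a linear inversion.
# Footprints, grid and noise levels are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_sources = 25                          # emission grid cells to constrain
S_prior = np.eye(n_sources)             # prior error covariance (toy: uncorrelated, unit variance)

def posterior_uncertainty(n_sensors, noise_std):
    # Random toy "footprints" linking each sensor to each emission cell
    K = rng.uniform(0.0, 1.0, (n_sensors, n_sources))
    S_obs = np.eye(n_sensors) * noise_std**2
    # Posterior error covariance of a linear Gaussian inversion
    S_post = np.linalg.inv(K.T @ np.linalg.inv(S_obs) @ K + np.linalg.inv(S_prior))
    return np.sqrt(np.trace(S_post) / n_sources)   # root of the mean posterior variance

# A few precise sensors vs. many moderate-precision sensors
for n_sensors, noise_std in [(5, 0.1), (20, 0.5), (50, 1.0)]:
    print(f"{n_sensors:2d} sensors, noise {noise_std}: "
          f"remaining uncertainty {posterior_uncertainty(n_sensors, noise_std):.3f}")
```

Scoring every candidate network with the same uncertainty metric is what makes a handful of precise sensors and a larger web of cheaper ones directly comparable.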

Read the entire article in DEIXIS, the DOE CSGF annual. [PDF, pages 16-18]

Image caption: A bottom-up estimate of average annual North American methane emissions for 2009-2011 (prior, top) and average annual emissions for the same period as estimated from satellite data (posterior, bottom). Total average annual emissions are 63.3 teragrams in the bottom-up estimate. The researchers’ estimate is more than 20 teragrams per year higher, at 91.3 teragrams annually. Colors indicate where methane emissions are highest, as measured in nanomoles (billionths of a mole) per square meter per second. Red indicates the highest emissions at 100 or more nanomoles per square meter per second, while blue and white are the lowest. Credit: From Turner et al., Estimating global and North American methane emissions with high spatial resolution using GOSAT satellite data, Atmos. Chem. Phys., 15, 7049-7069, 2015.