Daniel Rey

University of California, San Diego

After earning his bachelor’s degree in engineering physics, Daniel Rey spent four years with a Chicago company, developing software that models how radio antennas perform when mounted on platforms such as aircraft.

It was great experience, says Rey, a Department of Energy Computational Science Graduate Fellowship (DOE CSGF) recipient. He delved into high-performance computing (HPC) and picked up programming skills, but he also knew he would return to school for a graduate degree.

It was only when he considered topics for his doctoral research at the University of California, San Diego, that Rey grasped the flipside of computer simulation: how to tie models to the real world, especially when determining a simulation’s initial conditions – its starting point. The task is difficult because there’s limited information about the beginning state and about factors influencing the model as it moves forward.

“How do you know where to start from if you can’t measure everything?” Rey asks. “If you have 10 variables or 100 variables or a million variables” affecting the model “and you can only measure half of them, you only know half of the information you need in order to actually start to run the model the way it needs to be run to make predictions.” The problem becomes even thornier when the modeled system is chaotic, like weather or brain activity.

Working with advisor Henry Abarbanel, Rey pursues new methods for data assimilation – connecting models with observational data about the dynamic systems they emulate, helping improve initial condition estimates and experimental predictions.

The estimation problem is ubiquitous in science, so Rey’s work could have wide applications. Over the years, researchers have devised multiple solutions, some general and others specific to the task at hand. Part of Rey’s research is to connect and distill common threads in these varied methods.

Estimating initial conditions is an inverse problem: researchers start with a set of observations and work backward to approximate the circumstances leading to them. “When it comes to inverse problems, it’s all about bringing additional information to the table,” including real-world evidence and information about how the model itself behaves, Rey says.
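
One standard way to make that idea concrete – a generic formulation from the data-assimilation literature, not necessarily the one Rey uses – is to score a candidate trajectory $x_0, \ldots, x_N$ by how well it fits both the measurements and the model dynamics:

$$A(x_0,\ldots,x_N) = \sum_{k \in \mathrm{obs}} \tfrac{1}{2}\big(y_k - h(x_k)\big)^{\top} R^{-1}\big(y_k - h(x_k)\big) + \sum_{k=0}^{N-1} \tfrac{1}{2}\big(x_{k+1} - f(x_k)\big)^{\top} Q^{-1}\big(x_{k+1} - f(x_k)\big)$$

Here $y_k$ are the measurements, $h$ maps the model state to the measured quantities, $f$ advances the model one step, and $R$ and $Q$ weight measurement error against model error. Minimizing $A$ over all the states selects trajectories consistent with both sources of information.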

Inverse problems typically have multiple solutions. Good dynamic models, combined with enough observations over time, provide constraints that help eliminate spurious trajectories the modeler doesn’t want to consider, Rey says. With sufficient observational data and model information, the calculations converge on an estimate of the true initial condition. Researchers can then run the model forward to make predictions and compare the results with real-world experiments to gauge their accuracy. These forecasts also help validate the model and can guide the design of better experiments.
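
That convergence can be illustrated with a toy sketch – not Rey’s code, just a minimal nudging scheme in the spirit of the synchronization picture in the image caption below – applied to the chaotic Lorenz-63 system when only one of its three variables is observed:

    import numpy as np

    def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        # Time derivative of the Lorenz-63 system.
        x, y, z = state
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    def step(state, dt, nudge=0.0, obs=None):
        # One forward-Euler step; if an observation is given, nudge x toward it.
        d = lorenz63(state)
        if obs is not None:
            d[0] += nudge * (obs - state[0])  # couple the observed variable into the model
        return state + dt * d

    dt, nsteps, gain = 0.001, 50000, 20.0   # step size, run length, coupling strength
    truth = np.array([1.0, 1.0, 1.0])       # the "true" system
    model = np.array([-5.0, 7.0, 30.0])     # model copy started from the wrong place

    for _ in range(nsteps):
        obs_x = truth[0]                    # observe only the x component
        truth = step(truth, dt)
        model = step(model, dt, nudge=gain, obs=obs_x)

    print("truth:         ", truth)
    print("model estimate:", model)
    print("error:         ", np.linalg.norm(truth - model))

Run long enough, the model copy’s unobserved variables lock onto the truth along with the observed one, which is the sense in which enough measurements, fed through a good model, pin down the full state.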

But “the thing about inverse problems that I’ve come to appreciate is that every one is different,” Rey says. There are general principles and mathematical frameworks, but “you can’t necessarily assume that what you did with one problem is going to work for another.”

Rey and Abarbanel apply these new methods to two projects. First, they want to understand brain cell activity at a biophysical level. Their model organism is the zebra finch, known for singing a single song the same way every time, which provides a baseline for tracking neuron function. University of Chicago biologists stimulate slices taken from the birds’ brains and record the resulting electrical activity. Rey and Abarbanel analyze the data, seeking to validate their models and narrow the search for specific brain-cell processes.

The team also works with climate models, which are generally well-supplied with satellite and sensor data and handle the estimation problem well. What’s not as well understood is the threshold at which there’s no longer enough information to accurately estimate the system's state. Rey wants to help understand when it’s better to improve the model or gather more observations, because without adequate data, even a perfect model will fail to make accurate predictions.

Rey says his role on these projects has been akin to that of a consultant. He focuses more on the methods than on their uses, “trying to be as agnostic to the application as possible.”

Rey’s summer 2014 Lawrence Livermore National Laboratory practicum addressed a similar issue in turbulence. Working with Gregory Burton of the Turbulence Analysis and Simulation Center, Rey analyzed results from the largest-ever direct numerical simulation of a turbulent coaxial jet with a high rate of mixing between weakly diffusive gases. The simulation used a method Burton developed to model small-scale turbulence features. Rey developed computational tools to analyze the data and visualize the flow structures.

After Rey graduates this spring, he and his family (including 1-year-old Fiona) will return to the Chicago area, where he’ll take a postdoctoral research position with Northwestern University Physics and Astronomy Prof. Adilson Motter.

Image caption: The estimation process behaves like the synchronization of coupled oscillators. At each time step, information flows from the data to the model, nudging the estimate toward the true trajectory. With enough measurements, the estimate eventually synchronizes with the truth. Image courtesy of Daniel Rey.