After finishing his first year in physics at the University of Maryland, Noah Mandell got a plum assignment: spending a summer observing and assisting in research at CERN, the European laboratory that’s home to the world’s biggest particle collider.
But Mandell, now a Princeton University doctoral candidate in plasma physics, was put off by his project’s limited scope, in part due to the size of the collaborations that run experiments.
“It was an OK experience,” says the Department of Energy Computational Science Graduate Fellowship (DOE CSGF) recipient. “I wanted to be more involved in what I was doing and try to have a bigger overall impact.”
In the next school year, Mandell met William Dorland, a physics professor who studies nuclear fusion – the melding of atomic nuclei to produce nearly limitless energy. Scientists have tried to harness fusion, the process fueling the sun and other stars, for decades but have yet to overcome the many technical challenges.
Mandell so impressed Dorland that the professor took him along on month-long annual trips to England’s Oxford University to work with collaborators there. The immersion in research, including the accompanying discussions and lab talks, reaffirmed his desire to pursue graduate work in fusion, Mandell says.
Mandell focuses on turbulence in the incredibly hot plasma scientists must create and contain to trigger fusion. As an undergraduate, he helped devise GryfX, a computer code that simulates turbulence in the cores of plasmas contained in tokamaks, donut-shaped fusion reactors. GryfX runs on graphics processing units, computer chips used to accelerate calculations while minimizing energy consumption.
Mandell continued working with Dorland during his first couple of years at Princeton, where he studies with Greg Hammett. He now concentrates on modeling turbulence in the plasma’s edge.
Core turbulence is unwanted, dispersing heat needed for fusion. By reducing core turbulence, “you can make your reaction stay hotter for longer,” Mandell says, hopefully becoming self-sustaining. But edge turbulence is more desirable, diffusing heat that could damage the reactor walls. It’s also more difficult to model, with larger fluctuations, more significant electromagnetic effects and plasma interactions with material in the reactor walls.
Researchers at Princeton, home to a DOE plasma physics laboratory, have developed XGC, an edge turbulence model that uses a particle-in-cell (PIC) approach, tracking individual plasma components as they race through the reactor. Gkeyll, the code Mandell, Hammett and researcher Ammar Hakim are developing, is a continuum code that models a distribution of particles throughout the space. They believe Gkeyll will be more efficient than XGC, which can take days to run a simulation on even the most powerful computers. “We think it would be good to have another code” that can reproduce XGC’s results with a different method, Mandell says, boosting confidence in the outcomes.
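The distinction between the two approaches can be illustrated with a toy problem. The sketch below is not drawn from XGC or Gkeyll; it is a hypothetical one-dimensional example in which the same drifting slab of plasma is represented both ways: as a cloud of tracked particles (PIC-style) and as a density evolved on a fixed grid (continuum-style).

```python
import numpy as np

# Hypothetical 1-D illustration: the same free-streaming problem solved
# two ways. PIC codes like XGC push sampled particles; continuum codes
# like Gkeyll evolve a gridded distribution. All parameters are made up.

L, v, dt, steps = 1.0, 0.5, 0.001, 200   # box length, drift speed, time step

# --- PIC-style: push individual particles ---
x = np.linspace(0.1, 0.3, 1000)          # sampled particle positions
for _ in range(steps):
    x = (x + v * dt) % L                 # advance and wrap (periodic box)

# --- Continuum-style: advect a density on a fixed grid ---
n_cells = 200
dx = L / n_cells
f = np.zeros(n_cells)
f[20:60] = 1.0                           # same initial slab of density
for _ in range(steps):
    f = f - v * dt / dx * (f - np.roll(f, 1))   # first-order upwind step

# Both representations show the slab drifting right by v * dt * steps = 0.1
particle_centroid = x.mean()
grid_centroid = (np.arange(n_cells) * dx * f).sum() / f.sum()
print(particle_centroid, grid_centroid)
```

In the particle picture, accuracy depends on how many particles are sampled; in the continuum picture, it depends on grid resolution, which is one reason the two methods make a useful cross-check on each other.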
Using computers at DOE’s National Energy Research Scientific Computing Center and at the Texas Advanced Computing Center, Mandell is extending Gkeyll to include the effects of electromagnetic (EM) fluctuations. PIC codes have struggled with these perturbations, but continuum codes have handled them stably in the core. Mandell’s early results indicate his approach should also successfully calculate EM fluctuations in the plasma edge.
Mandell added machine-learning skills to his algorithmic repertoire while on his 2017 practicum at Lawrence Livermore National Laboratory. The goal was to help researchers tune parameters for hydrodynamic simulations, setting the right conditions for a moving computational mesh that divides the modeled domain for faster calculations.
“You end up with the users having to fiddle with all these parameters every time something goes wrong, and it becomes really challenging to run the simulations to completion,” Mandell says. The Livermore team thought “maybe we can teach the simulation how to handle all these mesh issues by itself with machine-learning techniques.”
The plan Mandell and advisor Ming Jiang arrived at uses visualizations – snapshots of the simulation data – to train a recurrent convolutional neural network, an algorithm that attempts to emulate the brain, to identify and predict mesh failures. To test the algorithm, they tried to reproduce, based only on earlier snapshots, the parameters an expert researcher had set to keep the simulation from crashing.
“We were able to predict where the mesh was relaxed” to avoid failure “pretty well,” Mandell says. “We weren’t able to predict very well how much the experts relaxed the mesh.” More importantly, it was useful to learn “how to think about how you could combine machine learning in these large-scale simulations.”
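The idea behind that architecture can be sketched in miniature. The toy below is not the Livermore model; it is a made-up stand-in showing the two ingredients the article describes: convolutional filters that extract spatial features from each snapshot, and a recurrent hidden state that carries memory across snapshots to produce a prediction, here a single stand-in "mesh relaxation" score.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sequence of "simulation snapshots": 8 frames of a 16x16 field.
# A real pipeline would use rendered visualizations of simulation data.
frames = rng.normal(size=(8, 16, 16))

# One 3x3 filter standing in for the network's convolutional layers.
kernel = rng.normal(size=(3, 3))

def conv2d_valid(img, k):
    """Naive valid-mode 2-D cross-correlation of img with a 3x3 kernel."""
    h, w = img.shape[0] - 2, img.shape[1] - 2
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(img[i:i+3, j:j+3] * k)
    return out

# The recurrent part: a hidden state mixes each new frame's features
# with memory of earlier frames, so the prediction depends on history.
hidden = np.zeros((14, 14))
W_h, W_x = 0.5, 0.5                      # untrained toy weights
for frame in frames:
    features = np.tanh(conv2d_valid(frame, kernel))
    hidden = np.tanh(W_h * hidden + W_x * features)

# Final reduction to one number: a stand-in for "how much to relax the
# mesh". In practice these weights would be learned from expert-labeled runs.
relax_score = float(hidden.mean())
print(relax_score)
```

The weights here are random rather than trained, so the score is meaningless; the point is only the structure of convolution plus recurrence over a time series of images.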
Mandell is uncertain where he’ll apply his abilities after graduation, expected in 2020, but he’s sure he wants to continue working on fusion.
Image caption: A snapshot of turbulent electron density fluctuations in a Gkeyll simulation that models plasma dynamics in the edge of the National Spherical Torus Experiment at the Princeton Plasma Physics Laboratory. Credit: Noah Mandell.