### Carson Kent

Stanford University

Carson Kent’s tenure as a Department of Energy (DOE) researcher predates his Ph.D. work at Stanford University as a DOE Computational Science Graduate Fellowship (DOE CSGF) recipient. The department has supported his work since his Albuquerque High School days, thanks to the New Mexico Supercomputing Challenge, a competition to boost interest in high-performance computing (HPC) among pre-college students.

After meeting with potential Sandia National Laboratories mentors as part of the program, Kent created a hydrocode – a program that simulates fluids flowing fast enough to create shock waves – to evaluate whether high-powered water flows could be shaped to break up an improvised explosive device.

“Of course the one involving explosives is the one any high school student chooses,” recalls Kent, while noting that they didn’t actually blow up anything.

At Stanford, Kent works in optimization, “the mathematics of efficiency. What we’re really interested in are algorithms for computing solutions to problems. That means you define some measure of cost and find a method to reduce that cost as much as possible.”

Kent focuses on optimal transport – “the mathematical study of the most efficient way to transport the mass of one probability distribution into the mass of another probability distribution.” Or, as Stanford advisor Jose Blanchet explains, the cheapest way to “move mass from one place to another” – rearranging sand, say, to cover a sinkhole. Optimal transport “is a problem that has been around for 250 years,” Blanchet adds.
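Between two discrete distributions, the optimal transport problem Kent describes can itself be posed as a linear program: choose how much mass to move from each source point to each target point so that the total moving cost is as small as possible. A minimal illustrative sketch using SciPy (the distributions, positions, and squared-distance cost here are invented for illustration):

```python
import numpy as np
from scipy.optimize import linprog

# Two small discrete probability distributions ("piles of sand").
mu = np.array([0.5, 0.5])        # source masses at positions 0 and 1
nu = np.array([0.25, 0.75])      # target masses at positions 0 and 1
positions = np.array([0.0, 1.0])

# Cost of moving unit mass from source i to target j: squared distance.
C = (positions[:, None] - positions[None, :]) ** 2

# Decision variables: transport plan P[i, j] >= 0, flattened row-major.
# Constraints: row i of P sums to mu[i]; column j sums to nu[j].
m, n = C.shape
A_eq, b_eq = [], []
for i in range(m):                     # row-sum (source mass) constraints
    row = np.zeros(m * n)
    row[i * n:(i + 1) * n] = 1.0
    A_eq.append(row)
    b_eq.append(mu[i])
for j in range(n):                     # column-sum (target mass) constraints
    col = np.zeros(m * n)
    col[j::n] = 1.0
    A_eq.append(col)
    b_eq.append(nu[j])

res = linprog(C.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
              bounds=[(0, None)] * (m * n))
plan = res.x.reshape(m, n)   # optimal transport plan
print(plan)
print(res.fun)               # minimal total moving cost
```

Here the cheapest plan leaves as much mass in place as possible and ships only the surplus, which is exactly the "cheapest way to move mass" intuition Blanchet describes.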

Kent’s an expert, Blanchet says, at “developing and analyzing algorithms for a wide range of problems – complex, large-scale, high-dimensional computational problems.” The math can apply to a variety of other puzzles, such as matching a donated kidney with a patient-recipient and linking commercial products with customers.

Kent entered the fellowship after starting at Stanford in 2015. His DOE CSGF practicum with Argonne National Laboratory’s Sven Leyffer exposed Kent to the field that would become the setting for his thesis work in optimization. He also collaborated with Leyffer on a related subject, “robust optimization,” which Kent describes as “taking optimization and then adding uncertainty.” During his practicum “we worked on faster, better methods for solving those types of problems under difficult constraints and conditions that the Department of Energy cares about.”
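One common way to make "optimization plus uncertainty" concrete is to minimize the worst-case cost over a set of possible cost scenarios, which can be rewritten as an ordinary linear program with an extra variable. This is a generic textbook formulation, not a reconstruction of Kent and Leyffer's work; the scenarios and constraint below are invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

# Uncertain per-unit costs: three possible scenarios.
scenarios = np.array([[2.0, 3.0],
                      [3.0, 2.0],
                      [2.5, 2.5]])

# Variables: [x1, x2, t]. Minimize t, an upper bound on the cost
# under every scenario (the "worst case").
# Epigraph constraints: scenarios[s] @ x - t <= 0 for each scenario s.
A_ub = [list(c) + [-1.0] for c in scenarios]
b_ub = [0.0, 0.0, 0.0]

# Simple feasibility requirement: x1 + x2 >= 1 (the work must get done).
A_ub.append([-1.0, -1.0, 0.0])
b_ub.append(-1.0)

res = linprog([0, 0, 1], A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None), (None, None)])
x, t = res.x[:2], res.x[2]
print(x, t)   # decision that is cheapest in the worst case
```

Because the scenarios penalize lopsided choices, the robust solution hedges by splitting effort evenly, accepting a slightly higher typical cost in exchange for protection against the worst case.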

A poster presented as part of his fellowship described another related, widely used problem-solving technique: linear programming, a “simple way of expressing a bunch of costs associated with a bunch of decisions.” For instance, a linear program can model how Amazon would route packages to customers.
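That "costs and decisions" description maps directly onto code: list a cost for each decision variable, add the constraints the decisions must satisfy, and let a solver pick the cheapest feasible choice. A toy routing example in SciPy (the hubs, capacities, and costs are made up for illustration):

```python
from scipy.optimize import linprog

# Decide how many packages (x1, x2) to route through two shipping hubs.
# Per-package costs: $2 via hub 1, $3 via hub 2.
cost = [2, 3]

# Constraints, written as A_ub @ x <= b_ub:
#   hub 1 capacity:        x1 <= 70
#   hub 2 capacity:        x2 <= 50
#   ship all 100 packages: x1 + x2 >= 100, i.e. -x1 - x2 <= -100
A_ub = [[1, 0], [0, 1], [-1, -1]]
b_ub = [70, 50, -100]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)])
print(res.x, res.fun)   # cheapest routing and its total cost
```

The solver fills the cheap hub to capacity and routes the remainder through the expensive one, the kind of answer a human dispatcher would reach by inspection but that scales to millions of variables.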

After high school, in 2010, Kent’s supercomputing challenge experience led to work with other Sandia teams on methods to detect malware in certain Windows files and on using geographic information system web applications to simulate the Western U.S. power grid for ways to detect and prevent failures.

Collaborations with Sandia on other real-world projects continued while he attended the Colorado School of Mines. Each summer he interned at the lab, and for his last three undergraduate years he also telecommuted with it during the winter.

The fruits of those interactions included leading development of a tool called Cyber Shopper that sought to model an adversary’s actions when attempting to access a computer system. This led to a U.S. patent on a method and apparatus for managing such an attack.

In college, Kent also picked up a research interest in uncertainty quantification, which aims to calculate how much trust researchers can put in their computational models of real-world conditions. It began in the summer of his junior year, when Paul Constantine, a Mines professor, invited him to take a short course on the subject at Stanford.

When they returned to Colorado, Kent began a research project with Constantine and a University of Texas at Austin professor that led to a 2016 paper. It described efforts to reduce high-dimensional parameter spaces that pop up when using the Markov chain Monte Carlo (MCMC) method, a computational workhorse to solve Bayesian inverse problems – a mathematically rigorous way, Kent says, “to combine prior knowledge with simulated and observed data.” MCMC is a class of algorithms for sampling among probability distributions that can represent physical quantities of interest in a simulation problem.
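The workhorse the paragraph names can be sketched in a few lines: a random-walk Metropolis sampler, the simplest member of the MCMC family, drawing from a posterior that combines a prior with observed data. This toy one-parameter problem is invented for illustration and is far simpler than the high-dimensional settings the paper addresses:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(theta):
    # Toy Bayesian inverse problem: prior N(0, 1) on theta, combined
    # with one observation y = 2.0 assumed to follow N(theta, 0.5^2).
    log_prior = -0.5 * theta ** 2
    log_likelihood = -0.5 * ((2.0 - theta) / 0.5) ** 2
    return log_prior + log_likelihood

# Random-walk Metropolis: propose a nearby point, accept it with
# probability min(1, posterior ratio); otherwise stay put.
samples, theta = [], 0.0
for _ in range(20000):
    proposal = theta + 0.5 * rng.standard_normal()
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    samples.append(theta)

posterior_mean = np.mean(samples[5000:])   # discard burn-in samples
print(posterior_mean)
```

For this Gaussian toy case the exact posterior mean is 1.6, so the sample average gives a quick sanity check; the difficulty Kent's 2016 paper targets is that each parameter multiplies the dimension of the space the chain must explore.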

Kent calls the fellowship “amazing. I’ve definitely spent a while around the Department of Energy and there are few other programs in it that I value as highly as the DOE CSGF in terms of its ability to build the workforce that the department constantly needs.”

**Image caption:** Smoothed versions of the paths taken by various first-order optimization algorithms – mathematical techniques for finding a problem's most efficient solution – as they test candidate solutions. These methods minimize the loss, or objective, function (the quantity the algorithm seeks to minimize or maximize) pictured here. The gold and black trajectories arise from accelerated first-order methods – ones that gain efficiency by considering past solutions the algorithm has tested – and converge to the minimum much faster than the standard gradient descent algorithm (red), even though the paths they take to the solution appear to vary much more. *Credit: Carson Kent.*
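The contrast the caption describes can be reproduced numerically: on an ill-conditioned quadratic, plain gradient descent crawls along the shallow direction, while a method that reuses past steps converges far faster. This sketch uses Polyak's heavy-ball momentum as one simple example of an accelerated first-order method; the objective, step size, and momentum coefficient are chosen for illustration and are not taken from the pictured figure:

```python
import numpy as np

# Ill-conditioned quadratic objective f(x) = 0.5 * x^T A x,
# minimized at the origin.
A = np.diag([1.0, 30.0])

def grad(x):
    return A @ x

x0 = np.array([1.0, 1.0])

def gradient_descent(steps, lr=0.03):
    x = x0.copy()
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def heavy_ball(steps, lr=0.03, beta=0.7):
    # Polyak momentum: each step reuses a fraction of the previous
    # step direction - "considering past solutions," per the caption.
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        v = beta * v - lr * grad(x)
        x = x + v
    return x

for n in (25, 50, 100):
    print(n, np.linalg.norm(gradient_descent(n)), np.linalg.norm(heavy_ball(n)))
```

After 100 steps the momentum iterate is orders of magnitude closer to the minimum than plain gradient descent, even though its early path overshoots and oscillates more, matching the wandering-but-faster trajectories in the image.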