Thomas Anderson

California Institute of Technology

Thomas Anderson isn’t a physicist, but his applied mathematics skills let him address problems similar to those faced by one.

With mathematical tools, researchers needn’t run experiments or derive new physics models, says Anderson, a doctoral candidate at the California Institute of Technology. It helps to understand the behavior of a simulated phenomenon, but “in general, you just need to know the mathematical model that resulted from that physical scenario.”

Anderson, a Department of Energy Computational Science Graduate Fellowship (DOE CSGF) recipient, is drawn to applied math’s rigor and the avenue it provides to decipher complex conditions. In some cases, he seeks efficiencies in existing high-accuracy methods or in methods with other desirable properties. In others, he tailors methods to a problem. His research with advisor Oscar Bruno could help speed simulations for aerospace, additive manufacturing and other applications as computers grow larger and faster.

In one project, Anderson tackled calculations of moving boundaries, complex problems found in fluid mechanics and other fields. To quickly model a physical domain, mathematicians divide it with a mesh and calculate the physical processes in each region. Boundaries that move across the mesh as the simulation progresses turn the calculations nonlinear and can create problematically tiny computational cells. “My research is to solve the free-boundary problem but with a better approach to avoid worrying about these small-cell problems” while maintaining or improving the algorithm’s precision and efficiency.
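A rough Python sketch of the constraint Anderson describes (a generic illustration of the small-cell stability problem, not his free-boundary method): with an explicit scheme, the stable time step is tied to the smallest cell in the mesh, so a single tiny cut cell forces the entire simulation to take far smaller steps.

```python
# Illustrative sketch (not Anderson's method): why one tiny "cut" cell
# forces a tiny global time step in an explicit scheme.
# 1D advection u_t + c u_x = 0 with first-order upwind differences on a
# nonuniform grid; stability requires dt <= min_i(h_i) / c (CFL condition).
import numpy as np

c = 1.0                                    # advection speed
h = np.full(200, 1e-2)                     # regular cells of width 0.01
h[100] = 1e-6                              # one tiny cut cell
x = np.concatenate(([0.0], np.cumsum(h)))  # cell edges
xc = 0.5 * (x[:-1] + x[1:])                # cell centers

def run(dt, steps):
    """Advance upwind advection of a Gaussian pulse; return max |u|."""
    u = np.exp(-200.0 * (xc - 0.3) ** 2)
    for _ in range(steps):
        upwind = np.empty_like(u)
        upwind[0] = 0.0                   # zero inflow at the left boundary
        upwind[1:] = u[:-1]               # value entering each cell
        u = u - dt * c * (u - upwind) / h
    return np.abs(u).max()

dt_avg = 0.9 * h.mean() / c               # CFL step based on the average cell
dt_min = 0.9 * h.min() / c                # CFL step based on the smallest cell
print("dt from average cell :", run(dt_avg, steps=15))     # explodes
print("dt from smallest cell:", run(dt_min, steps=2000))   # stays bounded near 1
```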

Anderson uses Fourier continuation, a method Bruno developed with Mark Lyon of the University of New Hampshire. Anderson extended the method to handle the irregular geometries of moving boundaries with high accuracy and efficiency.
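The gist of the approach, in a deliberately simplified Python sketch (the published FC(Gram) algorithm builds its extension differently and to higher order): append a smooth continuation to non-periodic data so it becomes periodic on a larger interval, then apply fast Fourier transforms without the Gibbs oscillations that would otherwise appear at the boundaries.

```python
# Minimal sketch of the Fourier-continuation idea (illustration only; the
# actual FC(Gram) method constructs the extension differently). A non-periodic
# function on [0, 1] is smoothly extended to a periodic one on [0, 2), so
# FFT-based differentiation avoids Gibbs oscillations at the endpoints.
import numpy as np

f  = np.exp          # non-periodic test function on [0, 1]
df = np.exp          # its exact derivative

M = 512                                   # samples over the extended period [0, 2)
x = 2.0 * np.arange(M) / M
left = x <= 1.0                           # original interval
u = np.empty(M)
u[left] = f(x[left])

# Cubic-Hermite "continuation" on (1, 2): matches value and slope at x = 1
# and returns to the value and slope at x = 0, making the extension a C^1
# periodic function with period 2.
s = x[~left] - 1.0
h00, h10 = 2*s**3 - 3*s**2 + 1, s**3 - 2*s**2 + s
h01, h11 = -2*s**3 + 3*s**2, s**3 - s**2
u[~left] = h00*f(1.0) + h10*df(1.0) + h01*f(0.0) + h11*df(0.0)

# Differentiate the periodic extension with the FFT.
k = 2.0 * np.pi * np.fft.fftfreq(M, d=2.0 / M)   # wavenumbers for period 2
du = np.fft.ifft(1j * k * np.fft.fft(u)).real

err = np.abs(du[left] - df(x[left])).max()
print("max derivative error on [0, 1]:", err)    # small; no Gibbs ringing
```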

In another project, Anderson developed a new approach to calculating how waves, such as electromagnetic energy or sound, scatter when they encounter a solid obstacle. It’s a well-understood problem, he says, but his research tackled the more complicated issue of waves that vary in time.


Standard methods capture this behavior with time-stepping, which incrementally solves the relevant equations as the simulation moves forward. Under that technique, “if you want to solve to very large times, you need to do a lot of work,” Anderson says. His approach uses Fourier decoupling, which transforms the equations out of the time domain, solves many independent problems in frequency space, and then synthesizes the results back into the time domain. The amount of work needed to recover the solution at any given time is essentially fixed, and the method is more efficient and accurate than the usual approaches.
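A toy Python illustration of the decoupling idea (a forced oscillator standing in for the wave equation, not Anderson’s boundary-integral method): a fast Fourier transform in time turns the equation into independent algebraic problems, one per frequency, which are solved separately and then synthesized back into a full time history in one shot.

```python
# Toy illustration of Fourier decoupling in time (not Anderson's method).
# A damped, forced oscillator  u'' + 2*g*u' + w0^2*u = f(t)  stands in for
# the wave equation: an FFT in time yields one independent algebraic equation
# per frequency (the analogue of a frequency-domain scattering solve), each is
# inverted, and one inverse FFT reconstructs the whole time history.
import numpy as np

w0, g = 3.0, 0.2                      # natural frequency and damping
T, N = 20.0, 4096                     # length of the time window, samples
t = T * np.arange(N) / N
Omega = 2.0 * np.pi * 5 / T           # forcing frequency (on the FFT grid)
f = np.sin(Omega * t)

# Decouple: one independent solve per frequency, then one inverse transform.
w = 2.0 * np.pi * np.fft.fftfreq(N, d=T / N)
u_hat = np.fft.fft(f) / (w0**2 - w**2 + 2j * g * w)
u = np.fft.ifft(u_hat).real

# Check against the analytic steady-state response to sin(Omega * t).
H = 1.0 / (w0**2 - Omega**2 + 2j * g * Omega)
u_exact = np.imag(H * np.exp(1j * Omega * t))
print("max error vs. analytic response:", np.abs(u - u_exact).max())
```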

For his 2016 DOE CSGF practicum at Lawrence Livermore National Laboratory, Anderson worked to tune a technique so it runs more efficiently on high-performance computers that use graphics processing units (GPUs) to accelerate calculations.

GPUs compute efficiently, but moving data to and from memory consumes energy and time. The group led by Anderson’s practicum advisor, Tzanio Kolev, wants to use GPUs in its calculations while minimizing how often information moves to and from memory.

“From their perspective, the most computing you can do on the same amount of information, the better,” Anderson says. The natural course is to use comparatively precise high-order methods. “You have to do a lot of work for a fixed amount of input data, but the accuracy you get for that unit of work is high.”

The high-order finite-element method Kolev’s group generally uses to divide the modeled domain can produce systems of equations so demanding that the associated matrices sometimes can’t even be stored in memory. The iterative methods used to cope with that issue can be slow to converge. To help, mathematicians apply a preconditioner, which accelerates the calculations by reducing the number of iterations needed to converge on an answer. Preconditioners transform a problem into a form that’s more amenable to numerical solution.
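A generic Python illustration of what a preconditioner buys (not the preconditioner Anderson built on the practicum): conjugate gradients applied to a badly scaled one-dimensional diffusion matrix converges in far fewer iterations once a simple diagonal, or Jacobi, preconditioner rescales the problem.

```python
# Generic illustration of preconditioning (not the practicum preconditioner):
# conjugate gradients on a 1D variable-coefficient diffusion matrix whose
# entries vary over several orders of magnitude. A diagonal (Jacobi)
# preconditioner transforms the system into one that converges much faster.
import numpy as np

def pcg(A, b, M_inv, tol=1e-8, maxiter=20000):
    """Preconditioned conjugate gradients; returns (solution, iterations)."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for it in range(1, maxiter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, it
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxiter

# 1D diffusion -d/dx(k(x) du/dx) with a coefficient spanning four orders of
# magnitude, discretized with standard second-order differences.
n = 200
rng = np.random.default_rng(0)
k = 10.0 ** (4 * rng.random(n + 1))            # face coefficients, 1 .. 1e4
A = (np.diag(k[:-1] + k[1:])
     - np.diag(k[1:-1], 1) - np.diag(k[1:-1], -1))
b = np.ones(n)

_, its_plain = pcg(A, b, lambda r: r)              # no preconditioner
_, its_jac = pcg(A, b, lambda r: r / np.diag(A))   # Jacobi preconditioner
print("CG iterations without preconditioner:", its_plain)
print("CG iterations with Jacobi preconditioner:", its_jac)
```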

In his practicum, Anderson developed a preconditioner that relied on a less precise, lower-order solution method but applied it to a finite-element mesh that is more tightly spaced, modeling the domain with higher resolution. “We call this low-order refined,” he says. “Instead of high order on a coarse grid we went to low order on a finer grid,” reducing complexity.
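A one-dimensional Python sketch of why this works (an illustration of the underlying spectral-equivalence idea, not the practicum code): the stiffness matrix of linear elements assembled on a high-order element’s Gauss-Lobatto nodes stays spectrally close to the high-order stiffness matrix even as the polynomial degree grows, which is what makes the cheaper low-order operator an effective preconditioner.

```python
# 1D illustration of the "low-order refined" idea (not the practicum code):
# piecewise-linear elements built on a high-order element's Gauss-Lobatto
# nodes give a stiffness matrix spectrally close to the high-order one, so
# the generalized eigenvalues stay bounded while the high-order matrix's own
# condition number grows with the polynomial degree p.
import numpy as np
from numpy.polynomial import legendre

def gll_nodes_weights(p):
    """Gauss-Lobatto-Legendre nodes and quadrature weights on [-1, 1]."""
    PN = legendre.Legendre.basis(p)
    x = np.concatenate(([-1.0], np.sort(PN.deriv().roots().real), [1.0]))
    w = 2.0 / (p * (p + 1) * PN(x) ** 2)
    return x, w

def diff_matrix(x, p):
    """Lagrange differentiation matrix on the GLL nodes."""
    PN = legendre.Legendre.basis(p)
    D = np.zeros((p + 1, p + 1))
    for i in range(p + 1):
        for j in range(p + 1):
            if i != j:
                D[i, j] = PN(x[i]) / (PN(x[j]) * (x[i] - x[j]))
    D[0, 0] = -p * (p + 1) / 4.0
    D[p, p] = p * (p + 1) / 4.0
    return D

for p in (4, 8, 16):
    x, w = gll_nodes_weights(p)
    D = diff_matrix(x, p)
    K_high = D.T @ np.diag(w) @ D                  # degree-p stiffness matrix

    K_low = np.zeros_like(K_high)                  # linear FE on the GLL nodes
    for e in range(p):
        h = x[e + 1] - x[e]
        K_low[e:e+2, e:e+2] += np.array([[1.0, -1.0], [-1.0, 1.0]]) / h

    # Restrict to interior nodes (homogeneous Dirichlet) and compare spectra.
    Kh, Kl = K_high[1:-1, 1:-1], K_low[1:-1, 1:-1]
    lam = np.sort(np.linalg.eigvals(np.linalg.solve(Kl, Kh)).real)
    print(f"p={p:2d}  cond(K_high)={np.linalg.cond(Kh):9.1e}  "
          f"preconditioned eigenvalues in [{lam[0]:.2f}, {lam[-1]:.2f}]")
```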

Anderson measured how this multigrid approach accelerated solutions and implemented it in a parallel processing scheme on the group’s computing cluster. The algorithms have been incorporated into the Kolev group’s modular finite-element methods code library, MFEM, and are available to users.

With graduation one to two years away, Anderson is considering his next move. He’s ruled out academia and finds DOE laboratory work intriguing but also is attracted to industry’s challenging problems.

Image caption: Simulation of time-varying wave scattering from a non-convex scatterer using new Fourier-based time-domain boundary integral equation methods. Credit: Thomas Anderson.