Stanford University science doctoral students, including Department of Energy Computational Science Graduate Fellowship (DOE CSGF) recipient Claire-Alice Hébert, typically rotate through a few of their department’s research groups in their first year, sampling various subjects before choosing one for their thesis work.
Hébert first worked with photonics and biophysics groups but wasn’t satisfied. “I knew I liked handling data sets and trying to find interesting information from them,” she says. Cosmology offered thorny data, so she chose to spend her final rotation with Patricia Burchat, who was also known as a great mentor.
Cosmology was “not something I had thought of,” Hébert says, but she realized “the questions are really hard. You’re trying to measure quantities that are hidden in noise. You have to be really clever about how you disentangle things.”
Finding correct signals is critical if cosmologists are to measure the universe’s properties rather than noise from the telescope or, in the case of Hébert’s research, distortions the atmosphere causes as light passes through.
Hébert and Burchat are among hundreds of researchers in the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration. The project will use the telescope (recently renamed the Vera Rubin Observatory), now under construction on a Chilean mountaintop, to track the evolution of dark energy, the mysterious force contributing to the universe’s accelerating expansion. To do so, the team will calculate the distribution of dark matter, which makes its presence known through gravity strong enough to bend light from distant galaxies in a process called gravitational lensing.
The team wants to measure the small changes in the galaxies’ shapes that gravitational lensing causes, so it’s critical that the images are free of interference from terrestrial sources – including the atmosphere.
Stars appear to twinkle “because their light travels through the atmosphere, which has wind and pressure variations that deflect the light on small scales,” Hébert says. These atmospheric perturbations also can alter galaxies’ appearances.
To correct for these effects, astronomers use models of the point spread function (PSF), which describes the distortions introduced by the atmosphere and telescope. To validate these models and understand the PSF for the observatory, Hébert compared predicted atmospheric behavior with observations gathered by instruments sited near the telescope’s location.
In addition to correcting for PSF distortions after images are captured, Hébert and her colleagues want to avoid them through smart telescope management. For example, PSF effects tend to wash out if images are captured over long exposures. Hébert wants to better understand how quickly that happens and how researchers can use it to optimize their imaging techniques.
Now Hébert is using data from a telescope located near the Rubin Observatory site that captures many short exposures of stars, in effect making a movie of how the PSF evolves. She compares that with environmental data, such as wind speed, gathered simultaneously. That will address questions such as “if the wind speeds are high, do we know that we’re also going to have a really elliptical PSF, or are they unrelated?” – helping guide observations.
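The washing-out effect the article describes can be illustrated with a toy calculation (my sketch, not the collaboration’s code, and the jitter scale is hypothetical): each short exposure’s PSF is deflected randomly by the atmosphere, and averaging N exposures shrinks the net deflection roughly as 1/√N, which is why the accumulated image smooths out.

```python
import numpy as np

# Toy model: per-exposure PSF centroid jitter from atmospheric deflection.
# The accumulated (long-exposure) PSF center is the running mean of the
# short-exposure centers, so the jitter washes out roughly as 1/sqrt(N).
rng = np.random.default_rng(0)

n_trials, n_frames = 200, 400          # average many trials for a stable estimate
jitter = rng.normal(0.0, 1.0, size=(n_trials, n_frames, 2))  # hypothetical units

# Running mean of the centroid after each frame, per trial.
cum_mean = np.cumsum(jitter, axis=1) / np.arange(1, n_frames + 1)[None, :, None]

# Typical distance of the accumulated center from its long-run position.
drift = np.linalg.norm(cum_mean, axis=2).mean(axis=0)

print(drift[9], drift[-1])  # drift after 10 frames vs. after 400 frames
```

The residual drift after 400 frames is several times smaller than after 10, mirroring how the cumulative image in the video caption loses its speckled character once enough exposures are added.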
Besides running on Stanford computing clusters, Hébert has tapped Cori at DOE’s National Energy Research Scientific Computing Center and a SLAC National Accelerator Laboratory machine.
Hébert’s 2018 Los Alamos National Laboratory (LANL) practicum – and a short sequel in 2019 – also focused on finding signals in a jumble of information. The target was spectra generated when a laser zaps a rock, producing a glowing plasma. The leading example: the Mars rover Curiosity’s ChemCam, which LANL scientists helped develop.
“The challenge with those spectra is it’s sometimes hard to disentangle what they mean,” especially what elements are present and in what quantities, Hébert says. Models can provide a guide, but with more than 100 elements, it would take years to calculate myriad possible combinations.
In her practicum with Kary Myers of LANL’s Statistical Sciences Group, Hébert tried two methods to reduce the problem’s dimensions and created emulators – smaller, faster programs built on selected solutions from the larger problem. Emulators predict approximate results for potential parameter combinations and calculate uncertainty for each.
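The article doesn’t specify which emulator the team built, but a common choice for this role is a Gaussian-process surrogate, which both predicts the expensive model’s output at new parameter values and attaches an uncertainty to each prediction. This sketch uses a stand-in “expensive model” and illustrative parameters of my own choosing:

```python
import numpy as np

def rbf_kernel(a, b, length=0.5):
    """Squared-exponential similarity between parameter values."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def expensive_model(x):
    """Stand-in for a full physics calculation (toy function)."""
    return np.sin(3 * x) + 0.5 * x

# A handful of "expensive" training runs at selected parameter values.
x_train = np.linspace(0.0, 2.0, 8)
y_train = expensive_model(x_train)

# Emulate the model on a fine grid of new parameter values.
x_new = np.linspace(0.0, 2.0, 50)
K = rbf_kernel(x_train, x_train) + 1e-8 * np.eye(x_train.size)  # jitter for stability
K_s = rbf_kernel(x_new, x_train)

mean = K_s @ np.linalg.solve(K, y_train)                 # emulator prediction
var = 1.0 - np.einsum('ij,ij->i', K_s,                   # per-point uncertainty
                      np.linalg.solve(K, K_s.T).T)
```

The uncertainty collapses near the training runs and grows between them, which is exactly the property that lets an emulator report how trustworthy each approximate result is.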
Hébert developed and used emulators to analyze a simulated spectrum and calibrated them with a Markov chain Monte Carlo (MCMC) routine, which repeatedly samples parameter values to approximate the distribution of solutions. In 2019 she compared the emulators and worked to improve the MCMC code’s performance – a task she’s still tackling. Hébert expects to present the work at a conference.
Hébert says the practicums confirmed her interest in data analysis. She’ll graduate in 2021 or early 2022 – about when the Rubin Observatory begins operating. Hébert hopes she can continue working with the project, using information from the real thing.
Video caption: This shows a series of images taken over 20 seconds by the Zorro instrument at the Gemini South Observatory, less than 2 km from the Vera Rubin Observatory (formerly the Large Synoptic Survey Telescope) site in Chile. Both panels show the point spread function (PSF): the left panel is a near-instantaneous image (60-millisecond exposure) taken at the time given in the title, while the right panel shows the PSF accumulated up to that time. The cumulative image quickly loses its initial speckled nature and becomes smooth and regular with the addition of enough images. Credit: Claire-Alice Hébert.