Gil Goldshlager

Program Year:
4
University:
University of California, Berkeley
Field of Study:
Applied Mathematics
Advisor:
Lin Lin
Degree(s):
B.S. Mathematics with Computer Science, Massachusetts Institute of Technology, 2017

Summary of Research

I develop algorithms for simulating atoms, molecules, and materials at the level of quantum physics. This is traditionally known as the electronic structure problem since the behavior of the electrons dictates most relevant properties. I am drawn to electronic structure because it is an area in which fundamental and fascinating algorithmic challenges intersect with important scientific and technological applications. I am particularly motivated by the potential for better electronic structure algorithms to contribute to the development of environmentally sustainable technologies. My current focus is on developing optimization algorithms for neural network wavefunctions, which are a promising new approach for making high-accuracy predictions for challenging strongly correlated systems.

For more information, visit ggoldshlager.com.

Publications

Fast Convergence Rates for Subsampled Natural Gradient Algorithms on Quadratic Model Problems. Gil Goldshlager, Jiang Hu, Lin Lin. August 2025, arXiv preprint arXiv:2508.21022.

Expressivity of determinantal ansatzes for neural network wave functions. Ni Zhan, William A. Wheeler, Gil Goldshlager, Elif Ertekin, Ryan P. Adams, Lucas K. Wagner. July 2025, arXiv preprint arXiv:2506.00155.

Improving Energy Natural Gradient Descent through Woodbury, Momentum, and Randomization. Andres Guzman-Cordero, Felix Dangel, Gil Goldshlager, and Marius Zeinhofer. May 2025, arXiv preprint arXiv:2505.12149.

Worth Their Weight: Randomized and Regularized Block Kaczmarz Algorithms without Preprocessing. Gil Goldshlager, Jiang Hu, and Lin Lin. February 2025, arXiv preprint arXiv:2502.00882.

Randomized Kaczmarz with Tail Averaging. Ethan Epperly, Gil Goldshlager, Robert J. Webber. December 2024, arXiv preprint arXiv:2411.19877.

A Kaczmarz-inspired approach to accelerate the optimization of neural network wavefunctions. Gil Goldshlager, Nilin Abrahamsen, Lin Lin. November 2024, Journal of Computational Physics https://doi.org/10.1016/j.jcp.2024.113351.

Convergence of stochastic gradient descent on parameterized sphere with applications to variational Monte Carlo simulation. Nilin Abrahamsen, Zhiyan Ding, Gil Goldshlager, and Lin Lin. September 2024, Journal of Computational Physics https://doi.org/10.1016/j.jcp.2024.113140.

Explicitly antisymmetrized neural network layers for variational Monte Carlo simulation. Jeffmin Lin, Gil Goldshlager, and Lin Lin. February 2023, Journal of Computational Physics https://doi.org/10.1016/j.jcp.2022.111765.

Faucet: streaming de novo assembly graph construction. Roye Rozov, Gil Goldshlager, Eran Halperin, Ron Shamir. January 2018, Bioinformatics https://doi.org/10.1093/bioinformatics/btx471.

Approximating kCSP for large alphabets. Gil Goldshlager and Dana Moshkovitz. Technical report available at http://people.csail.mit.edu/dmoshkov/papers/Approximating%20MAX%20kCSP.pdf.

Awards

Simons Dissertation Fellow in Mathematics; 2025-2027
H2H8 Graduate Research Grant Recipient; 2025-2026
Phi Beta Kappa; 2017
Morais and Rosenblum Award for outstanding undergraduate research; May 2014
USA Junior Math Olympiad (USAJMO) winner; Spring 2009