Ethan Epperly

  • Program Year: 4
  • Academic Institution: California Institute of Technology
  • Field of Study: Applied and Computational Mathematics
  • Academic Advisor: Joel Tropp
  • Practicum(s):
    Lawrence Berkeley National Laboratory (2021)
  • Degree(s):
B.S. Mathematics and B.S. Computing, University of California, Santa Barbara, 2020

Summary of Research

Ethan's research focuses on the design and analysis of algorithms for large-scale linear algebra problems, with an emphasis on randomized methods. His work can be used to predict molecular properties in large chemistry datasets, identify the most important nodes in a large network, and robustly solve challenging physics problems on quantum computers.


Publications

Z. Ding, E. N. Epperly, L. Lin, & R. Zhang (2024). The ESPRIT algorithm under high noise: Optimal error scaling and noisy super-resolution. Foundations of Computer Science (FOCS) 2024, accepted.
E. N. Epperly (2024). Fast and forward stable randomized algorithms for linear least-squares problems. SIAM Journal on Matrix Analysis and Applications, accepted.
E. N. Epperly, M. Meier, & Y. Nakatsukasa (2024). Fast randomized least-squares solvers can be just as accurate and stable as classical direct solvers. arXiv preprint arXiv:2406.03468 [math.NA].
H. Wilber, E. N. Epperly, & A. H. Barnett (2024). A superfast direct inversion method for the nonuniform discrete Fourier transform. arXiv preprint arXiv:2404.13223 [math.NA].
E. N. Epperly & J. A. Tropp (2024). Efficient error and variance estimation for randomized matrix computations. SIAM Journal on Scientific Computing.
E. N. Epperly, J. A. Tropp, & R. J. Webber (2024). XTrace: Making the most of every sample in stochastic trace estimation. SIAM Journal on Matrix Analysis and Applications.
E. N. Epperly & E. Moreno (2023). Kernel quadrature with randomly pivoted Cholesky. NeurIPS 2023, spotlight.
M. Diaz, E. N. Epperly, Z. Frangella, J. A. Tropp, & R. J. Webber (2023). Robust, randomized preconditioning for kernel ridge regression. arXiv preprint arXiv:2304.12465 [math.NA].
Y. Chen, E. N. Epperly, J. A. Tropp, & R. J. Webber (2022). Randomly pivoted Cholesky: Practical approximation of a kernel matrix with few entry evaluations. arXiv preprint arXiv:2207.06503 [math.NA].
E. N. Epperly, L. Lin, & Y. Nakatsukasa (2022). A theory of quantum subspace diagonalization. SIAM Journal on Matrix Analysis and Applications.
N. Govindarajan, E. N. Epperly, & L. De Lathauwer (2022). (L_r,L_r,1) decompositions, sparse component analysis, and blind source separation. SIAM Journal on Matrix Analysis and Applications.
E. N. Epperly, N. Govindarajan, & S. Chandrasekaran (2021). Minimal rank completions for overlapping blocks. Linear Algebra and its Applications.
E. N. Epperly, A. T. Barker, & R. D. Falgout (2020). Smoothers for matrix-free algebraic multigrid preconditioning of high-order finite elements. LLNL Technical Report.
E. N. Epperly & R. B. Sills (2020). Transient solute drag and strain aging of dislocations. Acta Materialia.
E. N. Epperly & R. B. Sills (2020). Comparison of continuum and cross-core theories of dynamic strain aging. Journal of the Mechanics and Physics of Solids, 103944.
S. Chandrasekaran, E. N. Epperly, & N. Govindarajan (2019). Graph-induced rank structures and their representations. arXiv preprint arXiv:1911.05858 [math.NA].


Awards

ICIAM 2023 Student Travel Award, 2023
Thomas A. Tisch Prize for Graduate Teaching in CMS, 2022
UCSB Mathematics Department Raymond L. Wilder Award, 2020
UCSB Chancellor's Award for Excellence in Undergraduate Research, 2020
Hertz Foundation Fellowship Finalist, 2020
NSF Graduate Research Fellowship, 2020 (declined in favor of DOE CSGF)
Achievement in Mathematics Award, Las Positas College, 2016