The Vera C. Rubin Observatory will undertake the most ambitious astronomical survey of the universe to date, repeatedly observing billions of stars and galaxies over the course of 10 years. This survey, called the Legacy Survey of Space and Time (LSST), will enable extensive studies of the universe, including cataloging nearby solar system objects, discovering variable stars, and precisely measuring dark energy, the still poorly understood source of energy that drives the accelerated expansion of space. One thing that sets the Rubin Observatory survey apart from previous efforts is the sheer quantity of images: the volume of data from this telescope will dwarf that of all previous astronomical surveys combined! This enormous volume (20 terabytes per night; hundreds of petabytes in total) will require sophisticated, automated algorithms to process. Ultimately, it enables very precise measurements, in which the statistical uncertainties on measured quantities are unprecedentedly small, but precision does not guarantee accuracy. Accurate measurements require significant research efforts to understand, quantify, and mitigate the potential biases that instrumentation or algorithms may introduce. Two effects I have been studying are atmospheric turbulence, which distorts the images of faraway objects such as galaxies billions of light years away, and correlated instrument noise. Both effects alter the observed shapes and sizes of objects in spatially correlated ways, which in turn biases our measurements of the correlations induced by cosmological effects. These are some of the many challenges ahead for scientists preparing to use LSST data. In my talk, I will outline the scientific goals of LSST, the physical modeling and the statistical and computational techniques I have been using to understand and quantify potential systematic biases, and some of the outstanding computational challenges for this ambitious survey.
Precision Cosmology With the Vera C. Rubin Observatory