Automatic Differentiation in Julia

Miles Lubin, Massachusetts Institute of Technology


Automatic differentiation (AD), also called algorithmic differentiation, is a set of techniques for computing exact numerical derivatives of user-provided code, ideally without requiring invasive modifications. Julia, a recently developed high-level language for scientific computing, offers a number of technical advancements, namely just-in-time compilation, metaprogramming, and multiple dispatch, that can change the way AD is implemented in practice, making it both more user-friendly and more efficient. We present an implementation of forward- and reverse-mode AD for computing exact sparse second-order derivatives that is now integrated into JuMP, an open-source modeling language embedded in Julia that can express and solve nonlinear programming problems with connections to state-of-the-art solvers such as Ipopt.
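To illustrate how multiple dispatch enables forward-mode AD, the following is a minimal sketch of a dual-number type in Julia. This is an illustrative example only, not JuMP's actual implementation; the `Dual` type, the `derivative` helper, and the example function `f` are all hypothetical names introduced here.

```julia
# A dual number carries a value and a derivative component.
# Propagating the derivative through overloaded arithmetic
# yields exact derivatives of user code (forward-mode AD).
struct Dual <: Number
    val::Float64   # function value
    der::Float64   # derivative value
end

import Base: +, -, *, sin, convert, promote_rule

# Multiple dispatch selects these methods whenever a Dual appears;
# Julia's JIT compiles them to specialized machine code.
+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
-(a::Dual, b::Dual) = Dual(a.val - b.val, a.der - b.der)
*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
sin(a::Dual) = Dual(sin(a.val), cos(a.val) * a.der)

# Promotion lets Duals mix freely with ordinary numbers.
convert(::Type{Dual}, x::Real) = Dual(float(x), 0.0)
promote_rule(::Type{Dual}, ::Type{<:Real}) = Dual

# Differentiate f at x by seeding the derivative component with 1.
derivative(f, x::Real) = f(Dual(float(x), 1.0)).der

# Example: d/dx (x^2 + sin x) = 2x + cos x
f(x) = x * x + sin(x)
```

Because the user's function `f` is written for generic numbers, no modification to it is needed; calling it with a `Dual` argument is enough to obtain an exact derivative.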
