Staging Automatic Differentiation with Fusion
Automatic differentiation (AD) is a family of algorithms with many applications in scientific computing and machine learning. These algorithms compute numerical derivatives by interpreting or transforming the source code of numeric expressions. The correctness and efficiency of AD algorithms have been the subject of much research, especially for reverse-mode AD, which is efficient for computing large gradients but generates highly non-obvious code.
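As a minimal illustration of the "interpreting numeric expressions" idea (not the paper's reverse-mode algorithm), forward-mode AD can be sketched in Haskell with dual numbers, where every value carries its derivative alongside it:

```haskell
-- A hypothetical sketch: forward-mode AD via dual numbers.
-- Each Dual pairs a value (primal) with its derivative (tangent).
data Dual = Dual { primal :: Double, tangent :: Double }

instance Num Dual where
  Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
  Dual x dx * Dual y dy = Dual (x * y) (x * dy + dx * y)  -- product rule
  negate (Dual x dx)    = Dual (negate x) (negate dx)
  abs    (Dual x dx)    = Dual (abs x) (signum x * dx)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = Dual (fromInteger n) 0          -- constants: tangent 0

-- Derivative of f at x: seed the input with tangent 1.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = tangent (f (Dual x 1))

main :: IO ()
main = print (diff (\x -> x * x + 3 * x) 2)  -- d/dx (x^2 + 3x) at 2 is 7
```

Reverse-mode AD, the focus of the paper, instead propagates derivatives backwards from the output, which is what makes the generated code far less obvious than this forward-mode version.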
In this paper, we build on the earlier approach of van den Berg et al. [Science of Computer Programming, 231 (2024)], reverse-engineering their algorithm in terms of established program transformation techniques from abstract algebra and functional programming. We transform the existing interpreter-based approach into an efficient code generator by means of staging with Template Haskell, and leverage a fold fusion variant for staging and algebraic preservation maps to perform correct-by-construction binding-time improvements.
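For readers unfamiliar with fold fusion, the classic law it builds on can be sketched as follows (this is the standard `foldr` fusion law, not the paper's staged variant):

```haskell
-- Standard fold fusion law (an assumption of this sketch, not the
-- paper's formulation):
--   h . foldr f e = foldr g (h e)   whenever   h (f x ys) = g x (h ys)
-- Example: sumSq traverses the list twice; sumSq' is the fused,
-- single-pass version that fusion derives.
sumSq :: [Double] -> Double
sumSq = sum . map (^ 2)                    -- two passes: map, then sum

sumSq' :: [Double] -> Double
sumSq' = foldr (\x acc -> x * x + acc) 0   -- one fused pass

main :: IO ()
main = print (sumSq [1, 2, 3], sumSq' [1, 2, 3])
```

The paper's contribution is to apply a variant of this law at the staging level, so that the fused program is produced as generated code (via Template Haskell) rather than merely executed more efficiently.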
The generated code achieves speedups of up to a factor of 110.
Thu 16 Oct (time zone: Perth)
16:00 - 17:30
- 16:00 (30m) Research paper: A Clash Course in Solving Sudoku (Functional Pearl), Haskell. Gergő Érdi (Standard Chartered Bank). Pre-print available.
- 16:30 (30m) Research paper: Staging Automatic Differentiation with Fusion, Haskell.