
Backpropagation is a classic automatic differentiation algorithm computing the gradient of functions specified by a certain class of simple, first-order programs, called computational graphs. It is a fundamental tool in several fields, most notably machine learning, where it is the key to efficiently training (deep) neural networks. Recent years have witnessed the rapid growth of a research field called differentiable programming, the aim of which is to express computational graphs more synthetically and modularly by resorting to actual programming languages endowed with control flow operators and higher-order combinators, such as map and fold. In this paper, we extend the backpropagation algorithm to a paradigmatic example of such a programming language: we define a compositional program transformation from the simply-typed lambda-calculus to itself augmented with a notion of linear negation, and prove that it computes the gradient of the source program with the same efficiency as first-order backpropagation. The transformation is completely effect-free and thus provides a purely logical understanding of the dynamics of backpropagation.
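To give a concrete flavour of backpropagation seen as a compositional program transformation, here is a minimal sketch, not the paper's actual construction on the simply-typed lambda-calculus with linear negation: each real-valued expression is paired with a "backpropagator" that maps an output cotangent to a gradient. All names below (d, add, mul, sin_) are illustrative assumptions, and this naive pairing does not achieve the sharing needed for the efficiency result proved in the paper.

```ocaml
(* Illustrative sketch only, not the construction from the paper: a tiny
   reverse-mode AD in OCaml where every value carries a "backpropagator"
   mapping the output cotangent to a gradient. Gradients are pairs
   (df/dx, df/dy) for a fixed two-variable example; all names here are
   hypothetical. *)

type grad = float * float

let gadd (a, b) (c, d) = (a +. c, b +. d)

(* A value together with its backpropagator. *)
type d = { v : float; bp : float -> grad }

(* d(x + y) = dx + dy: forward the cotangent to both arguments. *)
let add a b =
  { v = a.v +. b.v; bp = (fun s -> gadd (a.bp s) (b.bp s)) }

(* d(x * y) = y dx + x dy: scale the cotangent by the other factor. *)
let mul a b =
  { v = a.v *. b.v; bp = (fun s -> gadd (a.bp (s *. b.v)) (b.bp (s *. a.v))) }

(* d(sin x) = cos x dx *)
let sin_ a =
  { v = sin a.v; bp = (fun s -> a.bp (s *. cos a.v)) }

(* Example: f (x, y) = x * y + sin x at (2, 3);
   expect df/dx = y + cos x and df/dy = x. *)
let () =
  let x = { v = 2.0; bp = (fun s -> (s, 0.0)) } in
  let y = { v = 3.0; bp = (fun s -> (0.0, s)) } in
  let z = add (mul x y) (sin_ x) in
  let dx, dy = z.bp 1.0 in
  Printf.printf "f = %f  df/dx = %f  df/dy = %f\n" z.v dx dy
```

The paper's transformation operates on lambda-terms directly, with linear negation playing the role of the backpropagators sketched here, which is what allows it to match the efficiency of first-order backpropagation.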

Slides: Backprop.pdf (184 KiB)

Wed 22 Jan
Times are displayed in time zone (GMT-06:00) Saskatchewan, Central America.

POPL 2020 Research Papers
15:35 - 16:40: Research Papers - Automatic Differentiation / Kleene Algebra at Ile de France II (IDF II)
Chair(s): Lars Birkedal (Aarhus University)

15:35 - 15:56  Talk

15:56 - 16:18  Talk
Aloïs Brunel (Deepomatic), Damiano Mazza (CNRS), Michele Pagani (IRIF - Université de Paris)

16:18 - 16:40  Talk
Steffen Smolka (Cornell University), Nate Foster (Cornell University), Justin Hsu (University of Wisconsin-Madison, USA), Tobias Kappé (University College London), Dexter Kozen (Cornell University), Alexandra Silva (University College London)