You Should Be Using Automatic Differentiation
Wednesday, November 11, 2015, 11:00 am-12:00 pm
Abstract

A big part of machine learning is optimization of continuous
functions.  Whether for deep neural networks, structured prediction,
or variational inference, machine learners spend a lot of time taking
gradients and verifying them.  It turns out, however, that computers
are good at doing this kind of calculus automatically, and automatic
differentiation tools are becoming more mainstream and easier to use.
In this talk, I will give an overview of automatic differentiation,
with a particular focus on Autograd, a tool my research group is
developing for Python.  I will also give several vignettes about using
Autograd to learn hyperparameters in neural networks, perform
variational inference, and design new organic molecules.  This is
joint work with David Duvenaud and Dougal Maclaurin.
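
For a sense of what this looks like in practice, here is a minimal sketch of Autograd's core usage (the tanh function below is an illustrative choice, not an example from the talk): grad takes an ordinary Python function and returns a new function that computes its gradient.

    import autograd.numpy as np   # Autograd's thin wrapper around NumPy
    from autograd import grad

    def tanh(x):
        # an ordinary Python function, written with the wrapped NumPy
        return (1.0 - np.exp(-2 * x)) / (1.0 + np.exp(-2 * x))

    dtanh = grad(tanh)   # dtanh(x) computes d tanh / dx at x
    print(dtanh(1.0))    # ~0.419974, matching 1 - tanh(1)**2

Because the derivative is traced through the actual Python computation, there is no need to derive or hand-verify the gradient expression, which is the point of the talk's title.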

Bio

Ryan Adams is Head of Research at Twitter Cortex and an Assistant
Professor of Computer Science at Harvard.  He received his Ph.D. in
Physics at Cambridge as a Gates Scholar.  He was a CIFAR Junior
Research Fellow at the University of Toronto before joining the
faculty at Harvard.  He has won paper awards at ICML, AISTATS, and
UAI, and his Ph.D. thesis received Honorable Mention for the Savage
Award for Theory and Methods from the International Society for
Bayesian Analysis.  He also received the DARPA Young Faculty Award and
the Sloan Fellowship.  Dr. Adams was the CEO of Whetlab, a machine
learning startup that was recently acquired by Twitter, and co-hosts
the Talking Machines podcast.

This talk is organized by Naomi Feldman.