
Introduction to Automatic Differentiation

Authors: Atılım Güneş Baydin, Barak A. Pearlmutter, Alexey Andreyevich Radul, Jeffrey Mark Siskind

Journal of Machine Learning Research 18


April 2018

Article

What I understood:
Symbolic differentiation requires the closed-form expression of the equation as input; it cannot work from the algorithm's code alone.

Automatic differentiation can differentiate not only closed-form equations but also sequences of code.

Expression swell: symbolic differentiation can produce derivative expressions that grow much larger than the original expression (see the sketch below).
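To make the expression-swell point concrete, here is a small sketch of my own (not code from the paper), assuming SymPy is available: it symbolically differentiates a repeatedly composed function and compares the sizes of the original and derivative expression trees.

```python
# Hypothetical illustration of expression swell in symbolic differentiation.
import sympy as sp

x = sp.symbols('x')

# Build a nested closed-form expression by repeated composition.
expr = x
for _ in range(4):
    expr = sp.sin(expr) * sp.cos(expr)

# Symbolic derivative of the nested expression.
deriv = sp.diff(expr, x)

# Rough size measure: number of operations in each expression tree.
print("original size  :", sp.count_ops(expr))
print("derivative size:", sp.count_ops(deriv))
```

The derivative tree comes out several times larger than the original, which is the swell the notes refer to; automatic differentiation avoids this by propagating numerical values instead of building a derivative expression.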

Quotations:
"Following the emergence of deep learning (LeCun et al., 2015; Goodfellow et al., 2016) as the state-of-the-art in many machine learning tasks and the modern workflow based on rapid prototyping and code reuse in frameworks such as Theano (Bastien et al., 2012), Torch (Collobert et al., 2011), and TensorFlow (Abadi et al., 2016), the situation is slowly changing where projects such as autograd (Maclaurin, 2016), Chainer (Tokui et al., 2015), and PyTorch (Paszke et al., 2017) are leading the way in bringing general-purpose AD to the mainstream." (Page 2)

"We would like to stress that AD as a technical term refers to a specific family of techniques that compute derivatives through accumulation of values during code execution to generate numerical derivative evaluations rather than derivative expressions." (Page 2)

structural mechanics (Haase et al., 2002)

"An important point to note here is that AD can differentiate not only closed-form expressions in the classical sense, but also algorithms making use of control flow such as branching, loops, recursion, and procedure calls."
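The following is a minimal sketch of my own (not code from the paper) of how this works, assuming forward-mode AD with dual numbers: the class `Dual` and the function `f` are made up for illustration, and `f` is defined by code containing a loop and a branch rather than a closed-form formula.

```python
# Minimal forward-mode AD sketch using dual numbers (illustrative only).
from dataclasses import dataclass
import math

@dataclass
class Dual:
    val: float   # primal value
    dot: float   # tangent (derivative) carried alongside the value

    def __add__(self, other):
        return Dual(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        # Product rule applied value-by-value during execution.
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

def sin(d):
    return Dual(math.sin(d.val), math.cos(d.val) * d.dot)

def f(x):
    # A function defined by code with control flow, not a closed-form expression.
    y = x
    for _ in range(3):          # loop
        if y.val > 0.5:         # branch on the primal value
            y = y * x
        else:
            y = sin(y) + x
    return y

# Differentiate f at x = 0.7 by seeding the input tangent with 1.0.
result = f(Dual(0.7, 1.0))
print("f(0.7)  =", result.val)
print("f'(0.7) =", result.dot)
```

Because derivatives are accumulated as numbers while the code runs, the loop and the branch pose no problem: whichever path executes is the path that gets differentiated.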
