# Autograd Basics

## Scope
* Understand how backpropagation works in theory
* Understand how to derive backward formulas and how to add a backward formula to an operator
* Understand what a composite autograd operator is and when it is useful
* Know when to use gradcheck and custom autograd Functions
* (optional) Understand how the autograd graph gets built and executed
## Introduction to backpropagation
Read through this introduction to backpropagation: [link](https://colab.research.google.com/drive/1aWNdmYt7RcHMbUk-Xz2Cv5-cGFSWPXe0).
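If you want to run something alongside the notebook, here is a minimal sketch (not taken from the notebook, just an assumed representative example) that backpropagates through a tiny computation and compares the result with the hand-derived gradients:

```python
import torch

# y = sum(sin(x) * w); by the chain rule, dy/dx = cos(x) * w and dy/dw = sin(x)
x = torch.randn(3, requires_grad=True)
w = torch.randn(3, requires_grad=True)

y = (torch.sin(x) * w).sum()
y.backward()  # autograd walks the recorded graph in reverse, accumulating gradients

print(torch.allclose(x.grad, torch.cos(x) * w))  # True
print(torch.allclose(w.grad, torch.sin(x)))      # True
```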
## Given an operator, how do I derive a backward formula for it?
How to derive a backward formula for torch.sin (real case): [link](https://colab.research.google.com/drive/1lUU5JUh0h-8XwaavyLuOkQfeQgn4m8zr).
How to derive a backward formula for torch.mm (real case): [link](https://colab.research.google.com/drive/1z6641HKB51OfYJMCxOFo0lYd7viytnIG).
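As a hedged summary of the kind of result those notebooks arrive at (assuming real-valued inputs, with the incoming gradient written as dL/dy or G), the chain-rule derivations are:

```latex
% torch.sin: y = \sin(x), with upstream gradient \partial L / \partial y
\frac{\partial L}{\partial x} = \frac{\partial L}{\partial y} \cdot \cos(x)

% torch.mm: C = A B, with upstream gradient G = \partial L / \partial C
\frac{\partial L}{\partial A} = G B^{\top}, \qquad
\frac{\partial L}{\partial B} = A^{\top} G
```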
## Given a new operator, how do I write a new backward formula? (using derivatives.yaml)
Coming soon!
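Until this section is written, the sketch below shows roughly what an entry in tools/autograd/derivatives.yaml looks like; the exact schema and the expression used for sin in the current codebase may differ, so treat it as illustrative only.

```yaml
# Illustrative only: check tools/autograd/derivatives.yaml in the repo for the
# current schema. Each entry pairs an ATen signature with an expression, written
# in terms of `grad` and the inputs, for each differentiable input.
- name: sin(Tensor self) -> Tensor
  self: grad * self.cos()
```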
## When should I write a new backward formula?
Coming soon!
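In the meantime, one rule of thumb (an assumption based on the composite-operator item in the Scope above): if an operator can be written as a composite of existing differentiable ops, autograd differentiates it automatically and no explicit backward formula is needed; explicit formulas are mainly for new primitive ops, or when the automatic backward is too slow or numerically unstable. A minimal sketch of a composite operator:

```python
import torch

# A composite operator: written entirely in terms of existing differentiable
# torch ops, so autograd derives its backward automatically.
def log1p_exp(x):
    return torch.log1p(torch.exp(x))  # log(1 + exp(x)), written naively for illustration

x = torch.randn(4, dtype=torch.double, requires_grad=True)
log1p_exp(x).sum().backward()
# d/dx log(1 + exp(x)) = sigmoid(x), and autograd agrees without a hand-written formula
print(torch.allclose(x.grad, torch.sigmoid(x)))
```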
## How do I test an autograd formula?
Coming soon!
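Until this section is filled in, note that the usual tool is torch.autograd.gradcheck, which compares autograd's analytical gradients against finite-difference estimates; a minimal sketch, assuming double-precision inputs:

```python
import torch
from torch.autograd import gradcheck

# gradcheck compares autograd's analytical gradients against finite-difference
# estimates; double precision keeps the numerical comparison tight.
x = torch.randn(5, dtype=torch.double, requires_grad=True)
print(gradcheck(torch.sin, (x,)))  # True if sin's backward formula is correct
```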
## What are custom autograd functions?
Coming soon!
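As a placeholder, here is a minimal sketch of a custom autograd Function, i.e. a subclass of torch.autograd.Function with explicit forward and backward static methods, using sin so the result can be checked against the built-in:

```python
import torch

class MySin(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)   # stash inputs that backward will need
        return torch.sin(x)

    @staticmethod
    def backward(ctx, grad_output):
        x, = ctx.saved_tensors
        return grad_output * torch.cos(x)  # chain rule: dL/dx = dL/dy * cos(x)

x = torch.randn(3, dtype=torch.double, requires_grad=True)
MySin.apply(x).sum().backward()
print(torch.allclose(x.grad, torch.cos(x)))  # matches the built-in torch.sin backward
```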
## Lab: Derive the backward formula for a fake operator
Coming soon!