# Autograd Basics

Coming soon
## Scope
* Understand how backpropagation works in theory
* Understand how to derive backward formulas and how to add a backward formula to an operator
* Understand what a composite autograd operator is and when it is useful
* Know when to use gradcheck and custom autograd Functions
* (optional) Understand how the autograd graph gets built and executed
## Introduction to backpropagation
Read through this [introductory notebook](https://colab.research.google.com/drive/1aWNdmYt7RcHMbUk-Xz2Cv5-cGFSWPXe0).
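The chain-rule mechanics covered there can be tried directly in PyTorch. A minimal sketch: compute the gradient of a small composed function with `backward()` and check it against the hand-derived formula.

```python
import torch

# f(x) = sin(x)^2; by the chain rule, df/dx = 2 * sin(x) * cos(x)
x = torch.tensor(0.5, requires_grad=True)
y = torch.sin(x) ** 2
y.backward()  # backpropagation populates x.grad

expected = 2 * torch.sin(torch.tensor(0.5)) * torch.cos(torch.tensor(0.5))
print(torch.allclose(x.grad, expected))  # True
```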
## Given an operator, how do I derive a backward formula for it?
How to derive the formula for `torch.sin` (a real case): [notebook](https://colab.research.google.com/drive/1lUU5JUh0h-8XwaavyLuOkQfeQgn4m8zr).
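As a preview of what that notebook derives: since d/dx sin(x) = cos(x), the backward pass for `out = torch.sin(x)` multiplies the incoming gradient by cos(x). A quick numerical check of that formula:

```python
import torch

# For out = sin(x), the backward formula is grad_x = grad_out * cos(x)
x = torch.randn(3, requires_grad=True)
out = torch.sin(x)
grad_out = torch.ones_like(out)
out.backward(grad_out)

print(torch.allclose(x.grad, grad_out * torch.cos(x)))  # True
```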
How to derive the formula for `torch.mm` (a real case): [notebook](https://colab.research.google.com/drive/1z6641HKB51OfYJMCxOFo0lYd7viytnIG).
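The matrix-multiply case is the standard example of a multi-input backward: for `C = A.mm(B)`, the gradient with respect to `A` is `grad_C @ B.T` and with respect to `B` is `A.T @ grad_C`. A sketch that verifies both against autograd:

```python
import torch

# For C = A.mm(B), the backward formulas are:
#   grad_A = grad_C @ B.T   (same shape as A)
#   grad_B = A.T @ grad_C   (same shape as B)
A = torch.randn(2, 3, requires_grad=True)
B = torch.randn(3, 4, requires_grad=True)
C = A.mm(B)
grad_C = torch.randn_like(C)
C.backward(grad_C)

print(torch.allclose(A.grad, grad_C.mm(B.T)))  # True
print(torch.allclose(B.grad, A.T.mm(grad_C)))  # True
```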
## Given a new operator, how do I write a new backward formula? (using derivatives.yaml)
Coming soon!
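Until this section is written, here is the general shape of an entry in `tools/autograd/derivatives.yaml`: each entry names an operator schema and gives a backward expression per differentiable input, with `grad` denoting the incoming gradient. The snippet below is an illustrative sketch modeled on the `sin` entry; check the file itself for the authoritative syntax.

```yaml
- name: sin(Tensor self) -> Tensor
  self: grad * self.cos()
```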
## When should I write a new backward formula?
Coming soon!
## How do I test an autograd formula?
Coming soon!
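One tool already mentioned in the scope above is `torch.autograd.gradcheck`, which compares the analytical backward formula against numerical (finite-difference) gradients. A minimal sketch:

```python
import torch
from torch.autograd import gradcheck

# gradcheck perturbs the inputs and compares numerical gradients
# against the analytical backward; double precision is recommended
# so the finite-difference estimates are accurate enough.
x = torch.randn(4, dtype=torch.double, requires_grad=True)
print(gradcheck(torch.sin, (x,)))  # True if the gradients agree
```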
## What are custom autograd functions?
Coming soon!
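Pending the full write-up: a custom autograd function subclasses `torch.autograd.Function` and implements static `forward` and `backward` methods, using `ctx` to pass saved tensors between them. The sketch below reimplements `sin` this way (the class name is illustrative):

```python
import torch

class MySin(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)  # stash inputs needed by backward
        return torch.sin(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * torch.cos(x)  # d/dx sin(x) = cos(x)

x = torch.randn(3, requires_grad=True)
MySin.apply(x).sum().backward()
print(torch.allclose(x.grad, torch.cos(x)))  # True
```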
## Lab: Derive the backward formula for a fake operator
Coming soon!