Updated Autograd Basics (markdown)
You can use the table below to see what is the best solution for you.
The next sections in this document give more information on each solution.
All terms are defined below the table.

| Where \ Level | Tensor | Third party library | Raw data pointer access |
| ----------- | ----------- | ----------- | ----------- |
| Aten native functions | Composite | derivatives.yaml | derivatives.yaml |
| In pytorch/pytorch outside of aten | Composite | Custom Function | Custom Function |

Definition of the Levels:
- "Tensor" means that your function always works with PyTorch Tensors and uses differentiable ops (see the Composite sketch after this list).
- "Third party library" means that your function uses a third party library that doesn't work with Tensors directly (numpy, CUDA, etc.); see the Custom Function sketch after this list.
- "Raw data pointer access" means that your function extracts the data_ptr from the Tensor and works with that directly (for example, our C++ implementations of many functions).
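At the "Tensor" level, the table always says Composite: when a function is built only from differentiable torch ops, autograd derives the backward pass by itself and there is nothing extra to write. A minimal sketch (the `log1p_exp` helper is made up for illustration):

```python
import torch

# Every op here is a differentiable torch op on Tensors, so autograd
# builds the backward pass automatically. Nothing else to implement.
def log1p_exp(x):
    return torch.log(1 + torch.exp(x))

x = torch.randn(4, requires_grad=True)
log1p_exp(x).sum().backward()
print(x.grad)  # filled in by autograd, no manual derivative needed
```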
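At the "Third party library" level, the table points to a Custom Function: subclass `torch.autograd.Function`, do the forward computation with the external library, and supply the backward formula yourself. A minimal sketch wrapping numpy (the `NumpySin` name is made up for illustration; `torch.autograd.gradcheck` is the standard way to numerically validate a hand-written backward):

```python
import numpy as np
import torch

class NumpySin(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # numpy can't track gradients, so save what backward() will
        # need before leaving Tensor land.
        ctx.save_for_backward(x)
        return torch.from_numpy(np.sin(x.detach().cpu().numpy()))

    @staticmethod
    def backward(ctx, grad_output):
        # Hand-written derivative: d(sin x)/dx = cos(x).
        (x,) = ctx.saved_tensors
        return grad_output * torch.cos(x)

x = torch.randn(5, dtype=torch.double, requires_grad=True)
print(torch.autograd.gradcheck(NumpySin.apply, (x,)))  # True if backward matches
```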
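For Aten native functions, the derivatives.yaml cells of the table above mean the derivative is declared in `tools/autograd/derivatives.yaml` rather than written as a Function: each entry pairs a function schema with one gradient expression per differentiable input. A rough sketch of the format (the `my_mul` entry is hypothetical, and the exact syntax may differ between PyTorch versions):

```yaml
# Hypothetical derivatives.yaml entry: a schema line plus one
# gradient expression for each differentiable input.
- name: my_mul(Tensor self, Tensor other) -> Tensor
  self: grad * other
  other: grad * self
```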

Definition of the Wheres: