Updated Autograd Basics (markdown)

jhelsby
2024-08-31 14:41:19 +01:00
parent 922059ad9b
commit ccb2a11c88

@@ -18,7 +18,7 @@ You can use the table below to see what the best solution is for you.
The next sections in this document give more information on each solution.
All terms are defined below the table.
-| Where \ Level | Tensor | Third party lib | Raw data pointer access |
+| Where \ Level | Tensor | Third party library | Raw data pointer access |
| ----------- | ----------- | ----------- | ----------- |
| ATen native functions | Composite | derivatives.yaml | derivatives.yaml |
| In pytorch/pytorch outside of aten | Composite | Custom Function | Custom Function |
@@ -26,7 +26,7 @@ All terms are defined below the table.
Definition of the Levels:
- "Tensor" means that your function is always working with PyTorch Tensors and using differentiable ops.
- "Third party lib" means that your function uses a third party lib that doesn't work with Tensors directly (numpy, CUDA, etc).
- "Third party library" means that your function uses a third party library that doesn't work with Tensors directly (numpy, CUDA, etc).
- "Raw data pointer access" means that your function extracts the data_ptr from the Tensor and work with that directly (for example our c++ implementations of many functions).
Definition of the Wheres: