# function transforms (aka torch.func, functorch)
Page Maintainers: @zou3519
## Scope
- understand what composable function transforms are and their most common use cases
- understand what DynamicLayerStack is and how it is used to implement composition of function transforms
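For a concrete flavor of what "composable" means here, the following minimal sketch (assuming only the public `torch.func` API) composes `vmap` over `grad` to compute per-sample gradients:

```python
import torch
from torch.func import grad, vmap

# Minimal sketch: per-sample gradients via transform composition.
def loss(weights, x):
    return (weights * x).sum()

weights = torch.randn(3)
xs = torch.randn(8, 3)  # a batch of 8 samples

# grad(loss) differentiates w.r.t. weights for a single sample;
# vmap maps that computation over the batch dimension of xs.
per_sample_grads = vmap(grad(loss), in_dims=(None, 0))(weights, xs)
print(per_sample_grads.shape)  # torch.Size([8, 3])
```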
## Learn about function transforms
- Read through the whirlwind tour
- Read through the advanced autodiff tutorial
- Read through the per-sample-gradients tutorial
- Read through the model ensembling tutorial
## Exercise
The advanced autodiff tutorial explains how to compute Jacobians via a composition of vmap and vjp.
- Without looking at the source code for jacfwd or torch.autograd.functional.jacobian, write a function that computes the Jacobian using forward-mode AD and a for-loop. Note that forward-mode AD computes Jacobian-vector products, while reverse-mode AD (vjp, grad) computes vector-Jacobian products; see the short jvp/vjp sketch after this exercise.
- Write a function to compute the Jacobian by composing vmap and jvp.
The APIs should have the following signature:
```python
def jacobian(f, *args):
    pass
```

You can assume that `f` accepts multiple Tensor arguments and returns a single Tensor.
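As a starting point (not a solution), here is a minimal sketch of the two primitives the exercise builds on, using the public `torch.func` jvp/vjp API:

```python
import torch
from torch.func import jvp, vjp

def f(x):
    return x.sin()

x = torch.randn(3)
v = torch.randn(3)

# Forward mode: push a tangent v through f to get J @ v.
out, Jv = jvp(f, (x,), (v,))

# Reverse mode: pull a cotangent v back through f to get v^T @ J.
out, vjp_fn = vjp(f, x)
(vJ,) = vjp_fn(v)
```

For this elementwise `f` the Jacobian is diagonal, so both products just scale `v` by `cos(x)`; a full `jacobian` stacks such products over a set of basis vectors.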
## Understand how PyTorch implements composable function transforms
Read through this gdoc.
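The gdoc has the details; as a small hedged illustration of why nesting needs bookkeeping at all, note that transforms stack:

```python
import torch
from torch.func import grad, vmap

def f(x):
    return x.sin()

x = torch.randn(3)

# Three nested transforms. Conceptually, each transform pushes a "layer"
# onto a stack (the DynamicLayerStack) so that every operation is
# interpreted by the right transform in the right order.
second_derivs = vmap(grad(grad(f)))(x)  # elementwise -sin(x)
```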
## Next
Back to the Core Frontend Onboarding
I would love to contribute to PyTorch!