Differential Privacy with ResNet18
Differential Privacy
Differential privacy is a way of training models that limits how much any single training example can influence the learned parameters, so that an attacker cannot recover the training data from the gradient updates of the model. Recently, a paper was published comparing the performance of Opacus to a JAX-based system.
Original differential privacy paper
JAX-based differential privacy paper
Opacus
Opacus is a differential privacy library built for PyTorch. It adds hooks to PyTorch's autograd that compute per-sample gradients, and a differential privacy engine that turns those gradients into differentially private weight updates.
Example
This example runs ResNet18 in two ways: either Opacus computes the differentially private updates directly, or the per-sample gradients are obtained using vmap and grad and the differentially private update is computed from them.
As a caveat, the transforms version may not compute exactly the same values as the Opacus version; this has not yet been verified.
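The vmap/grad route described above can be sketched as follows. This uses the modern `torch.func` API (the earlier functorch package exposed the same `vmap`/`grad` transforms); the tiny linear model, the batch, and the clipping norm `C` and noise multiplier `sigma` are hypothetical stand-ins, not values from the example itself.

```python
import torch
from torch.func import functional_call, grad, vmap

# Toy stand-in for ResNet18 so the sketch stays self-contained.
model = torch.nn.Linear(4, 2)
params = {k: v.detach() for k, v in model.named_parameters()}

def compute_loss(params, x, y):
    # functional_call runs the model with an explicit parameter dict;
    # unsqueeze adds a batch dimension of 1 for the single sample.
    logits = functional_call(model, params, (x.unsqueeze(0),))
    return torch.nn.functional.cross_entropy(logits, y.unsqueeze(0))

x_batch = torch.randn(8, 4)
y_batch = torch.randint(0, 2, (8,))

# grad differentiates w.r.t. params; vmap maps that over the batch,
# yielding one gradient per sample instead of a single summed gradient.
per_sample_grads = vmap(grad(compute_loss), in_dims=(None, 0, 0))(
    params, x_batch, y_batch
)

# DP-SGD style aggregation: clip each per-sample gradient to norm C,
# sum, add Gaussian noise, and average (C and sigma are hypothetical).
C, sigma = 1.0, 0.5
flat = torch.cat(
    [g.reshape(g.shape[0], -1) for g in per_sample_grads.values()], dim=1
)
scale = (C / (flat.norm(dim=1) + 1e-6)).clamp(max=1.0)
update = {
    k: ((g * scale.view(-1, *[1] * (g.dim() - 1))).sum(0)
        + sigma * C * torch.randn_like(g[0])) / x_batch.shape[0]
    for k, g in per_sample_grads.items()
}
```

The key difference from plain autograd is that `vmap(grad(...))` keeps one gradient per sample, which is what clipping in DP-SGD requires.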
Requirements
These examples use Opacus 1.0.1 and torchvision 0.11.2.