Updated PyTorch OSS benchmark infra (markdown)
@@ -76,7 +76,9 @@ Note that the JSON list is optional. Writing JSON record one per line ([JSONEach
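For context on the record-per-line format mentioned in the hunk context above, here is a minimal, hedged sketch of what a result file could look like; the field names and the `benchmark-results/` path are placeholders for illustration, not the actual schema:

```
# Hypothetical layout: each line is one standalone JSON record;
# wrapping the records in a JSON list is optional. Field names are placeholders only.
mkdir -p benchmark-results
cat > benchmark-results/example.json <<'EOF'
{"benchmark": {"name": "My PyTorch benchmark"}, "metric": {"name": "latency_ms", "benchmark_values": [12.3]}}
{"benchmark": {"name": "My PyTorch benchmark"}, "metric": {"name": "compilation_time_s", "benchmark_values": [45.6]}}
EOF
```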
### Upload API
The quickest way to upload the benchmark results is to use the [upload_benchmark_results.py](https://github.com/pytorch/pytorch-integration-testing/blob/main/.github/scripts/upload_benchmark_results.py) script. The script currently requires an `UPLOADER_[USERNAME|PASSWORD]` credential, so please reach out to PyTorch Dev Infra if you need to use it. Once written to the database, the benchmark results should be considered immutable, as updating or deleting them is a complicated and expensive process.

Here is an example usage:
```
export UPLOADER_USERNAME=<REDACT>
@@ -94,6 +96,7 @@ pip install -r requirements.txt
# --repo is where the repo is checked out and built
# --benchmark-name is a unique string used to identify the benchmark
# --benchmark-results is where the JSON benchmark result files are kept
# --dry-run prepares everything except writing the results to S3
python upload_benchmark_results.py \
  --repo pytorch \
  --benchmark-name "My PyTorch benchmark" \
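# Hedged sketch (not from the original example; "benchmark-results/" is a
# placeholder path): judging from the flag descriptions above, the remaining
# flags would presumably be passed as
#   --benchmark-results benchmark-results/ \
#   --dry-run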