pytorch/test/test_import_time.py
Sam Estep 1e9c7ad4cb Add a test to measure import torch time (#56041)
Summary:
This PR adds a couple of very simple tests which (as the code comment says) measure the time it takes to `import torch` and to ask for the CUDA device count.
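For illustration only (this is not the mechanism the tests use, which go through `TestCase.runWithPytorchAPIUsageStderr` and let the test harness record the durations), a cold `import torch` and a CUDA device count query can be timed by launching fresh interpreters:

```
import subprocess
import sys
import time

# Illustrative sketch: run each snippet in a fresh Python process so that
# module caching in the current interpreter does not skew the measurement.
for snippet in ('import torch', 'import torch; torch.cuda.device_count()'):
    start = time.perf_counter()
    subprocess.run([sys.executable, '-c', snippet], check=True)
    print(f'{snippet!r} took {time.perf_counter() - start:.2f}s')
```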

Pull Request resolved: https://github.com/pytorch/pytorch/pull/56041

Test Plan:
```
$ rm -r /tmp/reports ; python3 test/test_import_time.py --save-xml=/tmp/reports

Running tests...
----------------------------------------------------------------------
..
----------------------------------------------------------------------
Ran 2 tests in 1.855s

OK

Generating XML reports...
```
```
$ tools/print_test_stats.py /tmp/reports
No scribe access token provided, skip sending report!
class TestImportTime:
    tests: 2 failed: 0 skipped: 0 errored: 0
    run_time: 1.85 seconds
    avg_time: 0.93 seconds
    median_time: 0.93 seconds
    2 longest tests:
        test_time_cuda_device_count time: 1.10 seconds
        test_time_import_torch time: 0.75 seconds

Total runtime is 0:00:01
2 longest tests of entire run:
    TestImportTime.test_time_cuda_device_count  time: 1.10 seconds
    TestImportTime.test_time_import_torch  time: 0.75 seconds
```

Reviewed By: driazati

Differential Revision: D27770908

Pulled By: samestep

fbshipit-source-id: 01bbf5a339f41d3a1f493e6fa8c946ff7567daec
2021-04-15 00:53:30 -07:00

20 lines
662 B
Python

from torch.testing._internal.common_utils import TestCase, run_tests


# these tests could eventually be changed to fail if the import/init
# time is greater than a certain threshold, but for now we just use them
# as a way to track the duration of `import torch` in our ossci-metrics
# S3 bucket (see tools/print_test_stats.py)
class TestImportTime(TestCase):
    def test_time_import_torch(self):
        TestCase.runWithPytorchAPIUsageStderr('import torch')

    def test_time_cuda_device_count(self):
        TestCase.runWithPytorchAPIUsageStderr(
            'import torch; torch.cuda.device_count()',
        )


if __name__ == '__main__':
    run_tests()
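
As the code comment notes, these tests currently only track duration and never fail on it. A hypothetical threshold-enforcing variant might look like the sketch below; the 10-second budget, the subprocess-based timing, and the standalone `unittest` structure are illustrative assumptions, not part of this change:

```
import subprocess
import sys
import time
import unittest

# Hypothetical budget; the real tests do not enforce any threshold yet.
IMPORT_TIME_LIMIT_SECONDS = 10.0


class TestImportTimeThreshold(unittest.TestCase):
    def test_import_torch_under_budget(self):
        # Time a cold import in a fresh interpreter and fail if it regresses
        # past the (hypothetical) budget above.
        start = time.perf_counter()
        subprocess.run([sys.executable, '-c', 'import torch'], check=True)
        self.assertLess(time.perf_counter() - start, IMPORT_TIME_LIMIT_SECONDS)


if __name__ == '__main__':
    unittest.main()
```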