# C++ Frontend Tests

In this folder live the tests for PyTorch's C++ Frontend. They use the
[GoogleTest](https://github.com/google/googletest) test framework.

## CUDA Tests

To make a test runnable only on platforms with CUDA, you should suffix your
test's name with `_CUDA`, e.g.

```cpp
TEST(MyTestSuite, MyTestCase_CUDA) { }
```
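As a slightly fuller, hedged sketch (the suite name, test name, and tensor
operations are illustrative assumptions, not taken from the actual tests), a
CUDA-only test might look like:

```cpp
#include <gtest/gtest.h>
#include <torch/torch.h>

// Illustrative sketch: the `_CUDA` suffix means this test is filtered out
// on machines without a CUDA device.
TEST(TensorTest, AllocatesOnGpu_CUDA) {
  torch::Tensor tensor = torch::ones({2, 2}, torch::kCUDA);
  ASSERT_TRUE(tensor.is_cuda());
}
```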
To make it runnable only on platforms with at least two CUDA devices, suffix
it with `_MultiCUDA` instead of `_CUDA`, e.g.

```cpp
TEST(MyTestSuite, MyTestCase_MultiCUDA) { }
```
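Purely as an illustration (the suite name, test name, and use of a second
device index are assumptions, not code from this suite), such a test might
touch more than one device:

```cpp
#include <gtest/gtest.h>
#include <torch/torch.h>

// Illustrative sketch: the `_MultiCUDA` suffix means this test only runs
// when at least two CUDA devices are present.
TEST(TensorTest, CopiesAcrossGpus_MultiCUDA) {
  torch::Tensor first = torch::ones({2, 2}, torch::Device(torch::kCUDA, 0));
  torch::Tensor second = first.to(torch::Device(torch::kCUDA, 1));
  ASSERT_EQ(second.device().index(), 1);
}
```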
There is logic in `main.cpp` that detects the availability and number of CUDA
devices and supplies the appropriate negative filters to GoogleTest.
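For illustration only, a hedged sketch of that kind of device-count-based
filtering is shown below; it is an assumption about the general approach, not
the actual contents of `main.cpp`.

```cpp
#include <gtest/gtest.h>
#include <torch/torch.h>

#include <string>

// Hypothetical sketch of CUDA-aware test filtering, not the real main.cpp.
int main(int argc, char** argv) {
  ::testing::InitGoogleTest(&argc, argv);
  std::string filter;
  if (!torch::cuda::is_available()) {
    // No CUDA devices: skip both `_CUDA` and `_MultiCUDA` tests.
    filter = "-*_CUDA:*_MultiCUDA";
  } else if (torch::cuda::device_count() < 2) {
    // Only one device: skip tests that need two or more GPUs.
    filter = "-*_MultiCUDA";
  }
  if (!filter.empty()) {
    ::testing::GTEST_FLAG(filter) = filter;
  }
  return RUN_ALL_TESTS();
}
```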
## Integration Tests
Integration tests use the MNIST dataset. You must download it by running the
following command from the PyTorch root folder:

```sh
$ python tools/download_mnist.py -d test/cpp/api/mnist
```
The required paths will be referenced as `test/cpp/api/mnist/...` in the test
code, so you *must* run the integration tests from the PyTorch root folder.
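Assuming a typical CMake build in which the C++ API tests are compiled into a
`test_api` binary under `build/bin` (the name and location depend on your
build configuration), running them from the root folder might look like:

```sh
# Run the full C++ API test binary from the PyTorch root folder.
$ ./build/bin/test_api

# Or select a subset with GoogleTest's standard --gtest_filter flag
# (the suite name here is illustrative).
$ ./build/bin/test_api --gtest_filter='IntegrationTest*'
```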