Revert "Fix setUpClass() / tearDownClass() for device-specific tests (#151129)"

This reverts commit bd4cf30e31a2a0b0a57f54c7eedd3a39d5778cbe.

Reverted https://github.com/pytorch/pytorch/pull/151129 on behalf of https://github.com/jbschlosser due to flex attention tests failing ([comment](https://github.com/pytorch/pytorch/pull/151129#issuecomment-2807632119))
Author:  PyTorch MergeBot
Date:    2025-04-15 22:07:25 +00:00
Parent:  e1d8b3f838
Commit:  98b1e82ba8

5 changed files with 70 additions and 73 deletions


@@ -125,12 +125,12 @@ class TestLinalg(TestCase):
                 del os.environ["HIPBLASLT_ALLOW_TF32"]
 
     def setUp(self):
-        super().setUp()
+        super(self.__class__, self).setUp()
         torch.backends.cuda.matmul.allow_tf32 = False
 
     def tearDown(self):
         torch.backends.cuda.matmul.allow_tf32 = True
-        super().tearDown()
+        super(self.__class__, self).tearDown()
 
     @contextlib.contextmanager
     def _tunableop_ctx(self):
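
A note on the change itself: the two super() forms swapped by this diff are not equivalent. The sketch below is not taken from the PR and the class names are hypothetical; it only illustrates the general Python semantics. Zero-argument super() binds to the class where the method is defined, while super(self.__class__, self) resolves against the runtime class and recurses when the method is inherited by a subclass.

import unittest


class ZeroArgTests(unittest.TestCase):
    def setUp(self):
        # Zero-argument super() resolves against the class where this method
        # is defined (ZeroArgTests), via the compiler-provided __class__ cell.
        super().setUp()

    def test_nothing(self):
        pass


class RuntimeClassTests(unittest.TestCase):
    def setUp(self):
        # super(self.__class__, self) resolves against the runtime class. If a
        # subclass inherits this method, the lookup lands back on this very
        # method and recurses forever.
        super(self.__class__, self).setUp()

    def test_nothing(self):
        pass


class RuntimeClassSubclass(RuntimeClassTests):
    pass  # inherits setUp(); calling it triggers RecursionError


if __name__ == "__main__":
    ZeroArgTests("test_nothing").setUp()        # fine: binds to ZeroArgTests
    RuntimeClassTests("test_nothing").setUp()   # fine: runtime class is the defining class
    try:
        RuntimeClassSubclass("test_nothing").setUp()
    except RecursionError:
        print("super(self.__class__, self) recursed in the subclass")

The reverted PR and this revert trade off between those two behaviors for PyTorch's dynamically generated device-specific test classes; per the message above, the revert was triggered by flex attention test failures rather than by the super() semantics alone.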