Mirror of https://github.com/pytorch/pytorch.git (synced 2025-10-21 05:34:18 +08:00)
Skip test responsible for causing flakiness (#145109)
Investigation is a separate issue. For now I want to get the CI back up and running on the other tests. The problem seems to be that IncludeDispatchKeyGuard doesn't actually reset the state, which seems very, very wrong.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/145109
Approved by: https://github.com/williamwen42
@@ -202,6 +202,9 @@ class TestPythonRegistration(TestCase):
         self.assertEqual(c, a + b)
         self.assertTrue(is_called)

+    @unittest.skip(
+        "Causing flakiness, see https://github.com/pytorch/pytorch/issues/145108"
+    )
     def test_fallthrough_for_dense_key_with_meta_in_tls(self) -> None:
         # This tests that if meta is included in TLS dispatch key set,
         # then a meta kernel should be called regardless if a dense
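The IncludeDispatchKeyGuard mentioned in the commit message is an RAII-style construct: on entry it adds a dispatch key to the thread-local (TLS) include set, and on exit it is supposed to restore the previous state. As a minimal illustration of that intended contract (plain Python, not PyTorch's actual C++ implementation; the names `tls_include_set` and `IncludeKeyGuard` are hypothetical stand-ins), a correctly behaving guard looks like:

```python
import threading

# Hypothetical stand-in for the dispatcher's thread-local include set.
_tls = threading.local()

def tls_include_set() -> set:
    if not hasattr(_tls, "include"):
        _tls.include = set()
    return _tls.include

class IncludeKeyGuard:
    """Adds `key` to the TLS include set; restores the prior state on exit.

    The bug described in the commit message would correspond to __exit__
    failing to restore the saved state, leaking `key` into later tests.
    """
    def __init__(self, key: str) -> None:
        self.key = key

    def __enter__(self) -> "IncludeKeyGuard":
        self.saved = set(tls_include_set())  # snapshot the previous state
        tls_include_set().add(self.key)
        return self

    def __exit__(self, *exc) -> None:
        # Restoring the snapshot here is what keeps the guard from
        # leaking dispatch keys across tests.
        _tls.include = self.saved

with IncludeKeyGuard("Meta"):
    assert "Meta" in tls_include_set()   # key visible inside the guard
assert "Meta" not in tls_include_set()   # state restored after exit
```

A guard that skipped the restore in `__exit__` would leave "Meta" in the include set for every subsequent test on that thread, which is exactly the kind of cross-test contamination that produces flaky CI runs.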