[FP8][CUDA] Fix stale expected error message (#136581)

CC @nWEIdia as I think we have seen internal failures on this

Pull Request resolved: https://github.com/pytorch/pytorch/pull/136581
Approved by: https://github.com/mikaylagawarecki
This commit is contained in:
eqy
2024-09-26 20:57:35 +00:00
committed by PyTorch MergeBot
parent 5789f8d5dc
commit c0e98a485b


@@ -655,7 +655,7 @@ class TestFP8MatmulCuda(TestCase):
         with self.assertRaisesRegex(
             RuntimeError,
-            re.escape("For RowWise scaling the second input is required to be a float8_e4m3fn dtype."),
+            re.escape("Expected b.dtype() == at::kFloat8_e4m3fn to be true, but got false."),
         ):
             torch._scaled_mm(
                 x_fp8,
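The `re.escape` call in the diff matters because `assertRaisesRegex` interprets the expected message as a regular expression, and the new message contains metacharacters (`.`, `(`, `)`). A minimal standalone sketch (torch not required; the helper name is illustrative, not from the PR):

```python
import re

def message_matches(expected: str, actual: str) -> bool:
    """Return True if the literal expected message occurs in actual.

    re.escape neutralizes regex metacharacters such as '.' and '()',
    which is why the test in the diff wraps the message with it.
    """
    return re.search(re.escape(expected), actual) is not None

msg = "Expected b.dtype() == at::kFloat8_e4m3fn to be true, but got false."

# Escaped, the pattern matches the literal message.
assert message_matches(msg, msg)

# Unescaped, '()' parses as an empty group and '.' matches any character,
# so the raw message used directly as a pattern fails to match itself.
assert re.search(msg, msg) is None
```

Without `re.escape`, the test would silently depend on regex semantics of the message text rather than its literal content.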