# Motivation
Fixes https://github.com/pytorch/pytorch/issues/152301. When XPU is not available, calling `torch.xpu.is_bf16_supported()` still returns `True`, which is inconsistent with the expected behavior (it should return `False`).

# Solution
Align with other backends by adding an `including_emulation` parameter to `torch.xpu.is_bf16_supported`:
- return `False` if XPU is not available;
- return `True` if `including_emulation` is `True`;
- return `torch.xpu.get_device_properties().has_bfloat16_conversions` if `including_emulation` is `False`, i.e. whether the device can generate SPIR-V code for bf16.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/152317
Approved by: https://github.com/EikanWang
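The decision logic above can be sketched as a small pure function. This is an illustrative sketch only, not the upstream implementation: the real `torch.xpu.is_bf16_supported` queries `torch.xpu.is_available()` and the device properties directly, whereas here those results are passed in as plain booleans so the logic stands alone.

```python
def is_bf16_supported_sketch(
    xpu_available: bool,
    has_bfloat16_conversions: bool,
    including_emulation: bool = True,
) -> bool:
    # Sketch of the bf16-support decision described in the Solution section.
    # xpu_available stands in for torch.xpu.is_available();
    # has_bfloat16_conversions stands in for
    # torch.xpu.get_device_properties().has_bfloat16_conversions.
    if not xpu_available:
        # No XPU device: bf16 cannot be supported, regardless of emulation.
        return False
    if including_emulation:
        # Emulated bf16 counts as supported on any available XPU device.
        return True
    # Otherwise require native support, i.e. the device can generate
    # SPIR-V code for bf16.
    return has_bfloat16_conversions
```

For example, on a machine with no XPU the function returns `False` even with `including_emulation=True`, which is exactly the behavior the issue asked for.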