mirror of
https://github.com/pytorch/pytorch.git
synced 2025-10-20 21:14:14 +08:00
Revert "Make Context to be Device-agnostic Step by Step (1/N) (#136519)"
This reverts commit 4a8e49389c33934234dc89616fd17a58e760e2e7. Reverted https://github.com/pytorch/pytorch/pull/136519 on behalf of https://github.com/clee2000 due to breaking internal tests related to MITA, @ezyang has a forward fix? ([comment](https://github.com/pytorch/pytorch/pull/136519#issuecomment-2414588302))
@@ -515,7 +515,9 @@ return {sig.name()}({', '.join(e.expr for e in translate(cpp_sig.arguments(), si
             # CUDA requires special handling
             if is_cuda_dispatch_key(self.backend_index.dispatch_key):
-                device_guard = f"globalContext().lazyInitDevice(c10::DeviceType::CUDA);\n{device_guard}"
+                device_guard = (
+                    f"globalContext().lazyInitCUDA();\n{device_guard}"
+                )
             else:
                 # kernel is operating on existing tensors