Change BN to eval before QAT Convert phase (#130598)
**Summary**
In the QAT convert phase, we fold BN into Conv and rely on DCE to remove the now-unused BN node. We should change `torch.ops.aten._native_batch_norm_legit.default` to `torch.ops.aten._native_batch_norm_legit_no_training.default` first, so that the BN node has no side effects (no running-stat updates) and can be safely eliminated by DCE.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/130598
Approved by: https://github.com/jgong5, https://github.com/yushangdi
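Below is a minimal sketch of the PT2E QAT flow this change targets, mirroring the reordering in the diff: `torch.ao.quantization.move_exported_model_to_eval` is called on the prepared model before `convert_pt2e`. The `ConvBN` module, the `XNNPACKQuantizer` setup, and the use of `torch.export.export_for_training` are illustrative assumptions, not part of this PR.

```python
import torch
from torch.ao.quantization.quantize_pt2e import convert_pt2e, prepare_qat_pt2e
from torch.ao.quantization.quantizer.xnnpack_quantizer import (
    XNNPACKQuantizer,
    get_symmetric_quantization_config,
)


class ConvBN(torch.nn.Module):
    # Hypothetical Conv + BN module, used only to illustrate the QAT flow.
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 8, 3)
        self.bn = torch.nn.BatchNorm2d(8)

    def forward(self, x):
        return self.bn(self.conv(x))


example_inputs = (torch.randn(1, 3, 16, 16),)
exported = torch.export.export_for_training(ConvBN(), example_inputs).module()

quantizer = XNNPACKQuantizer().set_global(
    get_symmetric_quantization_config(is_qat=True)
)
prepared = prepare_qat_pt2e(exported, quantizer)
prepared(*example_inputs)  # stand-in for QAT fine-tuning / calibration

# Move BN to eval *before* convert_pt2e: this rewrites the BN node to the
# _no_training variant, so BN-into-Conv folding plus DCE of the BN node is safe.
torch.ao.quantization.move_exported_model_to_eval(prepared)
converted = convert_pt2e(prepared)
```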
Committed by: PyTorch MergeBot
Parent: 18418a7dbb
Commit: 2a1f22e57f
@@ -2958,6 +2958,6 @@ def _generate_qdq_quantized_model(
         else prepare_pt2e(export_model, quantizer)
     )
     prepare_model(*inputs)
+    torch.ao.quantization.move_exported_model_to_eval(prepare_model)
     convert_model = convert_pt2e(prepare_model)
-    torch.ao.quantization.move_exported_model_to_eval(convert_model)
     return convert_model