Summary: The upgrader should only be initialized once, when the runtime loads the first module; it does not need to be initialized again afterwards. Previously, instead of using an atomic variable, whether the upgrader was initialized depended on whether byteCodeFunctionWithOperator.function.get_code().operators_ was empty: if it was empty, the operators from the upgrader had not been initialized yet. However, this check is not thread safe. When multiple threads load modules concurrently, each of them may conclude that it is loading the first module. Use an atomic variable here to make the initialization thread safe.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/70161

ghstack-source-id: 146012642

Test Plan:
```
buck test mode/opt //papaya/integration/service/test/analytics/histogram:generic_histogram_system_test -- --exact 'papaya/integration/service/test/analytics/histogram:generic_histogram_system_test - SumHistogramSystemTest.test' --run-disabled

buck test mode/opt //caffe2/test/cpp/jit:jit
```

Reviewed By: iseeyuan

Differential Revision: D33220320

fbshipit-source-id: 10f2397c3b358d5a1d39a2ce25457e3fdb640d2c