Unpin transformers version for most workflows (#7139)

Unpin the transformers version for all workflows except
`nv-torch-latest-v100`, which still has a tolerance issue with some
quantization tests.

Signed-off-by: Logan Adams <loadams@microsoft.com>
Author: Logan Adams
Date: 2025-03-14 13:52:44 -07:00
Committed by: GitHub
Parent: 39027c3008
Commit: d095b18185
4 changed files with 4 additions and 4 deletions
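
For reference, the same edit repeats in each affected workflow: the SHA pin is commented out so transformers is installed from the current master branch. A minimal sketch of the resulting step follows, with the step name and YAML indentation assumed for illustration:

- name: Install transformers   # step name is illustrative
  run: |
    git clone https://github.com/huggingface/transformers
    cd transformers
    # if needed switch to the last known good SHA until transformers@master is fixed
    # git checkout 981c276
    git rev-parse --short HEAD
    pip install .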

@@ -42,7 +42,7 @@ jobs:
 git clone https://github.com/huggingface/transformers
 cd transformers
 # if needed switch to the last known good SHA until transformers@master is fixed
-git checkout 981c276
+# git checkout 981c276
 git rev-parse --short HEAD
 pip install .

@@ -112,7 +112,7 @@ jobs:
 git clone https://github.com/huggingface/transformers
 cd transformers
 # if needed switch to the last known good SHA until transformers@master is fixed
-git checkout 981c276
+# git checkout 981c276
 git rev-parse --short HEAD
 pip install .

@@ -43,7 +43,7 @@ jobs:
 git clone https://github.com/huggingface/transformers
 cd transformers
 # if you need to use an older transformers version temporarily in case of breakage
-git checkout 981c276
+# git checkout 981c276
 git rev-parse --short HEAD
 python -m pip install .
 - name: Install deepspeed

@@ -37,7 +37,7 @@ jobs:
 git clone https://github.com/huggingface/transformers
 cd transformers
 # if needed switch to the last known good SHA until transformers@master is fixed
-git checkout 981c276
+# git checkout 981c276
 git rev-parse --short HEAD
 pip install .