*This model was released on 2022-09-16 and added to Hugging Face Transformers on 2023-04-12.*

# CPMAnt

PyTorch

## Overview

CPM-Ant is an open-source Chinese pre-trained language model (PLM) with 10B parameters. It is also the first milestone of the live training process of CPM-Live. The training process is cost-effective and environment-friendly. CPM-Ant also achieves promising results with delta tuning on the CUGE benchmark. Besides the full model, we also provide various compressed versions to meet the requirements of different hardware configurations. See more

This model was contributed by OpenBMB. The original code can be found here.
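As a quick orientation before the class references below, here is a minimal usage sketch showing how the tokenizer and causal LM classes fit together. The `openbmb/cpm-ant-10b` checkpoint name and the Chinese prompt are illustrative assumptions, not values taken from this page; substitute the checkpoint you actually intend to use.

```python
# Illustrative sketch: load CPM-Ant and generate a short continuation.
# The checkpoint name below is an assumption for demonstration purposes.
import torch
from transformers import CpmAntForCausalLM, CpmAntTokenizer

tokenizer = CpmAntTokenizer.from_pretrained("openbmb/cpm-ant-10b")
model = CpmAntForCausalLM.from_pretrained("openbmb/cpm-ant-10b")

# Tokenize a Chinese prompt and generate up to 30 new tokens.
inputs = tokenizer("今天天气真好，", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=30)

print(tokenizer.decode(output_ids[0].tolist()))
```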

## Resources

## CpmAntConfig

[[autodoc]] CpmAntConfig
    - all

## CpmAntTokenizer

[[autodoc]] CpmAntTokenizer
    - all

## CpmAntModel

[[autodoc]] CpmAntModel
    - all

## CpmAntForCausalLM

[[autodoc]] CpmAntForCausalLM
    - all