This model was released on 2022-11-09 and added to Hugging Face Transformers on 2022-06-09.

# BLOOM

PyTorch

## Overview

The BLOOM model and its various smaller versions were proposed through the BigScience Workshop. BigScience is inspired by other open-science initiatives in which researchers pool their time and resources to collectively achieve a higher impact. The architecture of BLOOM is essentially the same as GPT3 (an auto-regressive model for next-token prediction), but it has been trained on 46 different languages and 13 programming languages. Several smaller versions of the model have been trained on the same dataset. BLOOM is available in the following versions:

- [bloom-560m](https://huggingface.co/bigscience/bloom-560m)
- [bloom-1b1](https://huggingface.co/bigscience/bloom-1b1)
- [bloom-1b7](https://huggingface.co/bigscience/bloom-1b7)
- [bloom-3b](https://huggingface.co/bigscience/bloom-3b)
- [bloom-7b1](https://huggingface.co/bigscience/bloom-7b1)
- [bloom](https://huggingface.co/bigscience/bloom) (176B parameters)
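As a quick orientation, the snippet below is a minimal text-generation sketch using the smallest checkpoint, `bigscience/bloom-560m`; any of the checkpoints listed above can be substituted, subject to available memory.

```python
# Minimal generation sketch with the smallest BLOOM checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-560m")

inputs = tokenizer("BLOOM is a multilingual language model that", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```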

## Resources

A list of official Hugging Face and community (indicated by 🌎) resources to help you get started with BLOOM. If you're interested in submitting a resource to be included here, please feel free to open a Pull Request and we'll review it! The resource should ideally demonstrate something new instead of duplicating an existing resource.

See also:

- [Causal language modeling task guide](../tasks/language_modeling)
- [Text classification task guide](../tasks/sequence_classification)
- [Token classification task guide](../tasks/token_classification)
- [Question answering task guide](../tasks/question_answering)

⚡️ Inference

- A blog on [Optimization story: Bloom inference](https://huggingface.co/blog/bloom-inference-optimization).
- A blog on [Incredibly Fast BLOOM Inference with DeepSpeed and Accelerate](https://huggingface.co/blog/bloom-inference-pytorch-scripts).

⚙️ Training

- A blog on [The Technology Behind BLOOM Training](https://huggingface.co/blog/bloom-megatron-deepspeed).

## BloomConfig

[[autodoc]] BloomConfig
    - all
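As a quick illustration of the configuration API, the sketch below builds a deliberately tiny, randomly initialized BLOOM model from a `BloomConfig`; the sizes are illustrative and do not correspond to any released checkpoint.

```python
# Sketch: instantiate a small, randomly initialized BLOOM model from a configuration.
from transformers import BloomConfig, BloomModel

config = BloomConfig(hidden_size=64, n_layer=2, n_head=8)  # tiny sizes, for demonstration only
model = BloomModel(config)
print(model.config)
```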

## BloomTokenizerFast

[[autodoc]] BloomTokenizerFast
    - all
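For reference, a short encode/decode sketch, assuming the `bigscience/bloom-560m` checkpoint:

```python
# Sketch: tokenize text with the fast BLOOM tokenizer and decode it back.
from transformers import BloomTokenizerFast

tokenizer = BloomTokenizerFast.from_pretrained("bigscience/bloom-560m")
encoding = tokenizer("Hello, BLOOM!", return_tensors="pt")
print(encoding["input_ids"])
print(tokenizer.decode(encoding["input_ids"][0]))
```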

## BloomModel

[[autodoc]] BloomModel
    - forward
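A minimal forward-pass sketch, assuming the `bigscience/bloom-560m` checkpoint; the bare model returns hidden states rather than logits.

```python
# Sketch: run the bare Bloom transformer and inspect the final hidden states.
import torch
from transformers import AutoTokenizer, BloomModel

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
model = BloomModel.from_pretrained("bigscience/bloom-560m")

inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```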

## BloomForCausalLM

[[autodoc]] BloomForCausalLM
    - forward
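A short sketch of the language-modeling forward pass, assuming `bigscience/bloom-560m`; passing the input ids as labels makes the model return the causal LM loss alongside the logits.

```python
# Sketch: compute the causal language-modeling loss by reusing the inputs as labels.
import torch
from transformers import AutoTokenizer, BloomForCausalLM

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
model = BloomForCausalLM.from_pretrained("bigscience/bloom-560m")

inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, labels=inputs["input_ids"])
print(outputs.loss, outputs.logits.shape)
```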

## BloomForSequenceClassification

[[autodoc]] BloomForSequenceClassification
    - forward
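A hedged sketch of sequence classification on top of `bigscience/bloom-560m`; the classification head is newly initialized here, so the output is only structural, not a meaningful prediction.

```python
# Sketch: sequence classification with a newly initialized (untrained) head.
import torch
from transformers import AutoTokenizer, BloomForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
model = BloomForSequenceClassification.from_pretrained("bigscience/bloom-560m", num_labels=2)

inputs = tokenizer("This movie was great!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)  # (batch_size, num_labels)
```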

## BloomForTokenClassification

[[autodoc]] BloomForTokenClassification
    - forward
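Likewise, a sketch of token-level classification (e.g. tagging), again with a newly initialized head on `bigscience/bloom-560m`.

```python
# Sketch: token classification; one set of logits per input token.
import torch
from transformers import AutoTokenizer, BloomForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
model = BloomForTokenClassification.from_pretrained("bigscience/bloom-560m", num_labels=5)

inputs = tokenizer("Hugging Face is based in New York City", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)  # (batch_size, sequence_length, num_labels)
```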

## BloomForQuestionAnswering

[[autodoc]] BloomForQuestionAnswering
    - forward
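Finally, a sketch of extractive question answering via start/end span logits, assuming `bigscience/bloom-560m`; the QA head is newly initialized, so the extracted span is arbitrary until the head is fine-tuned.

```python
# Sketch: extractive QA; pick the span between the highest start and end logits.
import torch
from transformers import AutoTokenizer, BloomForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
model = BloomForQuestionAnswering.from_pretrained("bigscience/bloom-560m")

question = "Who developed BLOOM?"
context = "BLOOM was developed by the BigScience Workshop."
inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

start = outputs.start_logits.argmax()
end = outputs.end_logits.argmax()
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```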