
This model was released on 2020-06-12, added to Hugging Face Transformers on 2023-06-20, and contributed by yjernite.

Warning

This model is in maintenance mode only, so we won't accept any new PRs changing its code.

If you run into any issues running this model, please reinstall the last version that supported it, v4.30.0, by running: pip install -U transformers==4.30.0.

RetriBERT

RetriBERT is a compact model for dense semantic indexing. It uses a single BERT encoder, or a pair of BERT encoders, followed by a lower-dimensional projection layer, encoding queries and passages into dense vectors so that relevant passages can be retrieved efficiently by similarity search. It was developed for open-domain long-form question answering (LFQA), particularly when training data is limited. Trained on the ELI5 dataset, RetriBERT demonstrates that dense retrieval systems can be built without extensive supervision or task-specific pretraining, making such models more accessible.
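The dense-retrieval idea described above can be sketched in a few lines. The snippet below is a minimal illustration using plain NumPy, not the actual RetriBertModel API: random vectors stand in for BERT encoder outputs, a random matrix stands in for the learned projection layer, and the dimensions (768 hidden, 128 projected) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes: BERT-base hidden size 768, projected retrieval size 128.
hidden_size, proj_size = 768, 128

# Stand-in for the reduced-dimensional projection layer; in RetriBERT
# this matrix is learned jointly with the encoder(s).
projection = rng.normal(scale=hidden_size ** -0.5, size=(hidden_size, proj_size))

def embed(encoder_output):
    """Project encoder outputs into the small retrieval space and normalize."""
    vecs = encoder_output @ projection
    return vecs / np.linalg.norm(vecs, axis=-1, keepdims=True)

# Toy corpus: 4 fake "passage" encoder outputs and one query that is a
# slightly perturbed copy of passage 2, so it should rank highest.
passages = rng.normal(size=(4, hidden_size))
query = passages[2] + 0.01 * rng.normal(size=hidden_size)

p_emb = embed(passages)          # shape (4, 128)
q_emb = embed(query[None, :])    # shape (1, 128)

# Dense retrieval: rank passages by inner product with the query vector.
scores = (q_emb @ p_emb.T).ravel()
best = int(scores.argmax())
print(best)  # → 2
```

Because the projection is linear and both sides are normalized, nearby encoder outputs stay nearby in the 128-dimensional space, which is what makes the compact index effective.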

RetriBertConfig

autodoc RetriBertConfig

RetriBertTokenizer

autodoc RetriBertTokenizer

RetriBertTokenizerFast

autodoc RetriBertTokenizerFast

RetriBertModel

autodoc RetriBertModel
- forward