M2-BERT 80M 32K Retrieval
An 80M-parameter checkpoint of M2-BERT, pretrained with a sequence length of 32,768 and fine-tuned for long-context retrieval.
About the model
M2-BERT 80M 32K Retrieval generates embeddings for long-context retrieval tasks, leveraging its 80M parameters and 32K sequence length. It is suitable for applications requiring efficient information retrieval. Developed by Jon Saad-Falcon, Dan Fu, and Simran Arora.
To use this embeddings model, please follow the instructions in our Docs.
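As a minimal sketch of what such a call might look like, the snippet below builds a request for an OpenAI-compatible `/v1/embeddings` endpoint. The endpoint URL and the model identifier `togethercomputer/m2-bert-80M-32k-retrieval` are assumptions here; consult the Docs for the exact names and authentication details.

```python
import json

# Assumed endpoint and model identifier -- verify against the Docs.
API_URL = "https://api.together.xyz/v1/embeddings"
MODEL = "togethercomputer/m2-bert-80M-32k-retrieval"

def build_embedding_request(texts):
    """Build the JSON payload for an embeddings request over a list of texts."""
    return {"model": MODEL, "input": texts}

# Sending the request requires an API key, e.g. with the standard library:
#
# import urllib.request
# req = urllib.request.Request(
#     API_URL,
#     data=json.dumps(build_embedding_request(["long document text"])).encode(),
#     headers={
#         "Authorization": "Bearer <TOGETHER_API_KEY>",
#         "Content-Type": "application/json",
#     },
# )
# resp = json.load(urllib.request.urlopen(req))
# embedding = resp["data"][0]["embedding"]
```

Each returned embedding is a fixed-length vector suitable for similarity search over long documents.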
- Model provider: Together AI
- Type: Embeddings
- Main use cases: Embeddings
- Deployment: Monthly Reserved
- Parameters: 80M
- Context length: 32768
- Input price: $0.01 / 1M tokens
- Input modalities: Text
- Output modalities: Structured Data
- Released: November 3, 2023
- Last updated: February 5, 2026
- External link
- Category: Embeddings