M2-BERT 80M 32K Retrieval
An 80M-parameter checkpoint of M2-BERT, pretrained with a sequence length of 32768 and fine-tuned for long-context retrieval.
This model is not available on Together’s Serverless API.
Pick a supported alternative from the Model Library.
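Because this model is fine-tuned for retrieval, its embeddings are typically compared with cosine similarity to rank documents against a query. A minimal sketch of that ranking step, using placeholder vectors rather than real model output (an actual M2-BERT 80M call would return higher-dimensional embeddings):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity: dot product of the two vectors, normalized by their lengths
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_documents(query_emb: np.ndarray, doc_embs: list) -> list:
    # Return document indices sorted from most to least similar to the query
    scores = [cosine_similarity(query_emb, d) for d in doc_embs]
    return sorted(range(len(doc_embs)), key=lambda i: scores[i], reverse=True)

# Placeholder embeddings for illustration only (not real model output)
query = np.array([1.0, 0.0, 0.5])
docs = [np.array([0.9, 0.1, 0.4]), np.array([0.0, 1.0, 0.0])]
print(rank_documents(query, docs))  # doc 0 is closer to the query, so it ranks first
```

The same ranking logic applies regardless of which embeddings model produces the vectors.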
Related models
- Model provider: Together AI
- Type: Embeddings
- Main use cases: Embeddings
- Parameters: 80M
- Context length: 32768
- Category: Embeddings