Models / Together AI
Embeddings

M2-BERT 80M 32K Retrieval

An 80M-parameter checkpoint of M2-BERT, pretrained with a sequence length of 32768 and fine-tuned for long-context retrieval.

This model is not available on Together’s Serverless API.

Pick a supported alternative from the Model Library.

Model details
  • Model provider
    Together AI
  • Type
    Embeddings
  • Main use cases
    Embeddings
  • Parameters
    80M
  • Context length
    32768
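As a minimal sketch of the retrieval use case this model targets: once document and query embeddings have been obtained (e.g. from an embeddings API), documents are typically ranked by cosine similarity to the query vector. The vectors below are toy stand-ins, not real model output.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for embeddings returned by the model.
query = [0.1, 0.9, 0.0]
docs = {
    "doc_a": [0.1, 0.8, 0.1],  # semantically close to the query
    "doc_b": [0.9, 0.1, 0.0],  # semantically distant
}

# Rank documents by similarity to the query, highest first.
ranked = sorted(docs, key=lambda d: cosine_similarity(query, docs[d]), reverse=True)
print(ranked)
```

In practice the embedding dimension is much larger and the vectors come from the model itself; the ranking step is unchanged.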