
M2-BERT 80M 32K Retrieval

An 80M-parameter checkpoint of M2-BERT, pretrained with a sequence length of 32768 and fine-tuned for long-context retrieval.

About model

M2-BERT 80M 32K Retrieval generates embeddings for long-context retrieval tasks, leveraging its 80M parameters and 32K sequence length. It is suitable for applications requiring efficient information retrieval. Developed by Jon Saad-Falcon, Dan Fu, and Simran Arora.

To use this embeddings model, follow the instructions in our Docs.
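As a rough sketch of a retrieval workflow, the snippet below builds a request to Together's OpenAI-compatible `/v1/embeddings` endpoint and ranks a document against a query by cosine similarity. The model identifier `togethercomputer/m2-bert-80M-32k-retrieval` and the endpoint details are assumptions here; confirm both against the Docs.

```python
import json
import math
import os
import urllib.request

API_URL = "https://api.together.xyz/v1/embeddings"  # assumed endpoint; see Docs
MODEL = "togethercomputer/m2-bert-80M-32k-retrieval"  # assumed model id; see Docs


def build_request(texts, model=MODEL):
    """Build the JSON payload for the embeddings endpoint."""
    return {"model": model, "input": texts}


def embed(texts, api_key):
    """POST the payload and return one embedding vector per input text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_request(texts)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return [item["embedding"] for item in body["data"]]


def cosine(a, b):
    """Cosine similarity, the usual ranking score for embedding retrieval."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Only hits the network when an API key is configured.
if __name__ == "__main__" and os.environ.get("TOGETHER_API_KEY"):
    query_vec, doc_vec = embed(
        ["What is M2-BERT?", "M2-BERT is a long-context retrieval model."],
        os.environ["TOGETHER_API_KEY"],
    )
    print(f"similarity: {cosine(query_vec, doc_vec):.3f}")
```

At $0.01 per 1M input tokens, embedding a 32768-token document with this model costs well under a tenth of a cent, so batching entire long documents per request is the intended usage pattern.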

    • Model provider
      Together AI
    • Type
      Embeddings
    • Main use cases
      Embeddings
    • Deployment
      Monthly Reserved
    • Parameters
      80M
    • Context length
      32768
    • Input price

      $0.01 / 1M tokens

    • Input modalities
      Text
    • Output modalities
      Structured Data
    • Released
      November 3, 2023
    • Last updated
      February 5, 2026
    • Category
      Embeddings