
Mistral Small 3 API

A 24B-parameter model that rivals GPT-4o mini and larger models such as Llama 3.3 70B. Ideal for chat use cases like customer support, translation, and summarization.

Try our Mistral Small 3 API

Mistral Small 3 API Usage

Endpoint
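
The serverless endpoint is OpenAI-compatible, so you can also call it over plain HTTP instead of using the SDKs below. A minimal sketch in Python, assuming the https://api.together.xyz/v1/chat/completions route and an API key in the TOGETHER_API_KEY environment variable:

import os
import requests

# Call the OpenAI-compatible chat completions route directly.
response = requests.post(
    "https://api.together.xyz/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}"},
    json={
        "model": "mistralai/Mistral-Small-24B-Instruct-2501",
        "messages": [
            {"role": "user", "content": "What are some fun things to do in New York?"}
        ],
    },
)
print(response.json()["choices"][0]["message"]["content"])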

RUN INFERENCE

from together import Together

# The client reads your API key from the TOGETHER_API_KEY environment variable.
client = Together()

response = client.chat.completions.create(
    model="mistralai/Mistral-Small-24B-Instruct-2501",
    messages=[
      {
        "role": "user",
        "content": "What are some fun things to do in New York?"
      }
    ]
)
print(response.choices[0].message.content)

RUN INFERENCE

import Together from "together-ai";

// Reads the API key from the TOGETHER_API_KEY environment variable.
const together = new Together();

const response = await together.chat.completions.create({
  messages: [
    {
      role: "user",
      content: "What are some fun things to do in New York?"
    }
  ],
  model: "mistralai/Mistral-Small-24B-Instruct-2501"
});

console.log(response.choices[0].message.content)
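
For longer chat responses you can stream tokens as they are generated rather than waiting for the full completion. A minimal sketch with the Python SDK, assuming the stream=True flag of the OpenAI-compatible interface:

from together import Together

client = Together()

# Request a streamed response and print tokens as they arrive.
stream = client.chat.completions.create(
    model="mistralai/Mistral-Small-24B-Instruct-2501",
    messages=[
        {"role": "user", "content": "Summarize the main neighborhoods of New York in a few sentences."}
    ],
    stream=True,
)
for chunk in stream:
    # Each chunk carries an incremental piece of the assistant message.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)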

How to use Mistral Small 3

Model details

Prompting Mistral Small 3

Applications & Use Cases

Looking for production scale? Deploy on a dedicated endpoint

Deploy Mistral Small 3 on a dedicated endpoint with custom hardware configuration, as many instances as you need, and auto-scaling.

Get started