Ministral 3 8B Instruct 2512 API
Balanced 8B multimodal model for versatile assistants, agents, and multilingual understanding.

This model is not currently supported on Together AI.
Visit our Models page to view all the latest models.
Introducing Ministral 3 8B Instruct 2512
Ministral 3 8B Instruct is Mistral AI’s balanced 8B-class multimodal assistant, pairing an 8.4B language backbone with a 0.4B vision encoder for everyday text–image reasoning. With a 256K token context window, it handles long conversations, multi-document analysis, and tool-augmented workflows while staying fast and cost-efficient for broad deployment.
Ministral 3 8B Instruct 2512 API Usage
Endpoint
curl -X POST "https://api.together.xyz/v1/chat/completions" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "mistralai/Ministral-3-8B-Instruct-2512",
"messages": [
{
"role": "user",
"content": "What are some fun things to do in New York?"
}
]
}'
curl -X POST "https://api.together.xyz/v1/images/generations" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "mistralai/Ministral-3-8B-Instruct-2512",
"prompt": "Draw an anime style version of this image.",
"width": 1024,
"height": 768,
"steps": 28,
"n": 1,
"response_format": "url",
"image_url": "https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png"
}'
curl -X POST https://api.together.xyz/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-d '{
"model": "mistralai/Ministral-3-8B-Instruct-2512",
"messages": [{
"role": "user",
"content": [
{"type": "text", "text": "Describe what you see in this image."},
{"type": "image_url", "image_url": {"url": "https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png"}}
]
}],
"max_tokens": 512
}'
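The vision example above passes the image by URL; for local files, OpenAI-compatible chat APIs generally also accept base64 data URLs in the `image_url` part. A minimal sketch of building such a message (data-URL support for this specific endpoint, and the image bytes used here, are assumptions):

```python
import base64

def to_data_url(image_bytes: bytes, mime: str = "image/png") -> str:
    """Encode raw image bytes as a base64 data URL for an image_url content part."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return f"data:{mime};base64,{b64}"

def image_message(text: str, image_bytes: bytes) -> dict:
    """Build a multimodal user message mixing a text part and an inline image part."""
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": text},
            {"type": "image_url", "image_url": {"url": to_data_url(image_bytes)}},
        ],
    }
```

The resulting dict drops into `messages` the same way as the URL-based request above.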
curl -X POST https://api.together.xyz/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-d '{
"model": "mistralai/Ministral-3-8B-Instruct-2512",
"messages": [{
"role": "user",
"content": "Given two binary strings `a` and `b`, return their sum as a binary string"
}]
}'
curl -X POST https://api.together.xyz/v1/rerank \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-d '{
"model": "mistralai/Ministral-3-8B-Instruct-2512",
"query": "What animals can I find near Peru?",
"documents": [
"The giant panda (Ailuropoda melanoleuca), also known as the panda bear or simply panda, is a bear species endemic to China.",
"The llama is a domesticated South American camelid, widely used as a meat and pack animal by Andean cultures since the pre-Columbian era.",
"The wild Bactrian camel (Camelus ferus) is an endangered species of camel endemic to Northwest China and southwestern Mongolia.",
"The guanaco is a camelid native to South America, closely related to the llama. Guanacos are one of two wild South American camelids; the other species is the vicuña, which lives at higher elevations."
],
"top_n": 2
}'
curl -X POST https://api.together.xyz/v1/embeddings \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"input": "Our solar system orbits the Milky Way galaxy at about 515,000 mph.",
"model": "mistralai/Ministral-3-8B-Instruct-2512"
}'
curl -X POST https://api.together.xyz/v1/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-d '{
"model": "meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8",
"prompt": "A horse is a horse",
"max_tokens": 32,
"temperature": 0.1,
"safety_model": "mistralai/Ministral-3-8B-Instruct-2512"
}'
curl --location 'https://api.together.ai/v1/audio/generations' \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer $TOGETHER_API_KEY" \
--output speech.mp3 \
--data '{
"input": "Today is a wonderful day to build something people love!",
"voice": "helpful woman",
"response_format": "mp3",
"sample_rate": 44100,
"stream": false,
"model": "mistralai/Ministral-3-8B-Instruct-2512"
}'
curl -X POST "https://api.together.xyz/v1/audio/transcriptions" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-F "model=mistralai/Ministral-3-8B-Instruct-2512" \
-F "language=en" \
-F "response_format=json" \
-F "timestamp_granularities=segment"
curl --request POST \
--url https://api.together.xyz/v2/videos \
--header "Authorization: Bearer $TOGETHER_API_KEY" \
--header "Content-Type: application/json" \
--data '{
"model": "mistralai/Ministral-3-8B-Instruct-2512",
"prompt": "some penguins building a snowman"
}'
curl --request POST \
--url https://api.together.xyz/v2/videos \
--header "Authorization: Bearer $TOGETHER_API_KEY" \
--header "Content-Type: application/json" \
--data '{
"model": "mistralai/Ministral-3-8B-Instruct-2512",
"frame_images": [{"input_image": "https://cdn.pixabay.com/photo/2020/05/20/08/27/cat-5195431_1280.jpg"}]
}'
from together import Together
client = Together()
response = client.chat.completions.create(
model="mistralai/Ministral-3-8B-Instruct-2512",
messages=[
{
"role": "user",
"content": "What are some fun things to do in New York?"
}
]
)
print(response.choices[0].message.content)
from together import Together
client = Together()
image_completion = client.images.generate(
model="mistralai/Ministral-3-8B-Instruct-2512",
width=1024,
height=768,
steps=28,
prompt="Draw an anime style version of this image.",
image_url="https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png",
)
print(image_completion.data[0].url)
from together import Together
client = Together()
response = client.chat.completions.create(
model="mistralai/Ministral-3-8B-Instruct-2512",
messages=[{
"role": "user",
"content": [
{"type": "text", "text": "Describe what you see in this image."},
{"type": "image_url", "image_url": {"url": "https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png"}}
]
}]
)
print(response.choices[0].message.content)
from together import Together
client = Together()
response = client.chat.completions.create(
model="mistralai/Ministral-3-8B-Instruct-2512",
messages=[
{
"role": "user",
"content": "Given two binary strings `a` and `b`, return their sum as a binary string"
}
],
)
print(response.choices[0].message.content)
from together import Together
client = Together()
query = "What animals can I find near Peru?"
documents = [
"The giant panda (Ailuropoda melanoleuca), also known as the panda bear or simply panda, is a bear species endemic to China.",
"The llama is a domesticated South American camelid, widely used as a meat and pack animal by Andean cultures since the pre-Columbian era.",
"The wild Bactrian camel (Camelus ferus) is an endangered species of camel endemic to Northwest China and southwestern Mongolia.",
"The guanaco is a camelid native to South America, closely related to the llama. Guanacos are one of two wild South American camelids; the other species is the vicuña, which lives at higher elevations.",
]
response = client.rerank.create(
model="mistralai/Ministral-3-8B-Instruct-2512",
query=query,
documents=documents,
top_n=2
)
for result in response.results:
print(f"Relevance Score: {result.relevance_score}")
from together import Together
client = Together()
response = client.embeddings.create(
model = "mistralai/Ministral-3-8B-Instruct-2512",
input = "Our solar system orbits the Milky Way galaxy at about 515,000 mph"
)
from together import Together
client = Together()
response = client.completions.create(
model="meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8",
prompt="A horse is a horse",
max_tokens=32,
temperature=0.1,
safety_model="mistralai/Ministral-3-8B-Instruct-2512",
)
print(response.choices[0].text)
from together import Together
client = Together()
speech_file_path = "speech.mp3"
response = client.audio.speech.create(
model="mistralai/Ministral-3-8B-Instruct-2512",
input="Today is a wonderful day to build something people love!",
voice="helpful woman",
)
response.stream_to_file(speech_file_path)
from together import Together
client = Together()
response = client.audio.transcribe(
file="audio.mp3",
model="mistralai/Ministral-3-8B-Instruct-2512",
language="en",
response_format="json",
timestamp_granularities="segment"
)
print(response.text)
from together import Together
client = Together()
# Create a video generation job
job = client.videos.create(
prompt="A serene sunset over the ocean with gentle waves",
model="mistralai/Ministral-3-8B-Instruct-2512"
)
from together import Together
client = Together()
job = client.videos.create(
model="mistralai/Ministral-3-8B-Instruct-2512",
frame_images=[
{
"input_image": "https://cdn.pixabay.com/photo/2020/05/20/08/27/cat-5195431_1280.jpg",
}
]
)
import Together from 'together-ai';
const together = new Together();
const completion = await together.chat.completions.create({
model: 'mistralai/Ministral-3-8B-Instruct-2512',
messages: [
{
role: 'user',
content: 'What are some fun things to do in New York?'
}
],
});
console.log(completion.choices[0].message.content);
import Together from "together-ai";
const together = new Together();
async function main() {
const response = await together.images.create({
model: "mistralai/Ministral-3-8B-Instruct-2512",
width: 1024,
height: 1024,
steps: 28,
prompt: "Draw an anime style version of this image.",
image_url: "https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png",
});
console.log(response.data[0].url);
}
main();
import Together from "together-ai";
const together = new Together();
const imageUrl = "https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png";
async function main() {
const response = await together.chat.completions.create({
model: "mistralai/Ministral-3-8B-Instruct-2512",
messages: [{
role: "user",
content: [
{ type: "text", text: "Describe what you see in this image." },
{ type: "image_url", image_url: { url: imageUrl } }
]
}]
});
console.log(response.choices[0]?.message?.content);
}
main();
import Together from "together-ai";
const together = new Together();
async function main() {
const response = await together.chat.completions.create({
model: "mistralai/Ministral-3-8B-Instruct-2512",
messages: [{
role: "user",
content: "Given two binary strings `a` and `b`, return their sum as a binary string"
}]
});
console.log(response.choices[0]?.message?.content);
}
main();
import Together from "together-ai";
const together = new Together();
const query = "What animals can I find near Peru?";
const documents = [
"The giant panda (Ailuropoda melanoleuca), also known as the panda bear or simply panda, is a bear species endemic to China.",
"The llama is a domesticated South American camelid, widely used as a meat and pack animal by Andean cultures since the pre-Columbian era.",
"The wild Bactrian camel (Camelus ferus) is an endangered species of camel endemic to Northwest China and southwestern Mongolia.",
"The guanaco is a camelid native to South America, closely related to the llama. Guanacos are one of two wild South American camelids; the other species is the vicuña, which lives at higher elevations."
];
async function main() {
const response = await together.rerank.create({
model: "mistralai/Ministral-3-8B-Instruct-2512",
query: query,
documents: documents,
top_n: 2
});
for (const result of response.results) {
console.log(`Relevance Score: ${result.relevance_score}`);
}
}
main();
import Together from "together-ai";
const together = new Together();
const response = await together.embeddings.create({
model: 'mistralai/Ministral-3-8B-Instruct-2512',
input: 'Our solar system orbits the Milky Way galaxy at about 515,000 mph',
});
import Together from "together-ai";
const together = new Together();
async function main() {
const response = await together.completions.create({
model: "meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8",
prompt: "A horse is a horse",
max_tokens: 32,
temperature: 0.1,
safety_model: "mistralai/Ministral-3-8B-Instruct-2512"
});
console.log(response.choices[0]?.text);
}
main();
import Together from 'together-ai';
import { createWriteStream } from 'node:fs';
import { Readable } from 'node:stream';
const together = new Together();
async function generateAudio() {
const res = await together.audio.create({
input: 'Today is a wonderful day to build something people love!',
voice: 'helpful woman',
response_format: 'mp3',
sample_rate: 44100,
stream: false,
model: 'mistralai/Ministral-3-8B-Instruct-2512',
});
if (res.body) {
console.log(res.body);
const nodeStream = Readable.from(res.body as ReadableStream);
const fileStream = createWriteStream('./speech.mp3');
nodeStream.pipe(fileStream);
}
}
generateAudio();
import Together from "together-ai";
const together = new Together();
const response = await together.audio.transcriptions.create({
file: "audio.mp3",
model: "mistralai/Ministral-3-8B-Instruct-2512",
language: "en",
response_format: "json",
timestamp_granularities: "segment"
});
console.log(response);
import Together from "together-ai";
const together = new Together();
async function main() {
// Create a video generation job
const job = await together.videos.create({
prompt: "A serene sunset over the ocean with gentle waves",
model: "mistralai/Ministral-3-8B-Instruct-2512"
});
}
main();
import Together from "together-ai";
const together = new Together();
const job = await together.videos.create({
model: "mistralai/Ministral-3-8B-Instruct-2512",
frame_images: [
{
input_image: "https://cdn.pixabay.com/photo/2020/05/20/08/27/cat-5195431_1280.jpg",
}
]
});
How to use Ministral 3 8B Instruct 2512
Model details
Architecture overview:
• Dense 8.4B-parameter language backbone paired with a 0.4B vision encoder for unified text and image I/O.
• 256K token context window shared with the rest of the Ministral 3 family for consistent long-context behavior.
• Instruction-tuned head optimized for assistants, agents, and structured outputs such as JSON and tool calls.
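The structured-output support noted above is usually driven through the OpenAI-compatible `response_format` field. A hedged sketch of assembling such a request payload offline; the `json_object` mode and the schema-passing convention shown here are assumptions, so check the provider's docs for the exact field names:

```python
def build_structured_request(model: str, user_prompt: str, schema: dict) -> dict:
    """Assemble a chat-completions payload that asks the model for JSON output."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "Reply only with JSON matching the schema."},
            {"role": "user", "content": user_prompt},
        ],
        # Assumed JSON-mode convention; the exact response_format shape may differ.
        "response_format": {"type": "json_object", "schema": schema},
    }

payload = build_structured_request(
    "mistralai/Ministral-3-8B-Instruct-2512",
    "Extract the city and country from: 'I flew to Lima, Peru last week.'",
    {"type": "object", "properties": {"city": {"type": "string"}, "country": {"type": "string"}}},
)
```

The payload can then be posted to the chat-completions endpoint like the curl examples above.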
Training and performance:
• Trained on diverse multilingual, code, and web-style corpora to provide robust coverage in the 8B tier.
• Instruction tuning emphasizes helpfulness, harmlessness, and adherence to system prompts over raw perplexity.
• Positioned as a mid-size workhorse that delivers near-frontier quality for many assistant and analytic tasks at lower cost and latency.
Prompting Ministral 3 8B Instruct 2512
Applications & Use Cases
Assistants and agents:
• General-purpose chat assistants for support, operations, and knowledge work where responsiveness and quality must balance cost.
• Multimodal internal copilots that combine screenshots, documents, and text queries for debugging, analysis, and investigation.
• Agentic systems that plan, call tools, and synthesize results into natural-language recommendations or summaries.
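Tool-calling agents like those described above typically pass function schemas in a `tools` array and dispatch on the `tool_calls` the model emits. A minimal sketch of the local dispatch side; the `get_weather` tool, its schema, and the stubbed return value are illustrative assumptions:

```python
import json

# Illustrative tool schema in the OpenAI-compatible "function" format.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def dispatch_tool_call(name: str, arguments_json: str) -> str:
    """Route a model-emitted tool call (name + JSON arguments) to a local function."""
    args = json.loads(arguments_json)
    if name == "get_weather":
        return f"Sunny in {args['city']}"  # stubbed implementation
    raise ValueError(f"unknown tool: {name}")
```

The string returned here would be fed back to the model as a `tool` role message for synthesis.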
Product and platform use cases:
• Embedded chat and help widgets inside SaaS products and dashboards.
• RAG-style knowledge interfaces over product docs, knowledge bases, and semi-structured data using the 256K context.
• Content generation, rewriting, translation, and summarization workflows that need solid multilingual quality.
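A RAG interface over the 256K window often amounts to packing retrieved passages into one prompt under a budget. A rough sketch; the prompt template and the ~4-characters-per-token heuristic are assumptions:

```python
def build_rag_prompt(question: str, passages: list[str], max_context_tokens: int = 256_000) -> str:
    """Pack retrieved passages into a single prompt, stopping at a rough token budget."""
    budget_chars = max_context_tokens * 4  # crude ~4 chars/token heuristic
    chunks, used = [], 0
    for i, p in enumerate(passages):
        if used + len(p) > budget_chars:
            break
        chunks.append(f"[doc {i + 1}] {p}")
        used += len(p)
    context = "\n\n".join(chunks)
    return f"Answer using only the documents below.\n\n{context}\n\nQuestion: {question}"

prompt = build_rag_prompt("Where do llamas live?", ["Llamas live in the Andes.", "Pandas live in China."])
```

The assembled prompt goes out as an ordinary user message in a chat-completions request.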
Model Provider:
Mistral AI
Type:
Chat
Parameters:
8.8B
Deployment:
Serverless
On-Demand Dedicated
Monthly Reserved
Quantization:
FP8
Context length:
256K
Pricing:
Check pricing
Run in playground
Deploy model
Quickstart docs
On-Demand Dedicated
Looking for production scale? Deploy on a dedicated endpoint
Deploy Ministral 3 8B Instruct 2512 on a dedicated endpoint with custom hardware configuration, as many instances as you need, and auto-scaling.
