Ministral 3 14B Instruct 2512 API
Frontier 14B multimodal model for high-quality assistants, analytics, and multilingual reasoning.

This model is not currently supported on Together AI.
Visit our Models page to view all the latest models.
Introducing Ministral 3 14B Instruct 2512
Ministral 3 14B Instruct is Mistral AI’s frontier 14B-class multimodal assistant, combining a 13.5B language core with a 0.4B vision encoder for unified text–image reasoning. With a 256K token context window and strong adherence to system prompts, it is built for long-horizon agents, complex chat experiences, and analytical copilots. Released under an Apache 2.0 license, Ministral 3 14B delivers advanced capabilities while remaining fully open and customizable for deep integration.
Ministral 3 14B Instruct 2512 API Usage
Endpoint
curl -X POST "https://api.together.xyz/v1/chat/completions" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "mistralai/Ministral-3-14B-Instruct-2512",
"messages": [
{
"role": "user",
"content": "What are some fun things to do in New York?"
}
]
}'
curl -X POST "https://api.together.xyz/v1/images/generations" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "mistralai/Ministral-3-14B-Instruct-2512",
"prompt": "Draw an anime style version of this image.",
"width": 1024,
"height": 768,
"steps": 28,
"n": 1,
"response_format": "url",
"image_url": "https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png"
}'
curl -X POST https://api.together.xyz/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-d '{
"model": "mistralai/Ministral-3-14B-Instruct-2512",
"messages": [{
"role": "user",
"content": [
{"type": "text", "text": "Describe what you see in this image."},
{"type": "image_url", "image_url": {"url": "https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png"}}
]
}],
"max_tokens": 512
}'
curl -X POST https://api.together.xyz/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-d '{
"model": "mistralai/Ministral-3-14B-Instruct-2512",
"messages": [{
"role": "user",
"content": "Given two binary strings `a` and `b`, return their sum as a binary string"
}]
}'
curl -X POST https://api.together.xyz/v1/rerank \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-d '{
"model": "mistralai/Ministral-3-14B-Instruct-2512",
"query": "What animals can I find near Peru?",
"documents": [
"The giant panda (Ailuropoda melanoleuca), also known as the panda bear or simply panda, is a bear species endemic to China.",
"The llama is a domesticated South American camelid, widely used as a meat and pack animal by Andean cultures since the pre-Columbian era.",
"The wild Bactrian camel (Camelus ferus) is an endangered species of camel endemic to Northwest China and southwestern Mongolia.",
"The guanaco is a camelid native to South America, closely related to the llama. Guanacos are one of two wild South American camelids; the other species is the vicuña, which lives at higher elevations."
],
"top_n": 2
}'
curl -X POST https://api.together.xyz/v1/embeddings \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"input": "Our solar system orbits the Milky Way galaxy at about 515,000 mph.",
"model": "mistralai/Ministral-3-14B-Instruct-2512"
}'
curl -X POST https://api.together.xyz/v1/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-d '{
"model": "meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8",
"prompt": "A horse is a horse",
"max_tokens": 32,
"temperature": 0.1,
"safety_model": "mistralai/Ministral-3-14B-Instruct-2512"
}'
curl --location 'https://api.together.ai/v1/audio/generations' \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer $TOGETHER_API_KEY" \
--output speech.mp3 \
--data '{
"input": "Today is a wonderful day to build something people love!",
"voice": "helpful woman",
"response_format": "mp3",
"sample_rate": 44100,
"stream": false,
"model": "mistralai/Ministral-3-14B-Instruct-2512"
}'
curl -X POST "https://api.together.xyz/v1/audio/transcriptions" \
-H "Authorization: Bearer $TOGETHER_API_KEY" \
-F "file=@audio.mp3" \
-F "model=mistralai/Ministral-3-14B-Instruct-2512" \
-F "language=en" \
-F "response_format=json" \
-F "timestamp_granularities=segment"
curl --request POST \
--url https://api.together.xyz/v2/videos \
--header "Authorization: Bearer $TOGETHER_API_KEY" \
--header "Content-Type: application/json" \
--data '{
"model": "mistralai/Ministral-3-14B-Instruct-2512",
"prompt": "some penguins building a snowman"
}'
curl --request POST \
--url https://api.together.xyz/v2/videos \
--header "Authorization: Bearer $TOGETHER_API_KEY" \
--header "Content-Type: application/json" \
--data '{
"model": "mistralai/Ministral-3-14B-Instruct-2512",
"frame_images": [{"input_image": "https://cdn.pixabay.com/photo/2020/05/20/08/27/cat-5195431_1280.jpg"}]
}'
from together import Together
client = Together()
response = client.chat.completions.create(
model="mistralai/Ministral-3-14B-Instruct-2512",
messages=[
{
"role": "user",
"content": "What are some fun things to do in New York?"
}
]
)
print(response.choices[0].message.content)
from together import Together
client = Together()
imageCompletion = client.images.generate(
model="mistralai/Ministral-3-14B-Instruct-2512",
width=1024,
height=768,
steps=28,
prompt="Draw an anime style version of this image.",
image_url="https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png",
)
print(imageCompletion.data[0].url)
from together import Together
client = Together()
response = client.chat.completions.create(
model="mistralai/Ministral-3-14B-Instruct-2512",
messages=[{
"role": "user",
"content": [
{"type": "text", "text": "Describe what you see in this image."},
{"type": "image_url", "image_url": {"url": "https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png"}}
]
}]
)
print(response.choices[0].message.content)
from together import Together
client = Together()
response = client.chat.completions.create(
model="mistralai/Ministral-3-14B-Instruct-2512",
messages=[
{
"role": "user",
"content": "Given two binary strings `a` and `b`, return their sum as a binary string"
}
],
)
print(response.choices[0].message.content)
from together import Together
client = Together()
query = "What animals can I find near Peru?"
documents = [
"The giant panda (Ailuropoda melanoleuca), also known as the panda bear or simply panda, is a bear species endemic to China.",
"The llama is a domesticated South American camelid, widely used as a meat and pack animal by Andean cultures since the pre-Columbian era.",
"The wild Bactrian camel (Camelus ferus) is an endangered species of camel endemic to Northwest China and southwestern Mongolia.",
"The guanaco is a camelid native to South America, closely related to the llama. Guanacos are one of two wild South American camelids; the other species is the vicuña, which lives at higher elevations.",
]
response = client.rerank.create(
model="mistralai/Ministral-3-14B-Instruct-2512",
query=query,
documents=documents,
top_n=2
)
for result in response.results:
    print(f"Relevance Score: {result.relevance_score}")
from together import Together
client = Together()
response = client.embeddings.create(
model = "mistralai/Ministral-3-14B-Instruct-2512",
input = "Our solar system orbits the Milky Way galaxy at about 515,000 mph"
)
from together import Together
client = Together()
response = client.completions.create(
model="meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8",
prompt="A horse is a horse",
max_tokens=32,
temperature=0.1,
safety_model="mistralai/Ministral-3-14B-Instruct-2512",
)
print(response.choices[0].text)
from together import Together
client = Together()
speech_file_path = "speech.mp3"
response = client.audio.speech.create(
model="mistralai/Ministral-3-14B-Instruct-2512",
input="Today is a wonderful day to build something people love!",
voice="helpful woman",
)
response.stream_to_file(speech_file_path)
from together import Together
client = Together()
response = client.audio.transcriptions.create(
file="audio.mp3",
model="mistralai/Ministral-3-14B-Instruct-2512",
language="en",
response_format="json",
timestamp_granularities="segment"
)
print(response.text)
from together import Together
client = Together()
# Create a video generation job
job = client.videos.create(
prompt="A serene sunset over the ocean with gentle waves",
model="mistralai/Ministral-3-14B-Instruct-2512"
)
from together import Together
client = Together()
job = client.videos.create(
model="mistralai/Ministral-3-14B-Instruct-2512",
frame_images=[
{
"input_image": "https://cdn.pixabay.com/photo/2020/05/20/08/27/cat-5195431_1280.jpg",
}
]
)
import Together from 'together-ai';
const together = new Together();
const completion = await together.chat.completions.create({
model: 'mistralai/Ministral-3-14B-Instruct-2512',
messages: [
{
role: 'user',
content: 'What are some fun things to do in New York?'
}
],
});
console.log(completion.choices[0].message.content);
import Together from "together-ai";
const together = new Together();
async function main() {
const response = await together.images.create({
model: "mistralai/Ministral-3-14B-Instruct-2512",
width: 1024,
height: 1024,
steps: 28,
prompt: "Draw an anime style version of this image.",
image_url: "https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png",
});
console.log(response.data[0].url);
}
main();
import Together from "together-ai";
const together = new Together();
const imageUrl = "https://huggingface.co/datasets/patrickvonplaten/random_img/resolve/main/yosemite.png";
async function main() {
const response = await together.chat.completions.create({
model: "mistralai/Ministral-3-14B-Instruct-2512",
messages: [{
role: "user",
content: [
{ type: "text", text: "Describe what you see in this image." },
{ type: "image_url", image_url: { url: imageUrl } }
]
}]
});
console.log(response.choices[0]?.message?.content);
}
main();
import Together from "together-ai";
const together = new Together();
async function main() {
const response = await together.chat.completions.create({
model: "mistralai/Ministral-3-14B-Instruct-2512",
messages: [{
role: "user",
content: "Given two binary strings `a` and `b`, return their sum as a binary string"
}]
});
console.log(response.choices[0]?.message?.content);
}
main();
import Together from "together-ai";
const together = new Together();
const query = "What animals can I find near Peru?";
const documents = [
"The giant panda (Ailuropoda melanoleuca), also known as the panda bear or simply panda, is a bear species endemic to China.",
"The llama is a domesticated South American camelid, widely used as a meat and pack animal by Andean cultures since the pre-Columbian era.",
"The wild Bactrian camel (Camelus ferus) is an endangered species of camel endemic to Northwest China and southwestern Mongolia.",
"The guanaco is a camelid native to South America, closely related to the llama. Guanacos are one of two wild South American camelids; the other species is the vicuña, which lives at higher elevations."
];
async function main() {
const response = await together.rerank.create({
model: "mistralai/Ministral-3-14B-Instruct-2512",
query: query,
documents: documents,
top_n: 2
});
for (const result of response.results) {
console.log(`Relevance Score: ${result.relevance_score}`);
}
}
main();
import Together from "together-ai";
const together = new Together();
const response = await together.embeddings.create({
model: 'mistralai/Ministral-3-14B-Instruct-2512',
input: 'Our solar system orbits the Milky Way galaxy at about 515,000 mph',
});
import Together from "together-ai";
const together = new Together();
async function main() {
const response = await together.completions.create({
model: "meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8",
prompt: "A horse is a horse",
max_tokens: 32,
temperature: 0.1,
safety_model: "mistralai/Ministral-3-14B-Instruct-2512"
});
console.log(response.choices[0]?.text);
}
main();
import Together from 'together-ai';
import { Readable } from 'node:stream';
import { createWriteStream } from 'node:fs';
const together = new Together();
async function generateAudio() {
const res = await together.audio.create({
input: 'Today is a wonderful day to build something people love!',
voice: 'helpful woman',
response_format: 'mp3',
sample_rate: 44100,
stream: false,
model: 'mistralai/Ministral-3-14B-Instruct-2512',
});
if (res.body) {
console.log(res.body);
const nodeStream = Readable.from(res.body as ReadableStream);
const fileStream = createWriteStream('./speech.mp3');
nodeStream.pipe(fileStream);
}
}
generateAudio();
import Together from "together-ai";
const together = new Together();
const response = await together.audio.transcriptions.create({
file: "audio.mp3",
model: "mistralai/Ministral-3-14B-Instruct-2512",
language: "en",
response_format: "json",
timestamp_granularities: "segment"
});
console.log(response)
import Together from "together-ai";
const together = new Together();
async function main() {
// Create a video generation job
const job = await together.videos.create({
prompt: "A serene sunset over the ocean with gentle waves",
model: "mistralai/Ministral-3-14B-Instruct-2512"
});
}
main();
import Together from "together-ai";
const together = new Together();
const job = await together.videos.create({
model: "mistralai/Ministral-3-14B-Instruct-2512",
frame_images: [
{
input_image: "https://cdn.pixabay.com/photo/2020/05/20/08/27/cat-5195431_1280.jpg",
}
]
});
How to use Ministral 3 14B Instruct 2512
Model details
Architecture overview:
• Dense 14B-class language model paired with a lightweight vision encoder, exposed as a single interface for text and images.
• 256K token context window designed for extended conversations, document analysis, and long-running tool-call traces.
• Instruction-tuned head focused on stable assistant behavior, schema-following, and controllable output formatting for agents and workflows.
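A rough way to work within the 256K window is to budget prompt size before sending a long document. The sketch below is a heuristic, assuming roughly four characters per token for English-like text; exact counts require the model's actual tokenizer.

```python
# Heuristic pre-flight check before sending a long document to the model.
# The chars-per-token ratio is an assumption; real counts need the tokenizer.
MAX_CONTEXT_TOKENS = 256_000
CHARS_PER_TOKEN = 4  # rough average for English-like text

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_in_context(document: str, reserve_for_output: int = 4_096) -> bool:
    # The reply shares the window with the prompt, so reserve headroom for it.
    return estimate_tokens(document) + reserve_for_output <= MAX_CONTEXT_TOKENS
```

Reserving output headroom up front avoids truncated replies on prompts that technically fit but leave no room to generate.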
Training and performance:
• Trained on diverse multilingual, code, and web-style data to cover a wide range of reasoning and analytic tasks.
• Instruct variant optimized for dialogue, tool use, and structured outputs rather than raw pretraining perplexity.
• Positioned to compete with much larger models on many assistant, coding, and reasoning tasks while keeping latency and cost more manageable.
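Because the Instruct variant targets schema-following, callers can validate structured replies locally before acting on them. A minimal sketch, assuming a hypothetical two-field schema (`title`, `priority`) that is not part of any API:

```python
import json

# Local validation of a structured (JSON) model reply. The field names are
# hypothetical; replace them with your own schema.
REQUIRED_FIELDS = {"title", "priority"}

def parse_structured_reply(reply_text: str) -> dict:
    data = json.loads(reply_text)
    missing = REQUIRED_FIELDS - data.keys()
    if missing:
        raise ValueError(f"model reply missing fields: {sorted(missing)}")
    return data
```

Failing fast on a malformed reply lets a workflow retry the request rather than propagate bad data downstream.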
Prompting Ministral 3 14B Instruct 2512
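Given the model's strong system-prompt adherence, behavior is best steered with a dedicated system message. A minimal sketch; the persona and instructions below are illustrative assumptions, not recommended defaults:

```python
# Sketch of steering the assistant via a system prompt. The persona text is
# an illustrative assumption; adapt it to your application.
def build_messages(user_text: str) -> list:
    return [
        {"role": "system",
         "content": "You are a concise analytics copilot. Answer in short bullet points."},
        {"role": "user", "content": user_text},
    ]

# Live call (requires TOGETHER_API_KEY):
# from together import Together
# client = Together()
# response = client.chat.completions.create(
#     model="mistralai/Ministral-3-14B-Instruct-2512",
#     messages=build_messages("Summarize last quarter's churn drivers."),
# )
```

Keeping the system message fixed across turns and varying only the user content tends to give the most stable assistant behavior.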
Applications & Use Cases
Assistants and agents:
• High-end multilingual assistants for support, operations, and knowledge work that require grounded, explainable responses.
• Internal copilots that coordinate retrieval, tools, and business logic to automate complex, multi-step workflows.
• Agentic systems that plan, call tools, and synthesize results into concise recommendations or actions.
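One way to wire the "plan, call tools, synthesize" loop is a small dispatch table that routes a model-emitted tool call to local code. The tool name and its stub result below are illustrative assumptions, not part of the model or platform API:

```python
import json

# Minimal tool-dispatch sketch for an agent loop. The tool and its stub result
# are illustrative; real tools would call your own systems.
TOOLS = {
    "get_order_status": lambda order_id: {"order_id": order_id, "status": "shipped"},
}

def dispatch_tool_call(name: str, arguments_json: str) -> str:
    # The model emits a tool name plus JSON arguments; run the tool and return
    # a JSON string to append back to the conversation as a tool-role message.
    args = json.loads(arguments_json)
    result = TOOLS[name](**args)
    return json.dumps(result)
```

The returned JSON string is appended to `messages` as a tool result, and the conversation is sent back to the model to synthesize a final answer.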
Knowledge, content, and multimodal workflows:
• Long-document analysis, summarization, and synthesis across technical docs, contracts, product specs, and knowledge bases using the 256K context window.
• Multimodal understanding of screenshots, diagrams, and document snippets for debugging, troubleshooting, and guided workflows.
• High-quality content drafting, editing, and transformation across many languages, including structured reports, specifications, and templated outputs.
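For inputs that exceed even the 256K window, a common pattern is a map-reduce pass: summarize overlapping chunks, then summarize the summaries. A sketch of the chunking step, with heuristic sizes that should be tuned to the tokenizer and prompt overhead:

```python
# Chunking helper for a map-reduce summarization pass over very long inputs.
# Sizes are heuristic assumptions; tune them to your tokenizer and prompts.
def chunk_document(text: str, chunk_chars: int = 400_000, overlap: int = 2_000) -> list:
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + chunk_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap  # overlap preserves context across boundaries
    return chunks
```

Each chunk is summarized independently, and the concatenated summaries (which fit comfortably in one context window) are summarized in a final pass.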
Model Provider:
Mistral AI
Type:
Chat
Parameters:
13.9B
Deployment:
Serverless
On-Demand Dedicated
Monthly Reserved
Quantization:
FP8
Context length:
256K
Pricing:
Check pricing
Run in playground
Deploy model
Quickstart docs
Looking for production scale? Deploy on a dedicated endpoint
Deploy Ministral 3 14B Instruct 2512 on a dedicated endpoint with custom hardware configuration, as many instances as you need, and auto-scaling.
