Arcee AI Blitz API
LLM
Efficient 24B SLM with strong world knowledge, offering fast, affordable performance across diverse tasks.
Try our Arcee AI API
API Usage
Endpoint
arcee-ai/arcee-blitz
RUN INFERENCE (cURL)
curl -X POST "https://api.together.xyz/v1/chat/completions" \
  -H "Authorization: Bearer $TOGETHER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "arcee-ai/arcee-blitz",
    "messages": [{"role": "user", "content": "What is the capital of France?"}],
    "stream": true
  }'
RUN INFERENCE (Python)
from together import Together

client = Together()  # reads TOGETHER_API_KEY from the environment

response = client.chat.completions.create(
    model="arcee-ai/arcee-blitz",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
    stream=True,
)

for token in response:
    # The final chunk may carry no choices or a None delta, so guard before printing.
    if token.choices and token.choices[0].delta.content:
        print(token.choices[0].delta.content, end="", flush=True)
RUN INFERENCE (TypeScript)
import Together from "together-ai";

const together = new Together(); // reads TOGETHER_API_KEY from the environment

const response = await together.chat.completions.create({
  model: "arcee-ai/arcee-blitz",
  messages: [{ role: "user", content: "What is the capital of France?" }],
  stream: true,
});

for await (const token of response) {
  // Write tokens without extra newlines; the delta may be empty on the final chunk.
  process.stdout.write(token.choices[0]?.delta?.content ?? "");
}
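All three snippets send the same OpenAI-compatible request body to the chat-completions endpoint. As a minimal sketch, this assembles that payload in Python; the helper name, system prompt, and sample question are illustrative, not part of the API:

```python
# Sketch of the request body the examples above send. The schema mirrors the
# curl example; build_chat_payload and the system prompt are illustrative.
import json


def build_chat_payload(user_prompt: str, stream: bool = True) -> dict:
    """Assemble a chat-completions payload for arcee-ai/arcee-blitz."""
    return {
        "model": "arcee-ai/arcee-blitz",
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_prompt},
        ],
        "stream": stream,
    }


payload = build_chat_payload("What is the capital of France?")
print(json.dumps(payload, indent=2))
```

Setting `stream` to `false` returns a single JSON response instead of server-sent chunks.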
Model Provider: Arcee AI
Type: Chat
Parameters: 23.6B
Deployment: ✔ Serverless ✔ On-Demand Dedicated
Context length: 32K
Pricing: $0.45 input / $0.75 output
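As a rough guide, and assuming the listed rates are charged per million tokens (Together's usual convention; not stated on this page), a request's cost can be estimated as:

```python
# Back-of-envelope cost estimate, assuming the listed rates
# ($0.45 input / $0.75 output) are per million tokens.
INPUT_RATE = 0.45 / 1_000_000   # dollars per input token
OUTPUT_RATE = 0.75 / 1_000_000  # dollars per output token


def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated dollar cost for one request at the listed rates."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE


# e.g. a 2,000-token prompt with a 500-token reply:
print(f"${estimate_cost(2_000, 500):.6f}")  # → $0.001275
```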
Check pricing
Run in playground
Deploy model
Quickstart docs
How to use Arcee AI Blitz
Model details
Prompting Arcee AI Blitz
Applications & Use Cases
Looking for production scale? Deploy on a dedicated endpoint
Deploy Arcee AI Blitz on a dedicated endpoint with custom hardware configuration, as many instances as you need, and auto-scaling.
