Mixtral 8x7B Instruct v0.1
A pretrained generative Sparse Mixture-of-Experts model.
About model
Mixtral 8x7B Instruct v0.1 generates human-like text from input prompts and excels at tasks that require understanding and following instructions, making it well suited for developers and researchers working with large language models.
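In a Sparse Mixture-of-Experts layer like Mixtral's, a router scores all experts for each token but only the top-2 experts run, and their outputs are mixed by softmax weights over the selected scores. The following is a minimal illustrative sketch of that top-2 routing with scalar expert outputs, not the real feed-forward layer:

```python
import math

def top2_moe(gate_logits, expert_outputs):
    """Route one token through its top-2 experts, Mixtral-style.

    gate_logits: one router score per expert (8 for Mixtral).
    expert_outputs: each expert's output (scalars here, for illustration).
    """
    # Pick the two highest-scoring experts.
    top2 = sorted(range(len(gate_logits)),
                  key=lambda i: gate_logits[i], reverse=True)[:2]
    # Softmax over only the selected logits gives the mixing weights.
    exps = [math.exp(gate_logits[i]) for i in top2]
    weights = [e / sum(exps) for e in exps]
    # Only the two chosen experts contribute to the token's output.
    return sum(w * expert_outputs[i] for w, i in zip(weights, top2)), top2

out, used = top2_moe([2.0, 0.5, 1.0, -1.0, 0.0, 0.3, -0.2, 1.5],
                     [10.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
```

This sparsity is why the model has 46.7B total parameters but only a fraction of them are active for any given token.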
API usage
Endpoint:
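The model is typically served behind an OpenAI-compatible chat-completions API. The sketch below builds a single-turn request body; the endpoint URL, authentication, and exact model identifier are assumptions that depend on the hosting provider (the identifier shown is the common Hugging Face id):

```python
import json

# Assumed model identifier; providers may use a different string.
MODEL_ID = "mistralai/Mixtral-8x7B-Instruct-v0.1"

def build_chat_request(prompt, max_tokens=256, temperature=0.7):
    """Build the JSON body for a single-turn chat completion request."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

body = json.dumps(build_chat_request("Summarize sparse MoE in one sentence."))
```

POST this body to the provider's chat-completions endpoint with your API key in the `Authorization` header.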
- Type: LLM, Chat
- Main use cases: Chat, Medium General Purpose
- Fine-tuning: Supported
- Deployment: Serverless, On-Demand Dedicated, Monthly Reserved
- Parameters: 46.7B
- Context length: 32K
- Input price: $0.60 / 1M tokens
- Output price: $0.60 / 1M tokens
- Input modalities: Text
- Output modalities: Text
- Released: December 10, 2023
- Quantization level: FP16
- Category: Chat