Arcee AI AFM-4.5B
A 4.5B-parameter foundation model trained on 6.58T curated tokens. It reaches 200+ tokens/sec on CPU, meets Western compliance standards, and outperforms Qwen3-4B and Gemma3-4B across benchmarks.
This model is not available on Together’s Serverless API.
Pick a supported alternative from the Model Library.
- Model provider: Arcee AI
- Type: LLM, Chat
- Main use cases: Chat
- Parameters: 4.6B
- Context length: 65K
- Category: Chat