LLaMA-2
An LLM trained on 2 trillion tokens with double Llama 1's context length, available in 7B, 13B, and 70B parameter sizes.
This model is not available on Together’s Serverless API.
Pick a supported alternative from the Model Library.
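Since the endpoint is OpenAI-compatible, swapping in a supported model is just a matter of changing the model id in the request. A minimal sketch of assembling such a request follows; the model id shown is a placeholder, so check the Model Library for a currently supported alternative before sending.

```python
def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble the JSON payload for a chat-completions request."""
    return {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request(
    "meta-llama/Llama-3.3-70B-Instruct-Turbo",  # placeholder model id
    "Summarize Llama 2 in one sentence.",
)
# Send with any HTTP client, e.g.:
#   requests.post("https://api.together.xyz/v1/chat/completions",
#                 headers={"Authorization": f"Bearer {API_KEY}"},
#                 json=payload)
```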
- Type: LLM, Chat
- Main use cases: Chat, Medium General Purpose
- Parameters: 69B
- Context length: 4K
- Category: Chat