Mistral: Mixtral 8x22B Instruct
mistralai
Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include strong math, coding,...
- Context window: 65,536 tokens
- Input cost: $2.00 / 1M tokens
- Output cost: $6.00 / 1M tokens
- Latency (p50): —
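Since the rates above are quoted per million tokens, the price of a single request is just a weighted sum of its input and output token counts. A minimal sketch (the `estimate_cost` helper and the rate constants are illustrative, not a provider API; rates may change, so check the listing):

```python
# Listed Mixtral 8x22B Instruct rates, in USD per 1M tokens
# (assumed from the spec list above; verify before relying on them).
INPUT_COST_PER_M = 2.00
OUTPUT_COST_PER_M = 6.00

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost for one request at the listed rates."""
    return (input_tokens * INPUT_COST_PER_M
            + output_tokens * OUTPUT_COST_PER_M) / 1_000_000

# e.g. a 10k-token prompt with a 1k-token completion
print(f"${estimate_cost(10_000, 1_000):.4f}")  # → $0.0260
```

At these rates, output tokens cost 3x as much as input tokens, so long completions dominate the bill even for prompt-heavy workloads.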
