About Mixtral 8x22B Instruct
Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B total, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish
See benchmarks in the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/).
Specifications
- Provider: Mistral AI
- Context Length: 65,536 tokens
- Input Types: text
- Output Types: text
- Category: Mistral
- Added: April 17, 2024