About Magistral Small 2506
Magistral Small is a 24B-parameter instruction-tuned model built on Mistral-Small-3.1 (2503), enhanced through supervised fine-tuning on traces from Magistral Medium and further refined with reinforcement learning. It is optimized for reasoning and supports more than 20 languages.
Specifications
- Provider: Mistral AI
- Context Length: 40,000 tokens
- Input Types: text
- Output Types: text
- Category: Mistral
- Added: June 10, 2025