About Ministral 3B
Ministral 3B is a 3-billion-parameter model optimized for on-device and edge computing. It excels at knowledge, commonsense reasoning, and function calling, outperforming larger models such as Mistral 7B on most benchmarks. With a context length of up to 128k tokens, it is well suited to orchestrating agentic workflows and specialist tasks with efficient inference.
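As a sketch of how function calling with a model like this typically works, the snippet below builds a Mistral-style chat-completions request payload that declares a tool the model may invoke. The model identifier, the `get_weather` tool, and its schema are illustrative assumptions, not details taken from this page.

```python
import json

# Hypothetical function-calling request, assuming a Mistral-style
# chat-completions API. Tool name and schema are illustrative only.
request = {
    "model": "ministral-3b-latest",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    # Let the model decide whether to call the tool or answer directly.
    "tool_choice": "auto",
}

print(json.dumps(request, indent=2))
```

When the model chooses to call the tool, the response carries the function name and JSON arguments; the application executes the function and sends the result back in a follow-up message for the model to summarize.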
Specifications
- Provider: Mistral AI
- Context Length: 131,072 tokens
- Input Types: text
- Output Types: text
- Category: Mistral
- Added: October 17, 2024