About Ministral 8B
Ministral 8B is an 8-billion-parameter model featuring an interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports a context length of up to 128K tokens and excels at knowledge and reasoning tasks. It outperforms peer models in the sub-10B category, making it well suited to low-latency, privacy-first applications.
Specifications
- Provider: Mistral AI
- Context Length: 131,072 tokens
- Input Types: text
- Output Types: text
- Category: Mistral
- Added: October 17, 2024