About Mixtral 8x7B Instruct
Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture of Experts model from Mistral AI, fine-tuned by Mistral for chat and instruction following. Each layer incorporates 8 experts (feed-forward networks), for a total of roughly 47 billion parameters, of which only a fraction is active for any given token. #moe
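The routing idea behind a Sparse Mixture of Experts layer can be sketched as follows: a small gating network scores all experts for each token, only the top-k experts actually run, and their outputs are combined with softmax weights. This is a minimal illustrative sketch, not Mistral's implementation; all names (`moe_forward`, `gate_w`, `experts`) are hypothetical.

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Sparse MoE layer sketch: route one token to its top-k experts.

    x: (d,) token activation; gate_w: (n_experts, d) router weights;
    experts: list of callables standing in for feed-forward networks.
    """
    logits = gate_w @ x                 # router score per expert
    top = np.argsort(logits)[-top_k:]   # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()            # softmax over the selected experts only
    # Only the chosen experts run, so compute scales with top_k, not n_experts.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 4, 8
gate = rng.normal(size=(n_experts, d))
# Toy "experts": linear maps standing in for full feed-forward blocks.
expert_fns = [lambda v, W=rng.normal(size=(d, d)): W @ v
              for _ in range(n_experts)]
out = moe_forward(rng.normal(size=d), gate, expert_fns)
print(out.shape)  # (4,)
```

With 8 experts and top_k=2, each token pays the compute cost of only 2 feed-forward networks, which is why the model's active parameter count per token is much smaller than its ~47B total.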
Specifications
- Provider: Mistral AI
- Context Length: 32,768 tokens
- Input Types: text
- Output Types: text
- Category: Mistral
- Added: 12/10/2023