
Mistral: Mixtral 8x7B Instruct

Mistral AI
Text Generation
About Mixtral 8x7B Instruct

Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture of Experts model by Mistral AI, for chat and instruction use. Each layer incorporates 8 experts (feed-forward networks) for a total of 47 billion parameters, of which only the experts selected for a given token are active (see the routing sketch below).

Instruct model fine-tuned by Mistral. #moe
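
To make the #moe tag concrete, here is a minimal, illustrative sketch of top-2 sparse Mixture-of-Experts routing. It is not Mistral's actual implementation; the layer dimensions and the simplified feed-forward blocks are assumptions chosen only to mirror the description above.

```python
# Illustrative sparse MoE layer: a router scores 8 expert FFNs per token
# and only the top-2 are evaluated, so most parameters stay inactive.
# Dimensions (dim, hidden) are assumed placeholders, not official values.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, dim=4096, hidden=14336, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(dim, n_experts, bias=False)  # gating network
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(n_experts)
        ])

    def forward(self, x):                       # x: (num_tokens, dim)
        logits = self.router(x)                 # (num_tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)    # normalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e        # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out
```

A forward pass touches only 2 of the 8 expert feed-forward networks per token, which is what keeps the number of active parameters well below the 47 billion total.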

Specifications
Provider: Mistral AI
Context Length: 32,768 tokens
Input Types: text
Output Types: text
Category: Mistral
Added: 12/10/2023
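
As a hedged illustration of the text-in / text-out, 32,768-token chat interface described in the specifications, the sketch below loads the openly released checkpoint through the Hugging Face transformers library. The model id and library are assumptions for local use and are not required by the Writingmate platform.

```python
# Hedged local-usage sketch (assumed: Hugging Face transformers and the
# public checkpoint "mistralai/Mixtral-8x7B-Instruct-v0.1"; neither is
# required by Writingmate, which exposes the model without API keys).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Explain what a Mixture of Experts model is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Prompt plus generated tokens must fit in the 32,768-token context window.
output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```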

Use Mixtral 8x7B Instruct and 200+ more models

Access all the best AI models in one platform. No API keys, no switching between apps.