
Mistral: Mixtral 8x22B (base)

Mistral AI
Text Generation
About Mixtral 8x22B (base)

Mixtral 8x22B is a large-scale mixture-of-experts language model from Mistral AI. It consists of 8 experts of 22 billion parameters each, with each token routed to 2 experts at a time.
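
To make the routing idea concrete, here is a minimal, toy sketch of top-2 mixture-of-experts routing in Python. The sizes, router, and expert weights are all made up for illustration; this is not Mistral's implementation, only the general pattern of scoring experts per token and mixing the two highest-scoring ones.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 8, 8, 2   # toy dimensions; 8 experts, top-2 routing as described above
n_tokens = 4

# Toy parameters: a linear router and 8 small "expert" feed-forward matrices (assumed shapes).
router_w = rng.standard_normal((d_model, n_experts))
expert_w = rng.standard_normal((n_experts, d_model, d_model))

def moe_layer(x):
    """Route each token to its top-2 experts and return the gate-weighted mix of their outputs."""
    logits = x @ router_w                           # (n_tokens, n_experts) gating scores
    top = np.argsort(logits, axis=-1)[:, -top_k:]   # indices of the 2 highest-scoring experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, top[t]]
        weights = np.exp(sel - sel.max())
        weights /= weights.sum()                    # softmax over the selected experts only
        for w, e in zip(weights, top[t]):
            out[t] += w * (x[t] @ expert_w[e])      # only 2 of the 8 experts run for this token
    return out

x = rng.standard_normal((n_tokens, d_model))
print(moe_layer(x).shape)   # (4, 8)
```
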

It was released via [X](https://twitter.com/MistralAI/status/1777869263778291896).

#moe

Specifications

| Field | Value |
| --- | --- |
| Provider | Mistral AI |
| Context Length | 65,536 tokens |
| Input Types | Text |
| Output Types | Text |
| Category | Mistral |
| Added | April 10, 2024 |
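
For reference, the sketch below shows one way to query a model like this through an OpenAI-compatible completions endpoint. The base URL, model identifier, and API key are placeholders, not values confirmed by this page; since this is the base (non-instruct) variant, plain text completion is assumed rather than chat.

```python
import requests

# Hypothetical endpoint and model ID -- substitute your provider's actual values.
BASE_URL = "https://api.example.com/v1"
MODEL_ID = "mistral/mixtral-8x22b"
API_KEY = "YOUR_API_KEY"

resp = requests.post(
    f"{BASE_URL}/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": MODEL_ID,
        "prompt": "The three laws of robotics are",
        "max_tokens": 128,   # prompt plus completion must fit within the 65,536-token context window
        "temperature": 0.7,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```
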

Use Mixtral 8x22B (base) and 200+ more models

Access all the best AI models in one platform. No API keys, no switching between apps.