
Zephyr 141B-A35B

HuggingFaceH4
Text Generation
About Zephyr 141B-A35B

Zephyr 141B-A35B is a Mixture of Experts (MoE) model with 141B total parameters and 35B active parameters, fine-tuned on a mix of publicly available and synthetic datasets.

It is an instruct finetune of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b).

#moe
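
Because the weights are openly available, one quick way to try the model locally is through the Transformers text-generation pipeline. Below is a minimal sketch, assuming the `HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1` checkpoint ID on the Hugging Face Hub and enough GPU memory to host the full 141B-parameter weights in bfloat16:

```python
# Minimal sketch: running Zephyr 141B-A35B with the Transformers pipeline.
# Assumes the Hub checkpoint ID "HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1"
# and sufficient GPU memory; device_map="auto" shards across available GPUs.
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1",  # assumed checkpoint ID
    device_map="auto",           # shard the weights across available devices
    torch_dtype=torch.bfloat16,  # only 35B params are active per token, but all 141B must be loaded
)

# Chat-style input; the pipeline applies the model's chat template.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain what a Mixture of Experts model is."},
]

outputs = pipe(messages, max_new_tokens=256, do_sample=True, temperature=0.7)
# With chat input, generated_text holds the full conversation; the last
# message is the assistant's reply.
print(outputs[0]["generated_text"][-1]["content"])
```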

Specifications

| Field          | Value          |
| -------------- | -------------- |
| Provider       | HuggingFaceH4  |
| Context Length | 65,536 tokens  |
| Input Types    | text           |
| Output Types   | text           |
| Category       | Mistral        |
| Added          | April 12, 2024 |
