About Zephyr 141B-A35B
Zephyr 141B-A35B is a Mixture of Experts (MoE) model with 141B total parameters and 35B active parameters, fine-tuned on a mix of publicly available and synthetic datasets.
It is an instruct finetune of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b).
#moe
Specifications
- Provider: HuggingFaceH4
- Context Length: 65,536 tokens
- Input Types: text
- Output Types: text
- Category: Mistral
- Added: 4/12/2024
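
A minimal sketch of calling this model through an OpenAI-compatible chat-completions endpoint. The base URL, API key, and model identifier below are placeholders/assumptions, not values confirmed by this page; substitute the ones published by your provider.

```python
# Minimal sketch: text-in, text-out chat completion against Zephyr 141B-A35B.
# Endpoint, credential, and model ID are assumptions for illustration only.
from openai import OpenAI

client = OpenAI(
    base_url="https://example-provider.com/api/v1",  # assumed endpoint
    api_key="YOUR_API_KEY",                          # assumed credential
)

response = client.chat.completions.create(
    model="huggingfaceh4/zephyr-141b-a35b",  # assumed model identifier
    messages=[
        {"role": "user", "content": "Explain what a Mixture of Experts model is."}
    ],
    max_tokens=256,  # well within the 65,536-token context length listed above
)
print(response.choices[0].message.content)
```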