
Dolphin 2.6 Mixtral 8x7B 🐬

Cognitivecomputations · Text Generation
About Dolphin 2.6 Mixtral 8x7B 🐬

This is a 16k-context fine-tune of [Mixtral-8x7b](/models/mistralai/mixtral-8x7b). It excels at coding tasks thanks to extensive training on coding data, and it is highly obedient, though it has not been DPO-tuned.
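
In practice, models like this are usually reached through an OpenAI-compatible chat API. The sketch below shows a coding prompt sent that way; the base URL and the model identifier are illustrative assumptions, not confirmed values for any particular provider.

```python
# Minimal sketch: sending a coding prompt through an OpenAI-compatible client.
# The base_url and model identifier below are assumptions for illustration.
from openai import OpenAI

client = OpenAI(
    base_url="https://example-provider.com/v1",  # assumed endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="cognitivecomputations/dolphin-2.6-mixtral-8x7b",  # assumed model ID
    messages=[
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": "Write a Python function that reverses a linked list."},
    ],
    max_tokens=512,
)
print(response.choices[0].message.content)
```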

The model is uncensored: alignment and bias have been stripped from its training data, which makes it highly compliant. Implement your own alignment layer before exposing it as a service, and use it responsibly; the rationale behind uncensored models is discussed at [erichartford.com/uncensored-models](https://erichartford.com/uncensored-models).
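
One way to provide such an alignment layer is to screen both the request and the completion around the raw model call. The sketch below illustrates the pattern only; the keyword blocklist and the `generate` stand-in are placeholder assumptions, not a production safety system.

```python
# Minimal sketch of an external alignment layer wrapped around generation.
# The blocklist check is a placeholder; a real deployment would use a
# dedicated moderation model or policy engine.

BLOCKED_TOPICS = ("malware", "weapons")  # illustrative placeholder list

def generate(prompt: str) -> str:
    """Stand-in for the actual model call."""
    raise NotImplementedError

def aligned_generate(prompt: str) -> str:
    # Screen the request before it reaches the uncensored model.
    if any(topic in prompt.lower() for topic in BLOCKED_TOPICS):
        return "Request declined by the alignment layer."
    completion = generate(prompt)
    # Screen the output before it reaches the user.
    if any(topic in completion.lower() for topic in BLOCKED_TOPICS):
        return "Response withheld by the alignment layer."
    return completion
```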

#moe #uncensored

Specifications

- Provider: Cognitivecomputations
- Context Length: 32,768 tokens
- Input Types: text
- Output Types: text
- Category: Mistral
- Added: 12/21/2023
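
Note that the 32,768-token limit covers the prompt and the completion together, so long conversations need trimming. The sketch below shows one simple budgeting approach; the four-characters-per-token estimate is a rough assumption, not the model's actual tokenizer.

```python
# Minimal sketch: trimming chat history to fit a fixed context window.
# Assumes ~4 characters per token as a rough estimate; a real client
# should use the model's actual tokenizer for exact counts.

CONTEXT_LIMIT = 32_768      # tokens, per the specification above
RESERVED_FOR_REPLY = 1_024  # tokens kept free for the completion

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)  # crude heuristic, assumption only

def trim_history(messages: list[dict]) -> list[dict]:
    """Drop the oldest messages until the prompt fits the budget."""
    budget = CONTEXT_LIMIT - RESERVED_FOR_REPLY
    kept: list[dict] = []
    # Walk from newest to oldest, keeping what fits.
    for message in reversed(messages):
        cost = estimate_tokens(message["content"])
        if budget - cost < 0:
            break
        budget -= cost
        kept.append(message)
    return list(reversed(kept))
```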
