
Noromaid Mixtral 8x7B Instruct

Neversleep
Text Generation
About Noromaid Mixtral 8x7B Instruct

This model was trained for 8 hours (v1) + 8 hours (v2) + 12 hours (v3) on customized datasets, focusing on roleplay (RP), uncensoring, and a modified version of the Alpaca prompting already used in LimaRP. It should offer the same conversational quality as ChatML or Llama2-Chat formats without adding any additional special tokens.
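The exact prompt template is not documented on this page, but a minimal sketch of the Alpaca-style prompting the description refers to might look like the following. The field labels and layout are assumptions based on the standard Alpaca format, not a confirmed specification for this model.

```python
# Minimal sketch of an Alpaca-style prompt, as referenced in the description above.
# The exact template Noromaid Mixtral expects is an assumption, not confirmed by this page.
def build_alpaca_prompt(instruction: str, user_input: str = "") -> str:
    """Format a single-turn prompt in the classic Alpaca instruction/response layout."""
    if user_input:
        return (
            "### Instruction:\n"
            f"{instruction}\n\n"
            "### Input:\n"
            f"{user_input}\n\n"
            "### Response:\n"
        )
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )


if __name__ == "__main__":
    # Example: a simple roleplay-style instruction with no separate input block.
    print(build_alpaca_prompt("Continue the roleplay scene as the narrator."))
```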

Specifications
Provider: Neversleep
Context Length: 8,000 tokens
Input Types: text
Output Types: text
Category: Mistral
Added: 1/2/2024
