About Noromaid Mixtral 8x7B Instruct
This model was trained for 8h (v1) + 8h (v2) + 12h (v3) on customized datasets, focusing on RP, uncensoring, and a modified version of the Alpaca prompt format (the one already used in LimaRP). It should be at the same conversational level as ChatML or Llama2-Chat without adding any additional special tokens.
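The exact template is not reproduced here, but as a rough illustration, below is a minimal sketch of an Alpaca-style prompt builder in the spirit of the modified LimaRP format. The section headers and field layout are assumptions for illustration, not the model's documented template.

```python
# Minimal sketch of an Alpaca-style prompt builder, in the spirit of the
# modified LimaRP format described above. The "### Instruction" / "### Input"
# / "### Response" headers are assumed for illustration; check the model's
# own card for the exact template.

def build_prompt(system: str, user_message: str) -> str:
    """Assemble a plain-text Alpaca-style prompt; no special tokens are added."""
    return (
        f"### Instruction:\n{system}\n\n"
        f"### Input:\n{user_message}\n\n"
        f"### Response:\n"
    )


if __name__ == "__main__":
    prompt = build_prompt(
        system="You are Noromaid, a roleplay partner.",
        user_message="Describe the tavern we just entered.",
    )
    print(prompt)
```

Because the format is plain text, no tokenizer changes are needed; the prompt is passed to the model as an ordinary string.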
Specifications
- Provider: NeverSleep
- Context Length: 8,000 tokens
- Input Types: text
- Output Types: text
- Category: Mistral
- Added: 1/2/2024
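For reference, here is a minimal, hedged sketch of loading the model with Hugging Face transformers and keeping generation within the advertised 8,000-token context window. The repository ID below is an assumption and should be verified against NeverSleep's Hugging Face page.

```python
# Hypothetical usage sketch with Hugging Face transformers. The repo ID is an
# assumption; verify it on NeverSleep's Hugging Face organization page.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "NeverSleep/Noromaid-v0.1-mixtral-8x7b-Instruct-v3"  # assumed repo ID
MAX_CONTEXT = 8000  # advertised context length in tokens

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

prompt = "### Instruction:\nYou are a roleplay partner.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep the prompt plus the generated continuation inside the context window.
output = model.generate(
    **inputs,
    max_new_tokens=min(512, MAX_CONTEXT - inputs["input_ids"].shape[1]),
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```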