About Capybara 34B
This model was trained for 3 epochs on the Capybara dataset, using Yi-34B as the base model. It is the first 34B Nous model and the first Nous model with a 200K context length. A usage sketch follows the specifications below.
Specifications
- Provider: NousResearch
- Context Length: 200,000 tokens
- Input Types: text
- Output Types: text
- Category: Llama2
- Added: 11/15/2023
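
A minimal sketch of running the model locally with Hugging Face transformers. The repository id `NousResearch/Nous-Capybara-34B` and the `USER:`/`ASSISTANT:` prompt format are assumptions, not confirmed by this page; adjust both to match the actual model card.

```python
# Sketch: load Capybara 34B and generate a reply (assumes transformers,
# torch, and accelerate are installed and enough GPU/CPU memory is available).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "NousResearch/Nous-Capybara-34B"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    device_map="auto",   # spread layers across available devices
    torch_dtype="auto",  # use the checkpoint's native precision
)

# Capybara-style models are commonly prompted with USER:/ASSISTANT: turns (assumption).
prompt = "USER: Summarize the benefits of a 200K-token context window. ASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)

# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

The 200K context length applies to the tokenized prompt plus generated tokens; very long inputs mainly require enough memory for the attention cache rather than any code changes.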