About LLaVA v1.6 34B
LLaVA Yi 34B is an open-source model trained by fine-tuning an LLM on multimodal instruction-following data. It is an auto-regressive language model based on the transformer architecture. Base LLM: [NousResearch/Nous-Hermes-2-Yi-34B](/models/nousresearch/nous-hermes-yi-34b)
It was trained in December 2023.
Specifications
- Provider: Liuhaotian
- Context Length: 4,096 tokens
- Input Types: text, image
- Output Types: text
- Category: Yi
- Added: 5/11/2024
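Since the model accepts text and image inputs and returns text, a request to it typically follows the OpenAI-style multimodal chat format. A minimal sketch of building such a request payload, assuming a hypothetical model identifier (the exact ID and endpoint depend on your deployment):

```python
import base64

# Hypothetical model identifier for illustration; adjust for your deployment.
MODEL_ID = "liuhaotian/llava-v1.6-34b"

def build_chat_request(prompt: str, image_bytes: bytes, max_tokens: int = 512) -> dict:
    """Build an OpenAI-style chat payload with text plus an inline base64 image."""
    image_b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": MODEL_ID,
        # Prompt, image tokens, and output must all fit in the 4,096-token context.
        "max_tokens": max_tokens,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": prompt},
                    {
                        "type": "image_url",
                        "image_url": {"url": f"data:image/png;base64,{image_b64}"},
                    },
                ],
            }
        ],
    }

payload = build_chat_request("Describe this image.", b"\x89PNG...")
```

The payload can then be POSTed to whatever chat-completions endpoint serves the model; only the text response comes back, matching the model's text-only output type.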