About Jamba 1.5 Large
Jamba 1.5 Large is part of AI21's family of open models, designed for speed, efficiency, and output quality.
It features a 256K effective context window, which AI21 reports as the longest among open models at release, improving performance on long-context tasks such as document summarization and analysis.
Built on a hybrid SSM-Transformer architecture, it outperforms comparable models such as Llama 3.1 70B on benchmarks while remaining resource-efficient.
Read their [announcement](https://www.ai21.com/blog/announcing-jamba-model-family) to learn more.
Specifications
- Provider: AI21 Labs
- Context Length: 256,000 tokens
- Input Types: text
- Output Types: text
- Category: Other
- Added: 8/23/2024
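The long context window means a sizable document can be passed inline in a single request rather than chunked. A minimal sketch of assembling an OpenAI-style chat-completions payload for this model follows; the model identifier `ai21-jamba-1.5-large` and the payload shape are assumptions here, so check your provider's catalog and API reference for the exact values.

```python
import json

# Assumed model identifier; the exact string varies by hosting provider.
MODEL_ID = "ai21-jamba-1.5-large"

def build_chat_request(document: str, question: str, max_tokens: int = 1024) -> dict:
    """Assemble an OpenAI-style chat-completions payload.

    With a 256K-token context window, a long document can often be
    included inline in the user message instead of being split up.
    """
    return {
        "model": MODEL_ID,
        "messages": [
            {"role": "system", "content": "You summarize documents accurately."},
            {"role": "user", "content": f"Document:\n{document}\n\nQuestion: {question}"},
        ],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("(long report text here)", "Summarize the key findings.")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to your provider's chat-completions endpoint with an appropriate API key.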