About Jamba Large 1.7
Jamba Large 1.7 is the latest model in the Jamba open family, offering improvements in grounding, instruction-following, and overall efficiency. Built on a hybrid SSM-Transformer architecture with a 256K context window, it delivers more accurate, contextually grounded responses and better steerability than previous versions.
Specifications
- Provider: AI21 Labs
- Context Length: 256,000 tokens
- Input Types: text
- Output Types: text
- Category: Other
- Added: 8/8/2025
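
As a rough illustration of how these specifications translate into practice, the sketch below calls the model through an OpenAI-compatible chat-completions endpoint, a pattern many hosting providers expose. The base URL, the API-key environment variable, and the "jamba-large-1.7" model identifier are assumptions for illustration, not values documented on this page; substitute whatever your provider specifies.

```python
# Minimal sketch of querying Jamba Large 1.7 via an OpenAI-compatible
# chat-completions API. The base_url, env var, and model id are assumptions;
# replace them with the values your hosting provider documents.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://example-inference-endpoint/v1",  # hypothetical endpoint
    api_key=os.environ["INFERENCE_API_KEY"],           # hypothetical env var
)

response = client.chat.completions.create(
    model="jamba-large-1.7",  # assumed model identifier
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": "Summarize the key findings of the attached report."},
    ],
    max_tokens=1024,   # cap on generated tokens for this request
    temperature=0.4,
)

print(response.choices[0].message.content)
```

Because the model accepts and produces text only, a plain chat-completions call like this covers the full interface; no image or audio parameters apply.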