About GLM-4-9B-0414
GLM-4-9B-0414 is a 9 billion parameter language model from the GLM-4 series developed by THUDM. Trained using the same reinforcement learning and alignment strategies as its larger 32B counterparts, GLM-4-9B-0414 achieves high performance relative to its size, making it suitable for resource-constrained deployments that still require robust language understanding and generation capabilities.
Specifications
- Provider: THUDM
- Context Length: 32,000 tokens
- Input Types: text
- Output Types: text
- Category: Other
- Added: 4/25/2025
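
As a rough usage sketch (not part of the official model card), the model can typically be loaded with the Hugging Face transformers library. The repository id below is an assumption and should be verified on the provider's model page; depending on your transformers version, additional flags such as trust_remote_code may be needed.

```python
# Minimal inference sketch for GLM-4-9B-0414 via Hugging Face transformers.
# NOTE: the repository id "THUDM/GLM-4-9B-0414" is an assumption; confirm it
# on the provider's model hub page before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "THUDM/GLM-4-9B-0414"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Text in, text out; the prompt plus generated tokens must fit within the
# model's 32,000-token context window.
messages = [{"role": "user", "content": "Summarize the GLM-4 series in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```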