About DeepSeek R1T Chimera (free)
DeepSeek-R1T-Chimera was created by merging DeepSeek-R1 and DeepSeek-V3 (0324), combining R1's reasoning capabilities with V3's token-efficiency improvements. It is based on the DeepSeek-MoE Transformer architecture and is optimized for general text-generation tasks.
The merge combines pretrained weights from both source models to balance performance across reasoning, efficiency, and instruction following. The model is released under the MIT license and is intended for research and commercial use.
Specifications
- Provider: TNG Technology Consulting (tngtech)
- Context Length: 163,840 tokens
- Input Types: text
- Output Types: text
- Category: DeepSeek
- Added: April 27, 2025
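
Since the model exposes a text-in, text-out chat interface, a request to it can be sketched as an OpenAI-compatible chat-completion payload. This is a minimal sketch: the endpoint URL and the model slug `tngtech/deepseek-r1t-chimera:free` are assumptions based on typical provider naming, not confirmed by this page.

```python
import json

# Assumed OpenAI-compatible endpoint and model slug (hypothetical, verify
# against the provider's own documentation before use).
API_URL = "https://openrouter.ai/api/v1/chat/completions"
MODEL_ID = "tngtech/deepseek-r1t-chimera:free"

def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Assemble the JSON payload; input and output are text only."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_request("Summarize the MIT license in one sentence.")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the endpoint with an `Authorization: Bearer <key>` header; keep `max_tokens` well under the 163,840-token context window once the prompt is accounted for.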