Qwen: Qwen3 235B A22B Instruct 2507

Qwen (Alibaba) · Text Generation
About Qwen3 235B A22B Instruct 2507

Qwen3-235B-A22B-Instruct-2507 is a multilingual, instruction-tuned mixture-of-experts language model based on the Qwen3-235B architecture, with 22B active parameters per forward pass. It is optimized for general-purpose text generation, including instruction following, logical reasoning, math, code, and tool usage. The model supports a native 262K context length and does not implement "thinking mode" (<think> blocks).
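
Access details are not part of this listing, so the following is a minimal sketch assuming an OpenAI-compatible chat completions endpoint (OpenRouter is used purely for illustration) and an assumed model slug; substitute the identifier your provider actually lists for Qwen3 235B A22B Instruct 2507.

```python
# Minimal sketch: calling the model through an OpenAI-compatible chat
# completions endpoint. The base URL and the slug "qwen/qwen3-235b-a22b-2507"
# are assumptions for illustration; check your provider's model list.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="qwen/qwen3-235b-a22b-2507",  # assumed slug for this model
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the trade-offs of mixture-of-experts models."},
    ],
    max_tokens=512,
)

# The instruct variant replies directly; there are no <think> blocks to strip.
print(response.choices[0].message.content)
```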

Compared to its base variant, this version delivers significant gains in knowledge coverage, long-context reasoning, coding benchmarks, and performance on open-ended tasks. It is particularly strong in multilingual understanding, math reasoning (e.g., AIME, HMMT), and alignment evaluations such as Arena-Hard and WritingBench.

Specifications

Provider: Qwen (Alibaba)
Context Length: 262,144 tokens
Input Types: text
Output Types: text
Category: Qwen3
Added: 7/21/2025
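
As a small illustration of the context-length figure above, the sketch below counts prompt tokens against the 262,144-token window. The Hugging Face repo name and the 4,096-token output reserve are assumptions, not details taken from this listing.

```python
# Sketch: verifying that a prompt fits in the 262,144-token context window.
# The Hugging Face repo name below is an assumption; substitute the tokenizer
# your provider documents for this model.
from transformers import AutoTokenizer

CONTEXT_LIMIT = 262_144  # native context length from the spec above

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen3-235B-A22B-Instruct-2507")

def fits_in_context(prompt: str, reserved_for_output: int = 4_096) -> bool:
    """Return True if the prompt plus a reserved output budget fits the window."""
    n_tokens = len(tokenizer.encode(prompt))
    return n_tokens + reserved_for_output <= CONTEXT_LIMIT

print(fits_in_context("Summarize this document: ..."))
```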
