# Qwen3 Coder 30B A3B Instruct

## Model Overview
Qwen3-Coder-30B-A3B-Instruct delivers strong performance and efficiency, with the following key features:
- Strong performance among open models on agentic coding, agentic browser use, and other foundational coding tasks.
- Long-context capabilities with native support for 256K tokens, extendable up to 1M tokens using YaRN, optimized for repository-scale understanding.
- Agentic coding support for most platforms, such as Qwen Code and CLINE, with a specially designed function-call format.
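The YaRN extension mentioned above is typically enabled through the model's `rope_scaling` configuration. A minimal sketch of the arithmetic, assuming a hypothetical scaling factor of 4.0 (the exact factor and field names should be taken from the official model card, not from this illustration):

```python
# Illustrative only: extending the native 262,144-token window with YaRN.
# The factor and key names below are assumptions for demonstration.
rope_scaling = {
    "rope_type": "yarn",
    "factor": 4.0,  # hypothetical: ~4x the native window
    "original_max_position_embeddings": 262144,
}

extended_context = int(
    rope_scaling["factor"] * rope_scaling["original_max_position_embeddings"]
)
print(extended_context)  # 1048576, i.e. ~1M tokens
```

A factor of 4.0 over the native 262,144-token window yields 1,048,576 tokens, matching the ~1M-token figure above.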
## Model Architecture
- Type: Causal Language Model (CLM)
- Number of Parameters: 30.5B in total and 3.3B activated
- Number of Layers: 48
- Number of Attention Heads (GQA): 32 for Q and 4 for KV
- Number of Experts: 128
- Number of Activated Experts: 8
- Context Length: 262,144 tokens natively
- Model Source: Qwen/Qwen3-Coder-30B-A3B-Instruct
- License: apache-2.0
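The figures above can be sanity-checked with some back-of-envelope arithmetic (an illustrative sketch, using only the numbers listed in this section):

```python
# GQA: 32 query heads share 4 KV heads, so each KV head serves a group
# of query heads; MoE: only 8 of 128 experts are routed per token.
q_heads, kv_heads = 32, 4
experts_total, experts_active = 128, 8

queries_per_kv_head = q_heads // kv_heads            # GQA group size
active_expert_fraction = experts_active / experts_total

print(queries_per_kv_head)      # 8 query heads per KV head
print(active_expert_fraction)   # 0.0625, i.e. 6.25% of experts per token
```

This sparse expert activation is why only ~3.3B of the 30.5B total parameters are active for any given token.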
## QPC Configurations
| Precision | SoCs / Tensor Slicing | NSP Cores (per SoC) | Full Batch Size | Chunking Prompt Length | Context Length (CL) | QPC URL | QPC Size | QPC Download | ONNX URL | ONNX Download | Generation Date |
|---|---|---|---|---|---|---|---|---|---|---|---|
| MXFP6 | 4 | 16 | 1 | 128 | 32768 | https://dc00tk1pxen80.cloudfront.net/SDK1.21.4.0/Qwen/Qwen3-Coder-30B-A3B-Instruct/Qwen3_Coder_30B_A3B_Instruct_qpc_16cores_1bs_[2048,4096,8192,12288,16384,24576,32768]ccl_4devices_mxfp6_mxint8.tar.gz | 51GB | Download | http://qualcom-qpc-models.s3-website-us-east-1.amazonaws.com/SDK1.21.4.0/Qwen/Qwen3-Coder-30B-A3B-Instruct/Qwen3_Coder_30B_A3B_Instruct_ONNX.tar.gz | Download | 5-May-2026 |