What happened
Xiaomi's MiMo team revealed MiMo-V2-Pro on March 18, 2026: a 1-trillion-parameter Mixture-of-Experts foundation model that activates 42 billion parameters per inference. A week earlier, an anonymous model codenamed "Hunter Alpha" had appeared on OpenRouter with no press release or developer attribution. It topped the daily usage charts for multiple days and processed over 1 trillion tokens before Xiaomi confirmed it was an early build of MiMo-V2-Pro. The model uses a 7:1 hybrid attention ratio, supports a 1M-token context window, and is positioned as the native brain for OpenClaw agent frameworks. On SWE-bench Verified, it reportedly outperforms Claude Sonnet 4.6, and its general agent performance on ClawEval approaches Opus 4.6. The model is currently free to use on the MiMo platform and available on OpenRouter at $0.30 per million tokens.
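Since the model is served through OpenRouter, it can be called via OpenRouter's OpenAI-compatible chat completions endpoint. A minimal sketch follows; note the model slug `xiaomi/mimo-v2-pro` is a guess for illustration (check OpenRouter's model list for the real identifier), and the request is only sent if an API key is present in the environment.

```python
import json
import os
import urllib.request

# OpenRouter exposes an OpenAI-compatible chat completions endpoint.
ENDPOINT = "https://openrouter.ai/api/v1/chat/completions"
MODEL_SLUG = "xiaomi/mimo-v2-pro"  # hypothetical slug, for illustration only

payload = {
    "model": MODEL_SLUG,
    "messages": [
        {"role": "user", "content": "Summarize this repo's failing tests."}
    ],
    "max_tokens": 512,
}

# At the listed $0.30 per million tokens, a rough output-cost ceiling
# for this single call (512 tokens max) is:
est_cost_usd = (512 / 1_000_000) * 0.30

api_key = os.environ.get("OPENROUTER_API_KEY")
if api_key:
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
else:
    # No key set: just show the request body that would be sent.
    print(json.dumps(payload, indent=2))
```

The same payload shape works for any OpenRouter-hosted model, which is part of why anonymous "stealth" models like Hunter Alpha can accumulate real usage there with zero integration work from developers.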
Why it matters
The stealth launch strategy, letting a model prove itself anonymously before claiming credit, is a new pattern in the model release playbook, and it worked: Hunter Alpha accumulated genuine usage data and developer credibility before the reveal. More substantively, MiMo-V2-Pro signals that Chinese AI labs are shipping competitive trillion-parameter, agent-focused models at extremely aggressive price points. The model's explicit optimization for agentic workloads (tool calling, multi-step task completion, OpenClaw integration) reflects the industry's shift from chat-first to agent-first model design.
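The "agent-first" optimization above centers on tool calling: in OpenAI-compatible APIs (the convention OpenRouter follows), tools are declared as JSON schemas attached to the request, and the model can respond with a structured tool call instead of free text. A minimal sketch, using a hypothetical `get_ci_status` tool and the same illustrative model slug as above:

```python
# Declare a tool the model is allowed to invoke. The name, description,
# and schema here are invented for illustration.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_ci_status",
            "description": "Fetch the latest CI run status for a repo.",
            "parameters": {
                "type": "object",
                "properties": {
                    "repo": {"type": "string", "description": "owner/name"}
                },
                "required": ["repo"],
            },
        },
    }
]

# Attached to a chat request, this lets the model decide between a plain
# text answer and a structured call to get_ci_status.
request_body = {
    "model": "xiaomi/mimo-v2-pro",  # hypothetical slug
    "messages": [{"role": "user", "content": "Is CI green on main?"}],
    "tools": tools,
}
```

Benchmarks like SWE-bench Verified and ClawEval exercise exactly this loop: the model issues tool calls, reads the results, and chains steps toward a task outcome rather than producing a single chat reply.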
Who should pay attention
- Developers building AI agents who want a cost-effective frontier model
- Teams evaluating alternatives to Anthropic and OpenAI for coding tasks
- Anyone tracking the competitive dynamics between Chinese and Western AI labs