Moonshot AI Unveils Kimi K2: A 1T-Parameter MoE Language Model (32B Activated) with Powerful Agentic Capabilities

2025-07-13

Moonshot AI has released Kimi K2, a state-of-the-art Mixture-of-Experts (MoE) language model with 1 trillion total parameters, of which 32 billion are activated. Trained with the Muon optimizer, Kimi K2 excels at frontier knowledge, reasoning, and coding tasks, and is carefully optimized for agentic capabilities. It comes in two versions: Kimi-K2-Base, a foundation model for researchers, and Kimi-K2-Instruct, a ready-to-use instruction-following model with robust tool calling that autonomously decides when and how to use tools. The model weights are open-sourced, and an API is available.
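Because Kimi-K2-Instruct exposes tool calling through the API, the sketch below illustrates what such a request might look like, assuming an OpenAI-compatible endpoint. The base URL, model identifier, environment variable, and the `get_weather` tool are illustrative assumptions, not confirmed details of the service.

```python
# Minimal sketch: asking Kimi-K2-Instruct a question while offering it one tool.
# Endpoint URL, model id, env var name, and the get_weather tool are assumptions.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["MOONSHOT_API_KEY"],  # assumed environment variable
    base_url="https://api.moonshot.ai/v1",   # assumed OpenAI-compatible endpoint
)

# A hypothetical tool the model may choose to call.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="kimi-k2-instruct",  # assumed model identifier
    messages=[{"role": "user", "content": "What's the weather in Beijing right now?"}],
    tools=tools,
    tool_choice="auto",  # the model decides whether and how to call the tool
)

# If the model decided to use the tool, the call appears in tool_calls;
# otherwise the answer is plain text in message.content.
print(response.choices[0].message)
```

In this pattern the application only declares what tools exist; whether to invoke `get_weather`, and with which arguments, is left to the model, which is the "autonomous" part of the agentic behavior described above.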