Gemma 4 26B A4B (free)
Gemma 4 26B A4B IT is an instruction-tuned Mixture-of-Experts (MoE) model developed by Google DeepMind. It has 25.2 billion total parameters, of which only 3.8 billion are activated per token during inference, delivering output quality comparable to 31-billion-parameter models while keeping inference compute low. The model accepts multimodal inputs, including text, images, and video, and offers advanced capabilities such as a 256K-token context window, native function calling, configurable reasoning modes, and structured output generation.
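As a sketch of how the function-calling capability might be exercised through an OpenAI-compatible chat endpoint such as OpenRouter's, the snippet below builds a request payload that declares one tool the model may invoke. The model slug and the `get_weather` tool are assumptions for illustration; check the provider's model list for the exact identifier.

```python
import json

# Assumed model slug -- verify against the provider's model list.
MODEL = "google/gemma-4-26b-a4b-it:free"

def build_request(prompt: str) -> dict:
    """Build an OpenAI-compatible chat payload that declares one tool."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        # Native function calling: describe a tool the model may choose to call.
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool for illustration
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
    }

payload = build_request("What's the weather in Lagos?")
print(json.dumps(payload, indent=2))
```

The payload would be POSTed to the provider's chat completions endpoint with an API key; the response may then contain a tool call for the client to execute.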
Data enriched Apr 24, 2026. Pricing from OpenRouter API.