
zai-org/
Pricing (per 1M tokens): $0.43 input, $1.75 output, $0.08 cached
GLM-4.7 is a state-of-the-art, multilingual Mixture-of-Experts (MoE) language model designed for complex reasoning, agentic coding, and tool use. Building on its predecessor GLM-4.6, it delivers significant improvements across key benchmarks, including multilingual SWE-bench, Terminal Bench, and reasoning-heavy evaluations like HLE. The model features advanced "Interleaved Thinking" and new "Preserved Thinking" modes, allowing it to reason before actions and maintain consistency across long, multi-turn tasks. With 358 billion parameters, GLM-4.7 excels in generating clean code, modern UI elements, and sophisticated reasoning outputs.
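The listed rates ($0.43 input, $1.75 output, $0.08 cached, each per 1M tokens) make per-request cost easy to estimate. A minimal sketch, assuming cached tokens are billed at the cached rate in place of the full input rate (the exact billing semantics are not stated on this page):

```python
# Cost estimate from the per-1M-token prices listed on this page.
IN_PRICE = 0.43 / 1_000_000      # $ per input token
OUT_PRICE = 1.75 / 1_000_000     # $ per output token
CACHED_PRICE = 0.08 / 1_000_000  # $ per cached input token

def request_cost(input_tokens: int, output_tokens: int, cached_tokens: int = 0) -> float:
    """Return the estimated USD cost of one request.

    Assumption: cached tokens are a subset of input tokens and are
    billed at the cached rate instead of the full input rate.
    """
    billed_input = input_tokens - cached_tokens
    return (billed_input * IN_PRICE
            + cached_tokens * CACHED_PRICE
            + output_tokens * OUT_PRICE)

# e.g. 10k input tokens (2k of them cached) and 1k output tokens
print(round(request_cost(10_000, 1_000, cached_tokens=2_000), 6))
```

At these rates, output tokens dominate cost for generation-heavy workloads, while prompt caching cuts the input side by roughly 80% on cache hits.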

© 2025 Deep Infra. All rights reserved.