Compared with GLM-4.5, this generation brings several key improvements. Longer context window: the context window has been expanded from 128K to 200K tokens, enabling the model to handle more complex agentic tasks.
| Model | Context | Max output | Pricing (input / output, per 1M tokens) |
|---|---|---|---|
| Z.ai: GLM 4.6V | 131K | 24K | $0.30 / $0.90 |
| Z.ai: GLM 4.6 | 203K | 131K | $0.43 / $1.74 |
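As a usage sketch, a single-turn request to GLM 4.6 through OpenRouter's chat completions endpoint might look like the following. This assumes an `OPENROUTER_API_KEY` environment variable and uses the `z-ai/glm-4.6` model slug from the listing above; `build_request` is an illustrative helper, not part of any SDK.

```python
import json
import os
import urllib.request

# OpenRouter's OpenAI-compatible chat completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"


def build_request(prompt: str, model: str = "z-ai/glm-4.6") -> dict:
    """Assemble the JSON payload for a single-turn chat completion.

    Illustrative helper: the payload shape (model + messages) follows the
    OpenAI-compatible schema that OpenRouter accepts.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


payload = build_request("Summarize the GLM-4.6 context-window change.")

api_key = os.environ.get("OPENROUTER_API_KEY")
if api_key:
    # Send the request only when a key is configured.
    req = urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
else:
    # Without a key, just show the payload that would be sent.
    print(json.dumps(payload, indent=2))
```

Billing follows the per-token rates in the table, so long prompts that exploit the 200K context are charged at the same input rate as short ones.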