modelstop.top

Z.ai: GLM 4.6

Compared with GLM-4.5, this generation brings several key improvements, starting with a longer context window: expanded from 128K to 200K tokens, enabling the model to handle more complex...

Best for

Bulk Data Extraction · High-Volume Tasks · Long Documents · Book Summarisation
Context Window
205K tokens ≈ 455 pages of text
Input Cost
$0.39/1M
Output Cost
$1.90/1M
Latency p50

Pricing Details

Standard Pricing
Input (per 1M tokens)
$0.39
Output (per 1M tokens)
$1.90
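As a quick sketch of how the per-1M-token rates above translate into a request cost, the helper below hardcodes the listed prices ($0.39 input, $1.90 output); the function name and defaults are illustrative, not part of any official SDK.

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_rate: float = 0.39, output_rate: float = 1.90) -> float:
    """Estimate USD cost from per-1M-token rates (defaults from the table above)."""
    return (input_tokens / 1_000_000) * input_rate + \
           (output_tokens / 1_000_000) * output_rate

# Example: feeding a ~200K-token document and getting a ~2K-token summary back
cost = estimate_cost(200_000, 2_000)  # 0.078 + 0.0038 ≈ $0.08
```

At these rates, long-document workloads are dominated by input cost, which is why the "Best for" tags above emphasize bulk extraction and summarisation.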

Hallucination Score™ (est.)

Community reliability estimate · not official

Not yet rated

About this score: Community-estimated based on user reports and publicly available benchmark data (e.g. TruthfulQA). This is not an official score from the model provider. Scores may be inaccurate — always verify with the official leaderboard before making production decisions.

Price History

Not enough historical data yet. Check back after the next pricing sync.

Provider

z-ai

Community Prompts

Proven prompts shared by the community for this model
