
Compare Models

Run side-by-side checks for pricing, context window, and latency.

Baidu: ERNIE 4.5 21B A3B

baidu

A text-based Mixture-of-Experts (MoE) model from the ERNIE 4.5 family, with 21B total parameters and 3B activated per token. The family's heterogeneous MoE structures and modality-isolated routing were designed for multimodal training; this variant targets text understanding and generation.

Context window
120,000 tokens
Input cost
$0.07 / 1M tokens
Output cost
$0.28 / 1M tokens
Latency (p50)
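The listed per-million-token prices can be turned into a per-request estimate with simple arithmetic. A minimal sketch (the function name and default rates mirror this card's pricing; swap in another model's rates to compare):

```python
def request_cost_usd(input_tokens: int, output_tokens: int,
                     input_per_m: float = 0.07,
                     output_per_m: float = 0.28) -> float:
    """Estimate the cost of one request in USD.

    Prices are quoted per 1M tokens, so divide each token
    count by 1e6 before multiplying by the listed rate.
    """
    return (input_tokens / 1e6) * input_per_m + (output_tokens / 1e6) * output_per_m

# Example: a prompt that nearly fills the 120,000-token context
# window plus a short 1,000-token completion.
print(f"${request_cost_usd(120_000, 1_000):.5f}")  # → $0.00868
```

The same function makes side-by-side comparisons mechanical: call it once per model with that model's rates and the same token counts.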