Baidu Releases ERNIE 5.1 with One-Third the Parameters and 6% of Comparable Training Cost
Tags AI · Infrastructure
Baidu released ERNIE 5.1, compressing total parameters to approximately one-third and active parameters to half of ERNIE 5.0 while achieving leading performance at its model scale using only 6% of the pre-training cost of comparable models. The model ranked 4th globally and 1st among Chinese models on the Arena Search leaderboard with a score of 1,223, and 14th on the LMArena text leaderboard, the highest position any Chinese lab has held. Built on a Once-for-All elastic training framework with disaggregated asynchronous reinforcement learning infrastructure, ERNIE 5.1 is priced at $0.59/1M input tokens and $2.65/1M output tokens via Baidu Qianfan.
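To put the published Qianfan rates in concrete terms, here is a minimal sketch of a per-request cost estimate. The rates are from the announcement above; the function name and the example token counts are illustrative assumptions, not part of any Baidu API.

```python
# Sketch: estimating ERNIE 5.1 API cost from the published Qianfan rates.
# Rates from the announcement: $0.59 per 1M input tokens, $2.65 per 1M output tokens.
INPUT_RATE_PER_M = 0.59   # USD per 1M input tokens
OUTPUT_RATE_PER_M = 2.65  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the published per-token rates.
    (Hypothetical helper for illustration, not a Qianfan SDK function.)"""
    return (input_tokens / 1_000_000) * INPUT_RATE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_RATE_PER_M

# Example: a request consuming 500k input tokens and producing 100k output tokens
print(f"${request_cost(500_000, 100_000):.2f}")  # → $0.56
```

At these rates, output tokens dominate the bill only for generation-heavy workloads; long-context retrieval requests are priced mostly by input volume.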
Technical significance
ERNIE 5.1 demonstrates that Chinese labs are competing on parameter efficiency, not just scale. Combined with DeepSeek's recent releases, it signals that the cost frontier for training capable models is being pushed down rapidly, which has implications for who can compete in foundation model development and the sustainability of the compute arms race.