Baidu Releases ERNIE 5.1 with Pre-Training Cost at Just 6% of Comparable Models
Tags: AI · Enterprise

Baidu released ERNIE 5.1, a new flagship AI model that compresses total parameters to one-third and active parameters to one-half of ERNIE 5.0 while achieving frontier-level performance at only 6% of the pre-training compute cost of comparable models. ERNIE 5.1 scores 1,223 on the Arena Search leaderboard, ranking 4th globally and 1st among Chinese models. The model surpasses DeepSeek-V4-Pro in agent capabilities and delivers creative writing and reasoning on par with top-tier international closed-source models. Rather than training from scratch, Baidu used a 'Once-for-All' elastic training framework that extracts an optimal sub-network from ERNIE 5.0's sub-model matrix. The model is available to enterprises and developers on Baidu's Qianfan model platform and its official website.
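Baidu has not published the mechanics of the 'Once-for-All' extraction, but techniques in this family generally carve a smaller, standalone network out of a larger trained one by keeping only its most important units and inheriting their weights. The PyTorch sketch below is purely illustrative of that idea under assumed details: the SuperFFN module, the extract_sub_ffn helper, the layer sizes, and the L2-norm importance heuristic are all hypothetical and are not ERNIE's actual procedure.

```python
# Illustrative sketch only: a minimal "once-for-all"-style width slice of a
# dense feed-forward block. All names and sizes here are hypothetical; Baidu
# has not disclosed ERNIE 5.1's real extraction method.
import torch
import torch.nn as nn


class SuperFFN(nn.Module):
    """A feed-forward block trained at full width (the parent 'supernet')."""

    def __init__(self, d_model: int = 1024, d_ff: int = 8192):
        super().__init__()
        self.up = nn.Linear(d_model, d_ff)
        self.down = nn.Linear(d_ff, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(torch.relu(self.up(x)))


def extract_sub_ffn(supernet: SuperFFN, keep_ratio: float = 0.5) -> SuperFFN:
    """Copy the highest-importance hidden units into a smaller standalone FFN."""
    d_ff = supernet.up.out_features
    k = int(d_ff * keep_ratio)
    # Rank hidden units by the L2 norm of their outgoing weights (one common
    # importance proxy; the real selection criterion is unknown).
    importance = supernet.down.weight.norm(dim=0)
    keep = importance.topk(k).indices
    sub = SuperFFN(supernet.up.in_features, k)
    with torch.no_grad():
        sub.up.weight.copy_(supernet.up.weight[keep])
        sub.up.bias.copy_(supernet.up.bias[keep])
        sub.down.weight.copy_(supernet.down.weight[:, keep])
        sub.down.bias.copy_(supernet.down.bias)
    return sub


if __name__ == "__main__":
    parent = SuperFFN()
    child = extract_sub_ffn(parent, keep_ratio=0.5)  # roughly half the FFN params
    x = torch.randn(2, 1024)
    print(child(x).shape)  # torch.Size([2, 1024])
```

In practice, elastic-training frameworks of this kind apply such slicing jointly across depth, width, and attention heads, then briefly fine-tune the extracted sub-network to recover quality; the sketch shows only the single-layer width case.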
Technical significance
ERNIE 5.1's 6% pre-training cost claim, if independently verified, would represent a significant leap in training efficiency and could reshape the competitive dynamics of foundation model development. The 'Once-for-All' elastic training approach, which extracts optimal sub-networks from a larger trained model rather than training each new flagship from scratch, offers an alternative to the conventional scaling-law playbook and could sharply reduce the capital required to field competitive models. Baidu's AI Developer Conference on May 13-14 is expected to reveal further technical details.
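For context on how such a figure could arise, the snippet below runs purely illustrative arithmetic under the common C ≈ 6·N·D approximation for transformer pre-training compute. Every number in it is hypothetical (the baseline model size, the token counts, and the assumption that the extracted sub-network only needs a short consolidation run); Baidu has not published the breakdown behind the 6% claim.

```python
# Purely illustrative arithmetic under the common C ≈ 6·N·D approximation for
# transformer pre-training compute. All numbers below are hypothetical; Baidu
# has not published the breakdown behind the 6% figure.
def pretrain_flops(active_params: float, tokens: float) -> float:
    """Approximate training FLOPs as 6 * active parameters * training tokens."""
    return 6 * active_params * tokens


# Hypothetical reference run: a comparable model trained from scratch.
baseline = pretrain_flops(active_params=200e9, tokens=10e12)

# Hypothetical elastic-training run: half the active parameters (per the
# reported compression) and only a short consolidation run on ~12% of the
# tokens, since the sub-network inherits the parent's weights.
elastic = pretrain_flops(active_params=100e9, tokens=1.2e12)

print(f"relative cost: {elastic / baseline:.0%}")  # -> relative cost: 6%
```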