Baidu's Ernie 5.1 cuts 94 percent of pre-training costs while competing with top models

AimostAll news brief curated from The Decoder.

Source details

Original source
The Decoder
Published
2026-05-11
Primary topic
Foundation Models

Why it matters

This story touches model launches, benchmark jumps, API upgrades, context-window changes, and frontier LLM competition. Read the original source for the full report, then use the directory shortcuts below to compare the products and workflows it points toward.

What happened

Baidu's Ernie 5.1 uses just a third of its predecessor's parameters and reportedly cost only six percent of what comparable models require to pre-train, a 94 percent reduction. That's possible thanks to a "Once-For-All" approach that extracts smaller sub-models from a single training run. On the Search Arena leaderboard, Ernie 5.1 ranks 4th globally, behind two Claude Opus variants and GPT-5.5 Search. The article "Baidu's Ernie 5.1 cuts 94 percent of pre-training costs while competing with top models" appeared first on The Decoder.
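To make the "Once-For-All" idea concrete: the general technique trains one large shared-weight "supernet" once, then slices out smaller sub-models without any additional pre-training. The toy below is a minimal sketch of that weight-sharing pattern only; all names are illustrative, and it does not reflect Baidu's actual implementation, which is not described in the source.

```python
# Toy sketch of "Once-For-All"-style weight sharing: train one supernet,
# then derive smaller sub-models by slicing its weights (no retraining).
# Purely illustrative -- not Baidu's implementation.
import random

def make_supernet(n_in, n_out, seed=0):
    """One weight matrix, 'pre-trained' once; sub-models reuse its slices."""
    rng = random.Random(seed)
    return [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]

def extract_submodel(supernet, width):
    """Keep only the first `width` output units: a smaller model for free."""
    return [row[:] for row in supernet[:width]]

def forward(weights, x):
    """Single linear layer: y = W x."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

supernet = make_supernet(n_in=4, n_out=8)    # one expensive training run
small = extract_submodel(supernet, width=2)  # cheap sub-model, shared weights
y = forward(small, [1.0, 0.0, 0.0, 0.0])
print(len(supernet), len(small))             # 8 2
```

The key property is that `small` is a view onto the same trained weights, so the cost of producing each additional deployment size is near zero.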

What to do next

Compare the hosted model pages first, then check the related tools and buyer guides before changing workflow standards.


This AimostAll brief summarizes the linked source so readers can scan AI developments quickly and jump to the original reporting when needed.


Directory context

Tools, models, and guides to go deeper

Move from the headline to product evaluation with topic-matched tool pages, model references, and buyer guides.

Related coverage

More from this topic