Meet AntAngelMed: A 103B-Parameter Open-Source Medical Language Model Built on a 1/32 Activation-Ratio MoE Architecture

AimostAll news brief curated from MarkTechPost.

Source details

Original source: MarkTechPost
Published: 2026-05-12
Primary topic: Foundation Models

Why it matters

This story touches on model launches, benchmark jumps, API upgrades, context-window changes, and frontier LLM competition. Read the original source for the full report, then use the directory shortcuts below to compare the products and workflows it points toward.

What happened

MedAIBase has released AntAngelMed, a 103B-parameter open-source medical language model. Its 1/32 activation-ratio Mixture-of-Experts (MoE) architecture activates only 6.1B parameters at inference time, matching the performance of roughly 40B dense models while exceeding 200 tokens per second on H20 hardware. Built on Ling-flash-2.0 and trained through a three-stage pipeline of continual pre-training, supervised fine-tuning, and GRPO-based reinforcement learning, the model ranks first among open-source models on OpenAI's HealthBench and tops both the MedAIBench and MedBench leaderboards.
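For readers unfamiliar with the activation-ratio framing, the sketch below illustrates how an MoE layer routes each token to only a small fraction of its experts, which is why a 103B-parameter model can run with only a few billion parameters active per token. It is a minimal illustration, not AntAngelMed's actual router: the expert count, top-k value, and weight shapes are assumptions chosen so that the 2-of-64 routing mirrors a 1/32 activation ratio.

```python
# Minimal sketch of top-k Mixture-of-Experts routing (illustrative only).
# Expert counts and top_k are assumptions, not figures from the article;
# they simply show how a 1/32-style activation ratio keeps most expert
# parameters idle for any given token.
import numpy as np

def moe_forward(token, experts, router_weights, top_k=2):
    """Route one token through only the top-k scoring experts."""
    logits = router_weights @ token                  # one score per expert
    scores = np.exp(logits - logits.max())
    scores /= scores.sum()                           # softmax over experts
    active = np.argsort(scores)[-top_k:]             # indices of activated experts
    # Weighted sum over the few activated experts; the rest contribute nothing.
    output = sum(scores[i] * experts[i](token) for i in active)
    return output, active

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 64, 64, 2                # 2 of 64 experts -> 1/32 ratio (illustrative)
experts = [(lambda W: (lambda x: W @ x))(rng.standard_normal((d_model, d_model)) * 0.02)
           for _ in range(n_experts)]
router = rng.standard_normal((n_experts, d_model)) * 0.02

out, active = moe_forward(rng.standard_normal(d_model), experts, router, top_k)
print(f"activated experts {active.tolist()} of {n_experts} "
      f"-> activation ratio {top_k}/{n_experts}")
```

Note that the total parameter count and the active-per-token count differ because attention layers, embeddings, and any shared experts are always active; the 1/32 figure describes the routed expert fraction, not a simple division of 103B.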

What to do next

Compare the hosted model pages first, then check the related tools and buyer guides before changing workflow standards.


This AimostAll brief summarizes the linked source so readers can scan AI developments quickly and jump to the original reporting when needed.


Directory context

Tools, models, and guides to go deeper

Move from the headline to product evaluation with topic-matched tool pages, model references, and buyer guides.
