Latest News and Trends

ByteDance open-sources COMET to boost MoE efficiency, accelerating LLM training by 1.7x

ByteDance’s Doubao AI team has open-sourced COMET, a Mixture-of-Experts (MoE) optimization framework that improves large language model (LLM) training efficiency while reducing costs. Already deployed across ByteDance’s clusters of more than 10,000 GPUs, COMET has saved millions of GPU compute hours.
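For readers unfamiliar with the architecture COMET targets: in an MoE layer, a lightweight gating network routes each token to a small subset of expert feed-forward networks, so only a fraction of the model's parameters are active per token. The sketch below is not COMET's code, just a minimal, generic top-k MoE routing layer in PyTorch; the class name `TopKMoE` and all dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal top-k Mixture-of-Experts layer (illustrative only)."""

    def __init__(self, d_model: int, d_ff: int, n_experts: int, k: int = 2):
        super().__init__()
        self.k = k
        # Router: produces a score per expert for each token.
        self.gate = nn.Linear(d_model, n_experts)
        # Experts: independent feed-forward networks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Route each token to its top-k experts.
        scores = F.softmax(self.gate(x), dim=-1)        # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)      # (tokens, k)
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = (idx == e)                           # tokens that chose expert e
            token_ids, slot = mask.nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue  # expert e received no tokens this batch
            # Weight each expert's output by its routing score.
            out[token_ids] += weights[token_ids, slot, None] * expert(x[token_ids])
        return out

# Example: 8 experts, each token routed to its top 2.
moe = TopKMoE(d_model=64, d_ff=256, n_experts=8)
y = moe(torch.randn(16, 64))  # -> (16, 64)
```

The per-expert loop is where efficiency frameworks earn their keep: in distributed training, experts live on different GPUs, so routing turns into all-to-all communication that can stall compute, which is the kind of overhead an MoE optimization framework aims to reduce.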