A joint team from BIGAI and Unitree has unveiled OmniXtreme, a unified control policy that enables humanoid robots to perform diverse extreme motions, from backflips to breakdancing, without overfitting to any single task.
Physical Intelligence (Pi) has unveiled Multi-Scale Embodied Memory (MEM), a hybrid architecture that combines short-term video encoding with long-term textual summarization, helping robots master long-horizon tasks such as kitchen cleaning and recover from errors in context.
NVIDIA researchers have revealed EgoScale, a framework that leverages a massive 20,854-hour egocentric human dataset to train robots in complex, fine-grained manipulation with minimal robot-in-the-loop data.
NVIDIA has released SONIC, a generalist humanoid controller trained on 100 million frames of motion data, aiming to replace manual reward engineering with a scalable "System 1" foundation for whole-body movement.
A new full-stack framework from HKUST and Shanghai AI Lab enables humanoid robots to acquire complex athletic skills, such as basketball and reactive fighting, directly from human videos, with no manual reward engineering required.