The Puppet Strings of AI: MIT’s Ultrasound Wristband Predicts Hand Dexterity

Written by Humanoids Daily

The human hand is a mechanical marvel, coordinated by 34 muscles and over 100 tendons and ligaments. For the robotics industry, replicating this dexterity has remained a "final boss" challenge, often requiring bulky hardware or specialized environments. Today, a team of engineers from MIT and the University of Southern California revealed a potential shortcut: a wearable ultrasound wristband that looks through the skin to predict hand movements in real time.

Published in Nature Electronics, the research describes a device about the size of a smartwatch that uses an ultrasound sticker to continuously image the "strings" of the wrist—the muscles and tendons that control finger movement. When paired with an AI algorithm, the system translates these internal biological shifts into the 22 degrees of freedom (DoF) required to map the hand’s position in a digital or robotic space.

Real-time teleoperation: An MIT researcher demonstrates the ability of the ultrasound wristband to wirelessly control a robotic hand by tracking internal muscle and tendon movements.

Seeing the "Strings"

Traditional methods for capturing hand data usually fall into three categories: vision-based cameras, sensor-laden gloves, or electromyography (EMG). Each has significant drawbacks. Cameras suffer from occlusion, where a hand disappears if blocked from view. Sensor gloves, like the Manus Metagloves Pro Haptic, provide high precision but can be cumbersome or limit natural tactile sensation. Meanwhile, EMG—which measures electrical signals in muscles—is notoriously susceptible to environmental noise and often lacks the sensitivity to track subtle, continuous motions.

The MIT team, led by Professor Xuanhe Zhao, opted for ultrasound imaging because it provides a high-resolution view of the mechanical state of the arm. "The tendons and muscles in your wrist are like strings pulling on puppets," explained researcher Gengxi Lu. By taking a real-time picture of these "strings," the AI can infer exactly how the fingers are positioned.

The prototype hardware features a smartwatch-sized ultrasound sticker paired with compact onboard electronics roughly the size of a smartphone to image the internal "strings" of the wrist.

To train the system, volunteers wore the wristband while their hands were monitored by a multi-camera setup. The AI learned to correlate specific patterns in the black-and-white ultrasound images with the "ground truth" hand positions captured by the cameras. Once trained, the wristband proved capable of predicting gestures for users with diverse hand sizes, even decoding complex tasks like American Sign Language.
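The paper itself does not publish training code, but the pipeline described above is a supervised regression problem: pair each ultrasound frame with the camera-derived "ground truth" pose, then learn a map from image to a 22-value joint-angle vector. The following is a minimal sketch of that idea using synthetic stand-in data and a simple ridge regression; all dimensions, variable names, and the linear model are illustrative assumptions, not the team's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: small grayscale ultrasound patches and a
# 22-value joint-angle vector (the paper's 22 degrees of freedom).
N_FRAMES, IMG_H, IMG_W, N_DOF = 500, 16, 16, 22

# Synthetic stand-ins: real training pairs a wrist ultrasound frame
# with a "ground truth" pose captured by the multi-camera rig.
frames = rng.normal(size=(N_FRAMES, IMG_H, IMG_W))
true_map = rng.normal(size=(IMG_H * IMG_W, N_DOF))
poses = frames.reshape(N_FRAMES, -1) @ true_map \
        + 0.01 * rng.normal(size=(N_FRAMES, N_DOF))

# Ridge regression: learn a linear map from pixels to joint angles.
X = frames.reshape(N_FRAMES, -1)
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ poses)

# Predict the pose for a previously unseen frame.
new_frame = rng.normal(size=(IMG_H, IMG_W))
pred = new_frame.reshape(1, -1) @ W
print(pred.shape)  # (1, 22)
```

In practice a deep network would replace the linear map, but the structure is the same: frames in, a continuous 22-DoF pose out, with the multi-camera rig supplying the regression targets.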


Solving the Data Bottleneck

This development arrives as the industry shifts toward a "data-first" engineering philosophy. As we have seen with Sunday Robotics, the ability to collect high-fidelity human movement data is often more important than the robot hardware itself. Sunday Robotics notably iterated on their "Skill Capture Glove" 100 times before finalizing their robot's body, arguing that data is the primary bottleneck preventing robots from entering production.

MIT’s wristband could significantly lower the barrier to this data collection. In demonstrations, the researchers used the band to wirelessly control a robotic hand to play piano and shoot a desktop basketball hoop. Because the device is wearable and non-obstructive, it could allow human operators to provide training demonstrations more naturally than current exoskeleton-based systems.
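A teleoperation setup like the piano demonstration reduces, at its simplest, to streaming predicted poses into robot joint commands. The sketch below shows one common pattern for doing that safely: clip each prediction to the robot's joint limits, then low-pass filter it so per-frame model noise does not translate into jitter. The limits, smoothing factor, and function names here are all hypothetical; the actual control stack is not described in the article.

```python
import numpy as np

# Hypothetical joint limits (radians) for a 22-DoF robotic hand;
# a real hand's limits depend on its hardware.
N_DOF = 22
lower = np.full(N_DOF, -0.3)
upper = np.full(N_DOF, 1.6)

def pose_to_command(pose, prev=None, alpha=0.5):
    """Clip a predicted pose to joint limits, then exponentially
    smooth it against the previous command to suppress jitter."""
    target = np.clip(pose, lower, upper)
    if prev is None:
        return target
    return alpha * target + (1 - alpha) * prev

# Simulated stream of noisy pose predictions from the wristband model.
rng = np.random.default_rng(1)
cmd = None
for _ in range(10):
    pose = rng.normal(0.5, 0.2, size=N_DOF)
    cmd = pose_to_command(pose, prev=cmd)
print(cmd.shape)  # (22,)
```

Because smoothing is a convex combination of already-clipped targets, the output always stays within the joint limits, which matters when the downstream hardware is a physical hand rather than a simulator.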

For instance, Sharpa Robotics recently demonstrated autonomous apple peeling using a shared-autonomy "copilot" and an exoskeleton to manage the 63 DoF of their SharpaNorth humanoid. While successful, such setups are cognitively demanding for the operator. A wearable ultrasound tracker could theoretically offer a more intuitive "marionette" interaction, allowing operators to focus on the task rather than the interface.

From VR to Surgery

The research team envisions the wristband serving as a universal interface for both virtual and physical environments. In VR tests, users were able to "pinch" virtual objects to zoom and manipulate them smoothly. Professor Zhao suggests that the technology could have an immediate impact in replacing existing hand-tracking techniques in AR/VR headsets.

However, the most ambitious application lies in the "Embodied AI" market. By gathering massive datasets of hand motions from a wide range of users, the researchers hope to train humanoid robots for high-stakes dexterity tasks, such as robotic surgery.

Technical Hurdles and the Road Ahead

While the results are promising, the system is not yet a consumer-ready product. The current prototype involves a smartwatch-sized sensor paired with onboard electronics roughly the size of a cellphone. The team’s next priority is further miniaturization and training the AI on an even broader range of gestures to ensure "zero-shot" usability for any wearer.

Furthermore, while the system tracks 22 DoF of the hand, it lacks the haptic feedback integrated into high-end professional tools like the Dutch-based Manus system. For tasks requiring delicate force control—where an operator needs to feel the "resistance" of an object to avoid applying "infinite force"—the lack of tactile cues remains a hurdle for high-fidelity teleoperation.

Nonetheless, MIT’s "puppet string" approach represents a significant step toward a world where controlling a $100,000 humanoid or a virtual scalpel is as simple as putting on a watch.


Watch the MIT News video about the device below:

