
From Prosthetics to Pixels: PSYONIC and NVIDIA Bridge the Robotics "Data Gap" with Real-to-Real Transfer

Written by P.A.

The quest for human-level robotic dexterity has long been hampered by a lack of high-quality interaction data. At GTC this week, San Diego-based PSYONIC announced a collaboration with NVIDIA to address this bottleneck by integrating the Ability Hand directly into NVIDIA Isaac Lab. The partnership introduces a "real-to-real" transfer framework designed to capture authentic human manipulation behavior to accelerate how robots learn to handle the world.

The First Commercial Dexterous Hand in Isaac Lab

By becoming a native asset within Isaac Lab, the Ability Hand—a bionic system originally designed as a high-performance prosthetic—is now available for researchers to simulate, train, and validate AI policies. This integration creates a unified pipeline between assistive bionics and embodied AI research, allowing developers to work with a sensorized hand that is already deployed in real-world human applications.

The Ability Hand is currently marketed as the fastest dexterous hand available and was among the first to provide touch feedback to users. Its integrated tactile sensing and durability make it a unique data-generation platform, capable of bridging the gap between human capability and robotic learning.

Solving the "Actionless" Data Problem

The core of the collaboration is real-to-real transfer. In a demonstration showcased by the companies, a human user wearing the Ability Hand performs a precise pipetting task. The data captured from this human interaction is then used to train various robotic platforms—including industrial arms, mobile robot dogs, and humanoids—to perform the same task.

[Image: A four-panel composite showing a human, an industrial arm, a robot dog, and a humanoid robot all using the Ability Hand to perform the same pipetting task.]
"Real-to-real" transfer in action: PSYONIC demonstrates how high-fidelity interaction data from a human user can be used to train identical manipulation policies across multiple robotic embodiments.

This approach seeks to bypass the limitations of purely synthetic data or "actionless" video. While NVIDIA’s recent DreamDojo world model uses 44,000 hours of human video to teach "intuitive physics," real-to-real transfer captures high-fidelity, physically grounded interaction data from the exact same hardware used on both humans and robots.

[Image: A man wearing a carbon-fiber PSYONIC Ability Hand prosthetic kneels outdoors, gently touching the face of a smiling young girl.]
Beyond robotics: The Ability Hand was originally developed as a high-performance bionic limb, providing touch feedback and intuitive control for human users.

"Robots will learn the real world from humans," said Dr. Aadeel Akhtar, Founder and CEO of PSYONIC. "The lack of high-quality manipulation data is one of the biggest challenges in robotics, and this collaboration is about building that foundation."

Scaling Dexterity Across Embodiments

The integration supports a closed-loop workflow: simulation in Isaac Lab, deployment on physical hardware, human-guided data capture in the real world, and subsequent robotic training. This focus on human-derived data aligns with NVIDIA's broader research into EgoScale, which found a predictable scaling law between the volume of human demonstration data and robot task success.
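A scaling law of this kind is typically fit as a power law in log-log space. The sketch below illustrates the idea with entirely made-up numbers; the hours and success rates are illustrative assumptions, not figures from EgoScale or this announcement.

```python
import numpy as np

# Hypothetical data-volume vs. task-success points (illustrative only --
# not actual EgoScale results).
hours = np.array([10, 50, 100, 500, 1000], dtype=float)
success = np.array([0.12, 0.25, 0.33, 0.55, 0.68])

# Fit success ~ a * hours^b via linear regression in log-log space.
b, log_a = np.polyfit(np.log(hours), np.log(success), 1)
a = np.exp(log_a)

def predicted_success(h: float) -> float:
    """Extrapolate task success for h hours of human demonstration data."""
    return a * h ** b
```

The fitted exponent `b` is what makes the law "predictable": once estimated, it lets researchers budget how much additional human data a target success rate would require.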

[Image: A close-up of a PSYONIC Ability Hand mounted on a silver robotic arm, delicately holding a single red raspberry between its thumb and index finger.]
High-precision dexterity: The sensorized Ability Hand uses integrated tactile sensing to handle delicate objects like fruit without crushing them.

While models like DreamZero have shown that robots can generalize from diverse, non-repetitive data, the PSYONIC collaboration provides the high-resolution tactile feedback necessary for "master-level" dexterity. By using the Ability Hand as a universal end-effector, researchers can extract 21 keypoints of motion and retarget them across diverse robot morphologies, similar to the "cross-embodiment" capabilities seen in NVIDIA’s SONIC framework.
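The retargeting step described above can be sketched in a few lines. This is a minimal illustration assuming a standard 21-keypoint hand layout (wrist root plus four joints per finger, as in common hand-tracking conventions); the function name and scaling scheme are hypothetical, not PSYONIC or NVIDIA APIs.

```python
import numpy as np

NUM_KEYPOINTS = 21  # wrist + 4 joints per finger, a common hand layout

def retarget(human_kp: np.ndarray, scale: float) -> np.ndarray:
    """Re-express 21 hand keypoints in a wrist-centered frame and rescale
    them for a target embodiment whose hand is `scale` times human size.
    A real cross-embodiment pipeline would follow this with per-robot
    inverse kinematics; this sketch stops at the keypoint transform."""
    assert human_kp.shape == (NUM_KEYPOINTS, 3)
    wrist = human_kp[0]          # keypoint 0 is the wrist root
    local = human_kp - wrist     # wrist-centered coordinates
    return local * scale         # uniform rescale to the target hand
```

Expressing motion in a wrist-centered, scale-normalized frame is what lets the same captured trajectory drive hands mounted on an industrial arm, a robot dog, or a humanoid.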

[Image: A carbon-fiber robotic arm attached to a mobile robot dog uses an Ability Hand to grip and operate a red power drill against a wooden plank.]
Rugged and capable: The Ability Hand's durability allows it to be used on diverse platforms, such as mobile robot dogs, for demanding physical tasks like power tool operation.

A New Standard for Robot Hardware?

The choice of the Ability Hand as a native Isaac Lab asset reflects an industry shift toward standardized, production-ready hardware. Much like how the Unitree G1 has become a "standard canvas" for loco-manipulation research, the Ability Hand’s commercial availability provides a consistent baseline for tactile AI research.

As the industry moves toward what researchers call the "GPT-2 moment" for robotics, the combination of PSYONIC’s sensory hardware and NVIDIA’s simulation stack suggests that the path to physical grace lies in a more robust sensory nervous system.

