Next time you’re scrolling through your phone, take a moment to appreciate this feat: a seemingly mundane task is possible only because of the coordination of 34 muscles, 27 joints, and more than 100 tendons and ligaments in your hand. Our hands are the most agile part of our body, and mimicking their many subtle gestures has been a long-standing challenge in robotics and virtual reality.
Now, MIT engineers have designed an ultrasound wristband that accurately tracks the wearer’s hand movements in real time. As the hand moves, the wristband creates ultrasound images of the muscles, tendons and ligaments of the wrist, and is paired with an artificial intelligence algorithm that continuously translates the images into the corresponding positions of the five fingers and the palm.
Researchers can train the wristband to sense the wearer’s hand movements, which the device can transmit in real time to a robot or virtual environment.
In demonstrations, the team has shown that a person wearing the wristband can control a robotic hand wirelessly. As the person points or gestures, the robot does the same. In a kind of wireless marionette act, the wearer can direct the robot to play a simple tune on a piano and shoot a small basketball into a desktop hoop. With the same wristband, the wearer can also manipulate objects on a computer screen, for example pressing their fingers together to make a virtual object larger or smaller.
The team is using the wristband to collect hand-movement data from many more users with different hand sizes, finger shapes, and gestures. They envision building a large dataset of hand movements that could be used to train humanoid robots in dexterous tasks, such as performing certain surgical procedures. The ultrasound bands could also let wearers capture, manipulate, and interact with objects in video games, design applications, and other virtual settings.
“We think this work has immediate impact in potentially transforming hand tracking technologies with wearable ultrasound bands in virtual and augmented reality,” says Xuanhe Zhao, the Uncas and Helen Whittaker Professor of Mechanical Engineering at MIT. “It can also provide massive amounts of training data for intelligent humanoid robots.”
Zhao, Gengxi Lu, and their colleagues present the new design of the wristband in a paper published today in Nature Electronics. Zhao’s MIT co-authors are former postdocs Xiaoyu Chen, Shukong Li, and Bolei Deng; graduate students Seonghyeon Kim and Dianne Lee; postdocs Shu Wang and Runze Li; and Ananth Chandrakasan, MIT provost and Vannevar Bush Professor of Electrical Engineering and Computer Science. Other co-authors are graduate students Yushun Zheng and Junhang Zhang, Baoqiang Liu, Chen Gong, and University of Southern California professor Qifa Zhou.
Watching the strings
There are currently several approaches to capturing and mimicking human hand dexterity in robots. Some use cameras to record a person’s hand movements as they manipulate objects or perform tasks. Others involve a person wearing a glove with sensors that record the hand’s movements and transmit the data to a receiving robot. But building complex camera setups for different applications is impractical, and cameras are limited by line-of-sight constraints. And gloves equipped with sensors can restrict a person’s natural hand movements and sensations.
A third approach uses electrical signals from wrist or arm muscles that scientists associate with specific hand movements. Researchers have made significant progress with this approach, but the signals are easily corrupted by environmental noise and are not sensitive enough to recognize subtle movements. For example, they can tell whether the thumb and forefinger are pinched together or spread apart, but not much in between.
Zhao’s team wondered whether ultrasound imaging could capture more dexterous and continuous hand movements. His group has been developing a variety of ultrasound stickers: miniature versions of the transducers used in doctors’ offices, paired with hydrogel materials that can safely stick to the skin.
In their new study, the team incorporated an ultrasound sticker design into a wearable wristband to continuously image the muscles and tendons in the wrist.
“The tendons and muscles in your wrist are like the strings pulling the puppets that are your fingers,” says Lu. “So the idea is: Every time you take a picture of the position of the strings, you’ll know the position of the hand.”
Mapping manipulation
The team designed a wristband of ultrasound stickers about the size of a smartwatch, along with accompanying electronics about the size of a cellphone. They attached the wristband to a volunteer’s wrist and confirmed that the device produced clear and continuous images of the wrist as the volunteer moved their fingers in various gestures.
The challenge then was to relate the black-and-white ultrasound images of the wrist to the specific position of the hand. As it turns out, the fingers and thumb together have 22 degrees of freedom, or independent ways of extending and angling. The researchers found that they could identify specific regions in their ultrasound images of the wrist that corresponded to each of these 22 degrees of freedom. For example, changes in one area relate to extension of the thumb, while changes in another relate to movements of the index finger.
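The region-to-finger correspondence can be pictured as extracting one feature per degree of freedom from each ultrasound frame. The sketch below is purely illustrative: the frame size, the strip-shaped regions, and the use of mean intensity are all invented stand-ins, not details from the paper.

```python
import numpy as np

N_DOF = 22  # the hand's 22 degrees of freedom described in the study
rng = np.random.default_rng(0)

# A fake 64x64 grayscale ultrasound frame standing in for real data.
frame = rng.random((64, 64))

# Hypothetical assumption: each degree of freedom maps to one vertical
# strip of the image (a stand-in for the regions the team identified).
regions = np.array_split(np.arange(64), N_DOF)

def region_features(img):
    """Mean intensity per region: one scalar per degree of freedom."""
    return np.array([img[:, cols].mean() for cols in regions])

pose_features = region_features(frame)
print(pose_features.shape)  # (22,)
```

In the actual device, these per-region changes are what the downstream model reads to infer how each finger has moved.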
To establish these connections, a volunteer wearing a wristband would move their hand in various positions while researchers recorded the gestures with multiple cameras surrounding the volunteer. By matching changes in certain areas of the ultrasound images with the hand positions recorded by the cameras, the team could label image regions of the wrist with corresponding degrees of freedom in the hand. But doing this translation continuously and in real time would be an impossible task for humans.
So the team turned to artificial intelligence. They used an AI algorithm that can be trained to recognize image patterns and correlate them with specific labels, in this case the hand’s different degrees of freedom. The researchers trained the algorithm with ultrasound images that they carefully labeled, annotating the image regions associated with each degree of freedom. They then tested the algorithm on a new set of ultrasound images and found that it correctly predicted the corresponding hand gestures.
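The training step amounts to learning a function from a labeled frame to a 22-value hand pose. The paper's actual model architecture isn't described here, so the following uses a deliberately simplified linear (ridge regression) stand-in on synthetic data, just to show the image-in, pose-out shape of the problem.

```python
import numpy as np

rng = np.random.default_rng(1)
n_train, n_pix, n_dof = 1000, 16 * 16, 22  # toy sizes, not the paper's

# Synthetic "camera-labeled" data: flattened frames X and the hand
# poses Y that the surrounding cameras would have recorded for them.
true_W = rng.normal(size=(n_pix, n_dof))
X = rng.normal(size=(n_train, n_pix))
Y = X @ true_W + 0.01 * rng.normal(size=(n_train, n_dof))

# Closed-form ridge fit: W = (X^T X + lam*I)^{-1} X^T Y
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(n_pix), X.T @ Y)

# Predict poses for unseen frames, as in the team's held-out test.
X_test = rng.normal(size=(10, n_pix))
err = np.abs(X_test @ W - X_test @ true_W).max()
print(f"max pose error: {err:.4f}")
```

A deep network would replace the linear map in practice, but the workflow is the same: fit on camera-labeled frames, then check predictions on frames the model has never seen.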
Once the researchers successfully combined the AI algorithm with the wristband, they tested the device on more volunteers. For the new study, eight volunteers with different hand and wrist sizes wore the wristband while making various hand gestures and grips, including signing all 26 letters of American Sign Language. The volunteers also held items such as a tennis ball, a plastic bottle, a pair of scissors, and a pencil. In each case, the wristband accurately tracked and predicted hand position.
To demonstrate potential applications, the team developed a simple computer program that they paired wirelessly with the wristband. As the wearer goes through pinching and holding motions, the gestures translate into zooming in and out of an object on a computer screen and moving and manipulating it in a smooth, continuous manner.
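One way such a pinch-to-zoom control could be wired up is to map the predicted thumb-to-index distance onto a zoom factor. The gap range and zoom limits below are invented for illustration; the article doesn't specify how the demo program mapped gestures to the screen.

```python
def zoom_factor(pinch_gap_cm, min_gap=1.0, max_gap=8.0):
    """Linearly map a pinch distance (cm) to a 0.5x-2.0x zoom, clamped.

    The gap would come from the wristband's predicted fingertip
    positions; here it is just a number.
    """
    gap = max(min_gap, min(max_gap, pinch_gap_cm))
    t = (gap - min_gap) / (max_gap - min_gap)  # 0.0 (pinched) .. 1.0 (open)
    return 0.5 + 1.5 * t

print(zoom_factor(1.0))  # fully pinched
print(zoom_factor(8.0))  # fully open
```

Because the wristband streams poses continuously, a mapping like this produces the smooth, continuous zooming described above rather than discrete steps.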
The researchers also tested the wristband as a wireless controller for a simple commercial robotic hand. While wearing the wristband, a volunteer went through the motions of playing a keyboard, and the robot imitated the movements in real time to play a simple tune on a piano. The same robot also mimicked the taps of a person’s fingers to play a desktop basketball game.
Zhao plans to further miniaturize the wristband’s hardware, as well as train the AI software on multiple gestures and movements from volunteers with a wide range of hand shapes and sizes. Ultimately, the team is working toward a wearable hand tracker that anyone can wear, to wirelessly manipulate humanoid robots or virtual objects with high dexterity.
“We believe this is the most advanced way to track skilled hand movements through wearable imaging of the wrist,” says Zhao. “We think these wearable ultrasound bands can provide intuitive and versatile controls for virtual reality and robotic hands.”
This research was supported in part by MIT, the US National Institutes of Health, the US National Science Foundation, the US Department of Defense, and the Singapore National Research Foundation through the Singapore-MIT Alliance for Research and Technology.