This AI-Powered Sensor Recognizes Hand Gestures, Has Potential Use in Prosthetics


The flexible electrode array is printed with conductive silver ink on PET substrate. Photo credit: UC Berkeley

Researchers at the University of California, Berkeley have developed a wearable sensor that uses artificial intelligence (AI) software to recognize the hand gesture a person intends to make based on electrical signal patterns in the forearm. The device could one day be used to control prosthetics or to interact with almost any type of electronic device.

Read more: NUS Scientists Develop Electronic Skin with Exceptional Sense of Touch for Prosthetics

“Prosthetics are one important application of this technology, but besides that, it also offers a very intuitive way of communicating with computers,” said Ali Moin, who helped design the device as a doctoral student in UC Berkeley’s Department of Electrical Engineering and Computer Sciences. “Reading hand gestures is one way of improving human-computer interaction. And, while there are other ways of doing that, by, for instance, using cameras and computer vision, this is a good solution that also maintains an individual’s privacy.”

Moin is co-first author of a new paper describing the device, which appears online in the journal Nature Electronics.

The research team has demonstrated that the hand gesture recognition system can classify up to 21 different hand signals, including a thumbs-up, a fist, a flat hand, holding up individual fingers and counting numbers, reports Kara Manke in Berkeley News.

To create the hand gesture recognition system, the team collaborated with Ana Arias, a professor of electrical engineering at UC Berkeley, to design a flexible armband that can read the electrical signals at 64 different points on the forearm. The electrical signals are then fed into an electronic chip, which is programmed with an AI algorithm capable of associating these signal patterns with specific hand gestures.
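To make the pipeline concrete, here is a minimal sketch of how 64-channel forearm signals might be turned into feature vectors before pattern matching on the chip. The window length, sampling rate, and root-mean-square feature are illustrative assumptions, not details from the paper.

```python
# Sketch: summarize raw 64-channel signals into per-window feature vectors.
import numpy as np

N_CHANNELS = 64          # electrodes in the flexible armband
SAMPLE_RATE_HZ = 1000    # assumed sampling rate
WINDOW_MS = 200          # assumed analysis window

def window_features(raw):
    """raw: array of shape (n_samples, 64) -> per-window RMS feature vectors."""
    win = int(SAMPLE_RATE_HZ * WINDOW_MS / 1000)
    n_windows = raw.shape[0] // win
    feats = []
    for w in range(n_windows):
        chunk = raw[w * win:(w + 1) * win]                   # (win, 64)
        feats.append(np.sqrt(np.mean(chunk ** 2, axis=0)))   # RMS per electrode
    return np.array(feats)                                   # (n_windows, 64)

# Each 64-value row would then be matched against learned gesture patterns.
demo = window_features(np.random.default_rng(1).standard_normal((1000, N_CHANNELS)))
print(demo.shape)  # (5, 64)
```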


“When you want your hand muscles to contract, your brain sends electrical signals through neurons in your neck and shoulders to muscle fibers in your arms and hands,” Moin said. “Essentially, what the electrodes in the cuff are sensing is this electrical field. It’s not that precise, in the sense that we can’t pinpoint which exact fibers were triggered, but with the high density of electrodes, it can still learn to recognize certain patterns.”

Like other AI software, the algorithm has to first “learn” how electrical signals in the arm correspond with individual hand gestures. To do this, each user has to wear the cuff while making the hand gestures one by one.

However, the new device uses a type of advanced AI called a hyperdimensional computing algorithm, which is capable of updating itself with new information.

For instance, if the electrical signals associated with a specific hand gesture change because a user’s arm gets sweaty, or they raise their arm above their head, the algorithm can incorporate this new information into its model.
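The sketch below illustrates how a hyperdimensional computing classifier of this general kind can be trained per user and then updated on the fly. The dimensionality, encoding scheme, and gesture labels are assumptions for illustration, not the published implementation.

```python
# Sketch: a hyperdimensional computing (HDC) gesture classifier with online updates.
import numpy as np

D = 10_000          # hypervector dimensionality (assumed)
N_CHANNELS = 64     # electrodes in the armband
N_LEVELS = 16       # quantization levels per channel amplitude (assumed)

rng = np.random.default_rng(0)

# Item memory: one random bipolar hypervector per channel and per amplitude level.
channel_hvs = rng.choice([-1, 1], size=(N_CHANNELS, D))
level_hvs = rng.choice([-1, 1], size=(N_LEVELS, D))

def encode(sample):
    """Encode a 64-value feature vector (scaled to [0, 1]) into one hypervector.

    Each channel's level hypervector is bound (elementwise product) with the
    channel ID hypervector; the results are bundled (summed) and thresholded.
    """
    levels = np.clip((sample * N_LEVELS).astype(int), 0, N_LEVELS - 1)
    bound = channel_hvs * level_hvs[levels]      # bind channel with level
    bundled = bound.sum(axis=0)                  # bundle across channels
    return np.where(bundled >= 0, 1, -1)

class HDClassifier:
    def __init__(self):
        self.prototypes = {}                     # gesture label -> accumulator

    def train(self, sample, label):
        """Add one labeled example to the class prototype. The same step can
        fold in new examples when signal conditions change (e.g., sweat)."""
        hv = encode(sample)
        self.prototypes.setdefault(label, np.zeros(D))
        self.prototypes[label] += hv

    def predict(self, sample):
        """Return the gesture whose prototype is most similar (cosine)."""
        hv = encode(sample)
        def cosine(p):
            return hv @ p / (np.linalg.norm(p) * np.linalg.norm(hv) + 1e-9)
        return max(self.prototypes, key=lambda lbl: cosine(self.prototypes[lbl]))

# Usage: calibrate with a few examples per gesture, then keep updating as needed.
clf = HDClassifier()
for label in ["fist", "thumbs_up", "flat_hand"]:
    for _ in range(5):
        clf.train(rng.random(N_CHANNELS), label)  # stand-in for real EMG features
print(clf.predict(rng.random(N_CHANNELS)))
```

Because class prototypes are just accumulated hypervectors, incorporating new examples is a cheap addition rather than full retraining, which is what makes this style of on-the-fly updating practical on a small, low-power chip.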

Another advantage of the new device is that all of the computing occurs locally on the chip: No personal data are transmitted to a nearby computer or device. Not only does this speed up the computing time, but it also ensures that personal biological data remain private, UC Berkeley reported.

The device is not yet ready to be a commercial product, but it could likely get there with a few tweaks, according to Jan Rabaey, the Donald O. Pedersen Distinguished Professor of Electrical Engineering at UC Berkeley and senior author of the paper.

Read more: Electronic Skin for Prosthetic Hands Lets Amputees Feel Pain

“Most of these technologies already exist elsewhere, but what’s unique about this device is that it integrates the biosensing, signal processing and interpretation, and artificial intelligence into one system that is relatively small and flexible and has a low power budget,” Rabaey said.

Sam Draper
January 7, 2021
