Read my hand
Imagine typing on a computer without a keyboard, playing a video game without a controller, or driving a car without a steering wheel. This may soon be possible thanks to a device, developed by a team led by electrical engineering professor Jan Rabaey, that can recognize hand gestures based on electrical signals detected in the forearm. The system, which couples wearable biosensors with artificial intelligence (AI), could one day be used to control prosthetics or to interact with almost any type of electronic device.
The researchers — including lead authors Ali Moin and Andy Zhou — collaborated with Ana Claudia Arias, professor of electrical engineering and computer sciences, to design a flexible armband that can read electrical signals at 64 different points on the forearm.
- The electrical signals from the armband are fed into an electronic chip, which is programmed with an AI algorithm capable of associating these signal patterns with specific hand gestures.
- To teach the algorithm how electrical signals in the arm correspond to individual hand gestures, each user wears the cuff while making the hand gestures one by one. The algorithm can recognize 21 individual hand gestures, including a thumbs-up, a fist, a flat hand, holding up individual fingers and counting numbers.
- The device uses a hyperdimensional computing algorithm that can update itself with new information. For instance, if the electrical signals associated with a specific hand gesture change because an arm gets sweaty or a hand is raised above a user’s head, the algorithm can incorporate this new information into its model.
- All of the computing occurs locally on the chip. Not only does this speed up the computing time, but it also ensures that personal biological data remain private.
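The training, classification, and online-update steps described above can be sketched with a minimal hyperdimensional (HD) computing classifier. This is an illustration only, not the team's actual implementation: the dimensionality, the channel/level encoding scheme, the normalization assumption (features scaled to [0, 1]), and all names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 10_000       # hypervector dimensionality (a common choice in HD computing)
N_CHANNELS = 64  # one feature per forearm electrode, as in the armband
N_GESTURES = 21  # number of gestures the article says the device recognizes
N_LEVELS = 16    # quantization levels for each channel's amplitude (assumed)

# Fixed random bipolar "item" hypervectors: one per channel, one per level.
channel_hv = rng.choice([-1, 1], size=(N_CHANNELS, D))
level_hv = rng.choice([-1, 1], size=(N_LEVELS, D))

def encode(sample):
    """Encode a 64-channel feature vector (values assumed in [0, 1]) into one
    bipolar hypervector: bind each channel's ID vector with its quantized
    amplitude-level vector, then bundle (sum) across channels."""
    levels = np.clip((sample * N_LEVELS).astype(int), 0, N_LEVELS - 1)
    bound = channel_hv * level_hv[levels]   # elementwise bind, per channel
    return np.sign(bound.sum(axis=0))       # bundle, then re-bipolarize

# One prototype hypervector per gesture, built up during training.
prototypes = np.zeros((N_GESTURES, D))

def train(sample, gesture_id):
    """Training pass: fold an encoded example into its gesture's prototype."""
    prototypes[gesture_id] += encode(sample)

def classify(sample):
    """Return the gesture whose prototype is most similar (cosine)."""
    hv = encode(sample)
    sims = prototypes @ hv / (np.linalg.norm(prototypes, axis=1) + 1e-9)
    return int(np.argmax(sims))

def update(sample, gesture_id):
    """Online update: incorporate a drifted example (e.g. from a sweaty arm
    or a raised hand) into the existing model without retraining."""
    prototypes[gesture_id] += encode(sample)
```

Because training, classification, and updating are all additions and dot products over integer-like vectors, this style of model is cheap enough to run entirely on a local chip, which is one reason HD computing suits on-device learning.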
Learn more: High-five or thumbs-up? New device detects which hand gesture you want to make (Berkeley News)