Researchers at Cornell University have created a fiber-optic sensor that can detect sensations like pressure, strain, and bending, much like human skin. The sensor, which combines low-cost LEDs and dyes, could give soft robotic systems – and anyone using augmented reality technology – the ability to feel the same rich, tactile sensations that mammals depend on to navigate the natural world.
The researchers published their findings in the journal Science. The team was led by Rob Shepherd, associate professor of mechanical and aerospace engineering in the College of Engineering. The paper’s co-lead authors are doctoral student Hedan Bai ’16 and Shuo Li, Ph.D. ’20.
The project builds upon an earlier stretchable sensor, developed in 2016 in Shepherd’s Organic Robotics Lab. To create that sensor, researchers sent light through an optical waveguide, and a photodiode detected changes in the beam’s intensity to determine when the material was deformed, reports Cornell University’s David Nutt.
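To make that principle concrete, here is a minimal sketch in Python of how intensity-based sensing of this kind can work; the function, the calibration constant, and the linear intensity-to-strain model are illustrative assumptions, not the lab’s actual code.

```python
# Illustrative sketch (not the Organic Robotics Lab's code): estimating
# deformation from a drop in transmitted light intensity, assuming a
# simple linear calibration between fractional intensity loss and strain.

def estimate_strain(intensity: float, baseline: float, k: float = 1.0) -> float:
    """Map a photodiode reading to an approximate strain value.

    intensity: current photodiode reading (arbitrary units)
    baseline:  reading for the undeformed waveguide
    k:         hypothetical calibration constant from bench tests
    """
    loss = max(0.0, (baseline - intensity) / baseline)  # fractional intensity drop
    return k * loss  # strain estimate under the assumed linear model
```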
For the new project, Bai drew inspiration from silica-based distributed fiber-optic sensors, which detect minor wavelength shifts as a way to identify multiple properties, such as changes in humidity, temperature and strain. However, silica fibers aren’t compatible with soft and stretchable electronics. Intelligent soft systems also present their own structural challenges.
“We know that soft matters can be deformed in a very complicated, combinational way, and there are a lot of deformations happening at the same time,” Bai said. “We wanted a sensor that could decouple these.”
Bai’s solution was to make a stretchable lightguide for multimodal sensing (SLIMS). This long tube contains a pair of polyurethane elastomeric cores. One core is transparent; the other is filled with absorbing dyes at multiple locations and connects to an LED. Each core is coupled with a red-green-blue sensor chip to register geometric changes in the optical path of light.
The dual-core design increases the number of outputs by which the sensor can detect a range of deformations – pressure, bending or elongation – by lighting up the dyes, which act as spatial encoders. Bai paired that technology with a mathematical model that can decouple, or separate, the different deformations and pinpoint their exact locations and magnitudes.
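The paper’s actual model is not reproduced in the article, but the idea of decoupling can be sketched as a linear inverse problem: if calibration establishes how each deformation mode shifts the readings from the two cores’ RGB channels, least squares can recover the individual magnitudes from a combined reading. Everything below, including the matrix values, is invented for illustration.

```python
import numpy as np

# Hypothetical "decoupling" sketch: each deformation mode (pressure, bend,
# stretch) is assumed to shift the six RGB channel readings of the two
# cores in a characteristic way, captured by a calibration matrix A.
A = np.array([           # columns: pressure, bend, stretch
    [0.9, 0.1, 0.2],     # core 1, red channel response
    [0.2, 0.8, 0.1],     # core 1, green
    [0.1, 0.2, 0.7],     # core 1, blue
    [0.3, 0.6, 0.2],     # core 2, red
    [0.5, 0.1, 0.6],     # core 2, green
    [0.2, 0.4, 0.9],     # core 2, blue
])

r = A @ np.array([0.5, 0.0, 1.2])          # simulated combined sensor reading
x, *_ = np.linalg.lstsq(A, r, rcond=None)  # recover the three mode magnitudes
print(x)  # ≈ [0.5, 0.0, 1.2]: the modes are separated again
```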
The new SLIMS sensors are simpler to make and can be easily integrated into other systems; for example, one could be built into a robot’s hand to detect slippage.
Using this wearable technology, the researchers designed a 3D-printed glove with sensors running along each finger. The glove is powered by a lithium battery and equipped with Bluetooth so it can transmit data to basic software, designed by Bai, that reconstructs the glove’s movements and deformations in real time, the Cornell report said.
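The article does not describe the glove’s data format, but as a hypothetical illustration, software on the receiving end might decode each Bluetooth frame into per-finger readings along these lines; the five-float packet layout and finger names are assumptions.

```python
import struct

# Hypothetical packet layout for the glove's Bluetooth stream (the real
# format is not described in the article): each frame carries five
# little-endian float32 values, one deformation reading per finger.
FRAME = struct.Struct("<5f")

def decode_frame(payload: bytes) -> dict[str, float]:
    """Unpack one 20-byte frame into a finger-name -> reading mapping."""
    fingers = ("thumb", "index", "middle", "ring", "pinky")
    return dict(zip(fingers, FRAME.unpack(payload)))

# Example: decode one simulated frame.
sample = FRAME.pack(0.1, 0.4, 0.0, 0.2, 0.9)
print(decode_frame(sample))
```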
“Right now, sensing is done mostly by vision,” Shepherd said. “We hardly ever measure touch in real life. This skin is a way to allow ourselves and machines to measure tactile interactions in a way that we now currently use the cameras in our phones. It’s using vision to measure touch. This is the most convenient and practical way to do it in a scalable way.”
The researchers are now looking into the ways SLIMS sensors can boost virtual and augmented reality experiences.