This Wearable Lets Blind People See

NUS researchers have developed AiSee, a wearable device that tells blind users what they're holding.

Image credits: NUS

For people with visual impairments, even a seemingly simple task like grocery shopping can be a challenge, because it is hard to tell similar items apart.

To help with this and other everyday tasks, a team of researchers from the School of Computing at the National University of Singapore (NUS Computing) has unveiled AiSee, an affordable wearable assistive device that uses artificial intelligence (AI) based on GPT-4 to help people with visual impairments "see" the objects around them. Unlike most wearable assistive devices, which must be paired with a smartphone, AiSee operates as a stand-alone system that needs no other device to work.


The NUS team has been developing AiSee for the past five years. It looks like a standard pair of bone-conduction earphones joined by a band that wraps around the back of the wearer's neck. A key design goal is to keep users from feeling self-conscious, as they might if they were wearing more conspicuous gear such as "smart glasses."

One earphone houses a forward-facing 13-megapixel camera that captures images of the user's surroundings, while the other carries an external touchpad interface. The rear of the device holds a CPU and a lithium battery, and the unit connects to the internet wirelessly, reports Ben Coxworth in New Atlas.

When the user picks up an item, say during a grocery run, they use the integrated camera to snap a picture of it. Cloud-based AI algorithms process the image in real time, examining features such as the item's size, color, and shape, as well as any text on its labels.
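The article does not describe AiSee's software stack, but the flow it outlines (capture an image, send it to a cloud-hosted GPT-4-class model, get back a description) can be sketched roughly as follows. The model name, prompt, and OpenAI client calls here are illustrative assumptions, not details confirmed by NUS.

```python
# Illustrative sketch only: AiSee's real pipeline is not public.
# Assumes the OpenAI Python SDK and a vision-capable GPT-4-class model.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def describe_item(image_path: str) -> str:
    """Send a snapshot of a held item to a cloud vision model and return
    a short spoken-style description (name, size, colour, label text)."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",  # assumption; the article only says "GPT-4"
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Briefly identify this object for a blind shopper: "
                         "name it, note its size, colour and shape, and read "
                         "any label text."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content

print(describe_item("snapshot.jpg"))
```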

If the image matches a known object, a synthetic voice in the earphones tells the user what it is. If they need more details, they can ask follow-up questions out loud, and the GPT-4-based AI will try to answer.
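Under the same assumptions, the spoken follow-up loop could be handled by keeping the conversation history and reading each reply aloud through a text-to-speech engine. The pyttsx3 library here merely stands in for whatever speech synthesis AiSee actually uses; none of this is from the NUS description.

```python
# Illustrative follow-up loop, continuing the sketch above; AiSee's actual
# voice interface is not documented.
import pyttsx3
from openai import OpenAI

client = OpenAI()
tts = pyttsx3.init()

def speak(text: str) -> None:
    """Read a reply aloud, e.g. through bone-conduction earphones."""
    tts.say(text)
    tts.runAndWait()

def ask_follow_up(history: list, question: str) -> str:
    """Append a spoken question to the running conversation, fetch the
    model's answer, speak it, and keep it in the history."""
    history.append({"role": "user", "content": question})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    speak(answer)
    return answer
```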

Crucially, AiSee does not require a connection to a smartphone or any other device, which makes things even easier. Users can still hear their surroundings since the bone-conduction earphones do not completely cover their ears.

The researchers are now working on improving the object-recognition algorithms, increasing processing speed, and making the technology more ergonomic and affordable.

Sam Draper
February 16, 2024

