Google ended its I/O presentation with a big surprise: AR glasses that can translate languages right before your eyes.
The glasses use augmented reality and artificial intelligence (and possibly embedded cameras and microphones) to see someone speaking to you, hear what they're saying, translate it, and display the translation live on translucent screens built into the eyeglass frames, reports TechRadar.
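The flow described above amounts to a simple streaming pipeline: capture speech, transcribe it, translate it, and render the result as a caption. The sketch below illustrates that shape in Python; every function name is a hypothetical stand-in (Google has not published any API for these glasses), and the "models" are trivial stubs.

```python
# Illustrative sketch of a live-subtitle pipeline, as implied by the demo.
# transcribe() and translate() are hypothetical stubs, not real Google APIs.

def transcribe(audio_chunk: str) -> str:
    # Stand-in for on-device speech recognition; here the "audio" is
    # already text, so we just pass it through.
    return audio_chunk

def translate(text: str, target: str = "en") -> str:
    # Stand-in for a translation model: a tiny phrasebook for demo purposes.
    phrasebook = {("hola", "en"): "hello", ("gracias", "en"): "thank you"}
    return phrasebook.get((text.lower(), target), text)

def caption_stream(audio_chunks, target="en"):
    # Emit one translated caption per incoming chunk, like live subtitles.
    for chunk in audio_chunks:
        yield translate(transcribe(chunk), target)

if __name__ == "__main__":
    for caption in caption_stream(["Hola", "gracias"]):
        print(caption)  # hello / thank you
```

The generator structure matters: captions appear as each chunk of speech arrives rather than after the speaker finishes, which is what makes the "subtitles for the world" experience feel live.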
"Language is so fundamental to communicating with each other," explained Google CEO Sundar Pichai, but he noted that trying to follow someone who is speaking another language "can be a real challenge."
Google didn’t share any details about when the glasses might be available, and only demonstrated them in a recorded video that didn’t actually show the display or how you would interact with it. But what the video did show painted a very cool picture of a potential AR future.
In one demo, a Google product manager tells someone wearing the glasses, “You should be seeing what I’m saying, just transcribed for you in real time — kind of like subtitles for the world.” Later, the video shows what you might see if you’re wearing the glasses: with the speaker in front of you, the translated language appears in real time in your line of sight.
One of the most interesting parts of Google's new glasses initiative is its focus on practical utility. The ability to understand and be understood is genuinely useful. These glasses aren't about floating dinosaurs or magic experiences; they're meant to assist. Meta's recent smart-glasses ambitions also aim at utility, but Google's experience and tools seem well suited to the challenge.
It’s unclear if Google’s glasses will ever hit the market, but the prototype provides a sense of where Google thinks augmented reality can be helpful.