Mojo Vision has tested contact lenses with built-in artificial intelligence and Alexa Shopping. As the wearer walks through the supermarket, the lenses project a previously made shopping list, identifying products in real time and removing them as they are picked up.
With voice commands, users could add or remove items from the list shown on the HUD (head-up display) interface and identify products in real time at the shelves, all while pushing the cart and choosing each item.
Users can browse the list and check off items as they put them in the cart. If a family member adds an item to the list remotely, it appears in the Mojo Vision lens interface.
Mojo Vision remains tight-lipped about the device’s full functionality. The company notes in a statement: “The test integration shows how Mojo Vision could integrate Alexa’s voice AI with Mojo Lens’ unique and powerful eye-based interface.”
In the meantime, the test feature aims to explore which other voice-driven interfaces might improve this eye-mounted tool. Entering new items with the eyes sounds like a nightmare, but the interface certainly makes sense for simpler tasks such as scrolling.
The Amazon team helped Mojo implement Alexa Shopping for the test feature. “At Amazon, we believe experiences can be enhanced with technology that’s always there when you need it, but you never have to think about it,” Alexa Shopping List General Manager Ramya Reguramalingam said in a statement.
“We’re excited that with this addition of Alexa Shopping List, the art of what’s possible for smart, discreet, hands-free shopping experiences will be showcased.”
A demonstration of the technology was shown publicly this Thursday during a Wall Street Journal event in Southern California.