Alexa’s Show and Tell Feature IDs Objects for Persons with Visual Impairment

September 23, 2019

For people with vision impairments, figuring out what's in a can or jar of food without opening it can be difficult or impossible. Amazon thinks it has a solution to that and other daily challenges that its blind and low-vision users face. Today, the company unveiled a new Show and Tell feature that allows users to hold an item in front of an Echo Show and ask "Alexa, what am I holding?" Using computer vision and machine learning for object recognition, the Alexa-powered device will respond with its best guess.

Amazon says the idea for Show and Tell grew out of user feedback, and the company worked with the Vista Center for the Blind and Visually Impaired in Santa Cruz, California, on research and development. Amazon hopes the tool will help users with everyday tasks such as cooking, unpacking groceries, and identifying objects around the house. "It's a tremendous help and a huge time saver because the Echo Show just sits on my counter, and I don't have to go and find another tool or person to help me identify something. I can do it on my own by just asking Alexa," said Vista Center assistive technology manager Stacie Grijalva.

Amazon recently made another accessibility update, letting users ask Alexa to speak as quickly or slowly as they need. But it is certainly not the only tech company building accessibility features into voice-controlled products: Apple updated its Voice Control system, Comcast built an eye-controlled remote, and Google released how-to videos explaining its Assistant's accessibility features.

The Show and Tell feature is available in the US on first- and second-generation Echo Show devices. To activate the feature, users can simply ask, "Alexa, what am I holding?"

Source: Engadget