
Google just improved the accessibility of Android with Live Caption and Live Relay

Google is making leaps in artificial intelligence, leveraging Google Assistant to make Android more accessible. The company unveiled two new features on stage at Google I/O 2019 that change the way we interact with our devices.

Live Caption overlays captions of detected speech on any audio or video content system-wide, allowing anyone to follow along with the audio in text form. The feature is built natively into Android and accessed simply by tapping a dedicated button under the volume mixer, letting people who are hard of hearing enjoy any content regardless of where it originates. That could be a caption for a video on YouTube or, rather more impressively, speech detected in media you have uploaded to Google Photos.
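Google has not published the internals of Live Caption, which captions media playback audio entirely on-device, but the general building block is streaming speech recognition that emits partial results as words arrive. Purely as an illustrative sketch, the Kotlin snippet below shows that idea using Android's public SpeechRecognizer API; the activity name, the bare TextView standing in for a caption overlay, and the microphone-based input are assumptions of this example, not Google's implementation.

```kotlin
import android.app.Activity
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer
import android.widget.TextView

// Sketch only: assumes the RECORD_AUDIO permission has already been granted.
class CaptionDemoActivity : Activity() {

    private lateinit var recognizer: SpeechRecognizer
    private lateinit var captionView: TextView // stands in for a caption overlay

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        captionView = TextView(this)
        setContentView(captionView)

        recognizer = SpeechRecognizer.createSpeechRecognizer(this)
        recognizer.setRecognitionListener(object : RecognitionListener {
            // Partial results stream in continuously, which is what makes
            // live captions feel instant rather than arriving sentence by sentence.
            override fun onPartialResults(partialResults: Bundle?) {
                val texts = partialResults?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                captionView.text = texts?.firstOrNull().orEmpty()
            }
            override fun onResults(results: Bundle?) {
                val texts = results?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                captionView.text = texts?.firstOrNull().orEmpty()
            }
            // Remaining callbacks are not needed for this sketch.
            override fun onReadyForSpeech(params: Bundle?) {}
            override fun onBeginningOfSpeech() {}
            override fun onRmsChanged(rmsdB: Float) {}
            override fun onBufferReceived(buffer: ByteArray?) {}
            override fun onEndOfSpeech() {}
            override fun onError(error: Int) {}
            override fun onEvent(eventType: Int, params: Bundle?) {}
        })

        // Request partial results so the caption updates as words are recognized.
        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
            putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
        }
        recognizer.startListening(intent)
    }

    override fun onDestroy() {
        recognizer.destroy()
        super.onDestroy()
    }
}
```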

This technology is taken a step further with Live Relay, which lets a user who cannot, or chooses not to, speak take part in a voice call using Google Assistant. The demo showcased a user receiving a call, typing their responses on the keyboard, and having Google Assistant instantly relay them back to the caller as speech.
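Live Relay itself is handled by Google Assistant inside the call and is not exposed as a developer API. Purely to illustrate the outbound half of that flow, the sketch below turns a typed reply into spoken audio using Android's public TextToSpeech API; the TypedReplySpeaker class and its method names are hypothetical, and transcribing the caller's speech back to text is out of scope here.

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech
import java.util.Locale

// Minimal sketch: typed text is spoken aloud once the TTS engine is ready.
class TypedReplySpeaker(context: Context) : TextToSpeech.OnInitListener {

    private val tts = TextToSpeech(context, this)
    private var ready = false

    override fun onInit(status: Int) {
        if (status == TextToSpeech.SUCCESS) {
            tts.language = Locale.US
            ready = true
        }
    }

    // Called with whatever the user types during the call.
    fun speakReply(text: String) {
        if (ready) {
            tts.speak(text, TextToSpeech.QUEUE_ADD, null, "reply-${text.hashCode()}")
        }
    }

    fun shutdown() = tts.shutdown()
}
```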

Subtitles within videos aren't anything new, but having them integrated seamlessly at the system level and processed on-device means captions appear with virtually no delay, work without a network connection, and are accessible to everyone. Live Relay, which has Google Assistant read typed text back to a caller, is a game changer and shows the power of Google's AI and what the company is doing for accessibility.
