We all want the hottest handsets, sweetest specs, awesomest apps, greatest games and the next best thing in the tech world – Android included. Sometimes we get so lost in what is “cool” and “fun” that we lose sight of reality and practicality. A couple of days ago, Google made a post on their Official Blog about new accessibility features in Android 1.6. But the hoopla surrounding the 1.6 launch had come and gone, and the post fell on cutting room floors across the interwebz.
What a shame. That post by Google illustrated that not only can the technology behind Android keep us more connected, entertained and informed than ever before, but in some cases it can drastically improve people’s lives.
Android 1.6 introduced a new feature called Text-To-Speech (TTS for short, or alternatively Speech Synthesis) that allows your phone to read, translate and audibly speak text in various languages. Through its APIs, developers can integrate this functionality into their applications. My first thought: hey, that would be great for traveling to a foreign country! An American in Italy who is looking for directions somewhere? Ask someone in English and have an Android app repeat it aloud in Italian. And guess what? They could speak the directions back in Italian and have the phone relay them to you in English!
Sure, that would be cool but I would consider it a luxury. And that is what Android is doing in many cases – offering a huge number and wide range of luxuries. Unfortunately my narrow-minded thinking prevented the “aha” moment for a group of people who could find much better use for this technology – the blind and visually impaired.
For one, developers can take a few extra measures to ensure that blind and visually impaired users are still able to utilize their applications. But perhaps more importantly, these new features of Android allow applications to be built from the ground up for the explicit PURPOSE of being utilized by blind and visually impaired users. We’ve told you about the Eyes-Free Android Application Project before… but it deserves another plug.
Here is a quick example illustrating innovative ways of approaching applications for “eyes-free” use:
That was published 6 months ago and there haven’t been any NEW videos from the project since. But given the recent additions in Android 1.6 I’m hoping we’ll see some new videos soon, not to mention Google told us to stay tuned. Here are some of the “enhancements” Android 1.6 has allowed that we might see in upcoming Eyes-Free application concepts:
- Text-to-Speech (TTS) is now bundled with the Android platform. The platform comes with voices for English (U.S. and U.K.), French, Italian, Spanish and German.
- A standardized Text To Speech API is part of the Android SDK, and this enables developers to create high-quality talking applications.
- Starting with Android 1.6, the Android platform includes a set of easy-to-use accessibility APIs that make it possible to create accessibility aids such as screen readers for the blind.
- Application authors can easily ensure that their applications remain usable by blind and visually impaired users by making sure that all parts of the user interface are reachable via the trackball and that all image controls have associated textual metadata.
- Starting with Android 1.6, the Android platform comes with applications that provide spoken, auditory (non-speech sounds) and haptic (vibration) feedback. Named TalkBack, SoundBack and KickBack, these applications are available via the Settings > Accessibility menu.
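To make the second bullet concrete, here is a minimal sketch of what the new bundled TTS API looks like from a developer's side. This assumes the Android 1.6 SDK's `android.speech.tts.TextToSpeech` class; the Activity name and the Italian phrase are my own invention, and this is an illustration rather than a complete application:

```java
// Sketch only: an Activity that initializes the Android 1.6 TTS engine
// and speaks a phrase in Italian, one of the bundled voices.
import android.app.Activity;
import android.os.Bundle;
import android.speech.tts.TextToSpeech;
import java.util.Locale;

public class SpeakActivity extends Activity implements TextToSpeech.OnInitListener {
    private TextToSpeech tts;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // The engine initializes asynchronously; onInit() fires when it is ready.
        tts = new TextToSpeech(this, this);
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            // Italian is one of the voices Google says ships with 1.6.
            tts.setLanguage(Locale.ITALIAN);
            tts.speak("Dove si trova la stazione?", TextToSpeech.QUEUE_FLUSH, null);
        }
    }

    @Override
    protected void onDestroy() {
        // Release the engine when the Activity goes away.
        tts.shutdown();
        super.onDestroy();
    }
}
```

The trackball/metadata advice from the bullet above amounts to little more than setting `android:contentDescription` on image controls and keeping every widget focusable – small effort for developers, big payoff for screen reader users.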
Heading up the Eyes-Free project is a gentleman named T.V. Raman, and in a New York Times article he shares a few visions of how Android could act as eyes for the blind:
With no buttons to guide the fingers on its glassy surface, the touch-screen cellphone may seem a particularly daunting challenge. But Mr. Raman said that with the right tweaks, touch-screen phones — many of which already come equipped with GPS technology and a compass — could help blind people navigate the world.
“How much of a leap of faith does it take for you to realize that your phone could say, ‘Walk straight and within 200 feet you’ll get to the intersection of X and Y,’ ” Mr. Raman said. “This is entirely doable.”
Forget the hottest handsets, sweetest specs, awesomest apps and greatest games… the notion of truly helping somebody overcome real-life challenges – whether it be about the blind, visually impaired or any other real-life situation – might just be the next best thing in the tech world. When you put it into perspective, doesn’t it kind of make you feel foolish for arguing on message boards about which Flashlight app is best? Thought so.
The problems are widespread, and two examples provided in the January 2009 article have since been addressed by Google:
- One obstacle for web browsing is captchas – the distorted strings of letters a user must enter to prove they aren’t a spam bot. Few sites offer audio versions of their captchas, making it impossible for blind people to get past these spam filters. Last month, Google purchased the company that owns the Captcha technology for an undisclosed sum. Hmmmm, I wonder why?
- The rapid increase in bandwidth capabilities has pushed online video to new heights – but what about the deaf and hearing impaired? In March, Google added features to YouTube captions allowing them to be embedded on other sites and translated into 40+ languages, helping everyone receive better access to the information they want and need. Yesterday Google completed transcribing 150+ videos from their Webmaster Video Channel into captions, providing 11 hours of information to those that otherwise wouldn’t have had it. And in 40+ languages mind you!
The above two elements could DEFINITELY have implications on an Android phone as well, and I’m eager to see what all the gifted developers out there can come up with. Best of luck to the Eyes-Free Project team – we’ll have our eyes on you! If this really interests you, check out the 47-minute presentation from Google I/O 2009 below:
Not able to view the videos at work, but this seems like a huge step in the right direction and so much more important than “my phone can read the barcode on my grocery store club card”.
I also think these apps could help someone who is not able to speak. I know they could simply write down a message on a piece of paper, but what if that message is to a child who can’t read, or to a blind person? They could simply type the message and the Android phone would do the talking for them.
Cool! My cousin is blind and I’ve often thought that Android could end up being a good platform for accessibility.
I recently gave some thought to QR codes, and just imagine what Android could do for visually impaired people! Just get QR codes widely adopted in cities, and your phone would be able to tell you where you are, which store you’re in front of, and what this restaurant has on special today. I’m slowly becoming visually impaired myself – one eye is almost blind – and all this gives me hope I won’t be left out when the dark day comes.
Ingenious!
@Brian – would that not rely on being able to see the QR code?
Maybe an augmented view of what you’re looking at would be better. A lot of augmented reality apps are currently available. You could just add some text-to-speech onto the view to describe what it is you’re looking at. Would be cool to integrate this into maps and satnav stuff too. E.g. imagine walking along, not knowing where you are. You switch to the augmented map, point it out in front of you and get a description of what is in front (e.g. Burger King). You can then get directions to a particular point on a map (e.g. a statue or point of interest)… or something like that. Just a quick idea I had.
@Brian – one thing I did think of using QR codes for is bus timetables. Some stops here in the UK don’t have a timetable to hand. You have to guess the times, or simply wait. If the bus company put a QR code on the stop, you could scan it, find out which bus you want and what time it arrives. Building on this, you could access Google Maps and use the GPS to find the nearest stop to your current location, tell it where you want to get off, get times of the buses you need to catch, and be notified when to get off once actually on the bus. This could also be done without the QR codes, for visually impaired or sighted people alike, and could be linked into train and metro services.
Check out also a cool new app on Market, Webtalk – a voice-enabled internet browser… it helps greatly for the visually impaired: have it read aloud any article, say on CNN, Yahoo, CBS etc., so you are free to do other stuff while listening. Check out the YouTube video too:
http://www.youtube.com/watch?v=c6KTrkmiCg8
I’m totally blind and I’m trying to decide whether to get an Android phone or an iPhone. From everything I’ve read so far, I’m leaning toward an Android phone. It sounds like Android has a lot more apps for blind people than the iPhone, with more on the way! Yay!
Recently I read that Google is planning to unveil its own music service to compete with iTunes. Awesome!
I’d like for android app developers to develop an app that can give a description of a grocery store or some other building so that a blind person can get somewhat oriented to an unfamiliar place. That would require that there be some sort of database of information about the layouts of buildings. I think Google could find a way to do that. Google would have to get access to the floor plans of buildings or something like that.
I’ve read that Android phones are cheaper than iPhones. Is that true? If anyone can come up with any other reasons why I should get an Android phone, please post them here. I want a phone/PDA with the best and the most accessible apps for the blind. The apps will definitely help people like me be more independent and productive.