The author of this video is a research scientist at Google, but I wouldn’t be surprised if he had also been a professor at some point in his career. His inquisitive and inclusive approach made me feel right back in a college classroom, and the topic of conversation certainly seemed academic:
- What does it mean to even use a touchscreen/on-screen keyboard when you’re NOT looking at it?
- Why do you have to look at an on-screen keyboard?
- What if you didn’t have to?
- What if the buttons on the touchscreen positioned themselves dynamically around your fingers?
By questioning the status quo, T.V. Raman floats an interesting concept to the surface and illustrates one potential way it could be useful:
This is just the start of many “Eyes-Free” Android applications; they’ve already started a Google project page for Speech Enabled Eyes-Free Android Applications. I’m sure this will find many uses in helping the blind navigate touchscreen phones, but beyond that, no real problem/solution scenarios are jumping out at me. Nonetheless, it’s a cool demonstration. And questioning the norm is always good… this is the type of outside-the-box thinking that produces innovation.
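(Side note for the developers reading: the “buttons position themselves around your finger” idea is easier to picture with a little code. Here’s a minimal sketch, not their actual implementation, of an Android view where wherever you touch becomes the center “5” key and the other digits are read from the direction your finger moves before lifting. The class name RelativeKeypadView and the DEAD_ZONE_PX threshold are made up for illustration.)

```java
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

// Hypothetical sketch of an "eyes-free" keypad: the keypad re-centers itself
// on the touch-down point, so the user never has to look for the buttons.
public class RelativeKeypadView extends View {
    // If the finger stays within this radius of where it landed, treat it as "5".
    private static final float DEAD_ZONE_PX = 40f;

    private float downX, downY;

    public RelativeKeypadView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                // Wherever the finger lands becomes the center of the keypad.
                downX = event.getX();
                downY = event.getY();
                return true;
            case MotionEvent.ACTION_UP:
                char key = keyFor(event.getX() - downX, event.getY() - downY);
                // A real eyes-free app would speak `key` back via text-to-speech here.
                return true;
        }
        return super.onTouchEvent(event);
    }

    // Map the finger's offset from the touch-down point to a phone-pad digit:
    // up-left=1, up=2, up-right=3, left=4, center=5, right=6, down-left=7, etc.
    private char keyFor(float dx, float dy) {
        if (Math.abs(dx) < DEAD_ZONE_PX && Math.abs(dy) < DEAD_ZONE_PX) {
            return '5';
        }
        int col = dx < -DEAD_ZONE_PX ? 0 : (dx > DEAD_ZONE_PX ? 2 : 1);
        int row = dy < -DEAD_ZONE_PX ? 0 : (dy > DEAD_ZONE_PX ? 2 : 1);
        return "123456789".charAt(row * 3 + col);
    }
}
```

The nice part of this relative scheme is that the screen needs no fixed layout at all: the only thing the user has to find by feel is the screen itself.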
The two guys behind EyesFree announced their project more than two months ago:
They have five videos on the EyesFree YouTube account, but nothing has been uploaded in a couple of months… we would love an update, guys!
What about people driving? An eyes-free mobile interface would help a lot of commuters, and people who refuse to obey the law could at least be a little safer while they’re doing their evil deeds (damn you, vile driving texters! the state of California decries thee!). But ummmmm, yeah, I can’t actually think of any real uses for this either. I bet there are some, though; maybe it will become one of those things we never needed before and won’t be able to live without once we get it, lol.
Here is their session from Google I/O:
http://code.google.com/events/io/sessions/LookingBeyondScreenTextSpeechAndroid.html