Apr 22nd, 2013

Some underlying code found hiding inside Google's "MyGlass" application is revealing new clues about how users will interact with Glass (once it's finally made available to the public). Using eye-motion tracking, it appears Google Glass might be able to tell when a user is winking (as opposed to just blinking), signaling the device to snap a picture with Glass' integrated 5MP camera.

Discovered by a user on Reddit, the code strings "EYE_GESTURES_WINK_ENABLED," "EYE_GESTURES_WINK_CALIBRATION_SUCCESS," and "EYE_GESTURES_WINK_TAKE_PHOTO" suggest that eye gestures will be just one of the many ways users will be able to interact with their Glass (you know, besides the old-school way of touching or speaking to it).
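For the curious, strings like these usually act as feature flags: the app checks them before turning a capability on. Here's a minimal, purely hypothetical sketch of how a client might gate wink-to-photo behind such flags. The flag names are the ones found in MyGlass; everything else (the class, methods, and the calibration-before-capture logic) is our own illustrative assumption, not Google's actual code.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: gating a wink-to-photo feature behind
// string-keyed flags like those spotted inside MyGlass.
public class EyeGestureFlags {
    private final Map<String, Boolean> flags = new HashMap<>();

    public void set(String key, boolean value) {
        flags.put(key, value);
    }

    public boolean isEnabled(String key) {
        return flags.getOrDefault(key, false);
    }

    // Assumption: wink capture only activates once the feature is
    // enabled AND per-user wink calibration has succeeded.
    public boolean winkPhotoActive() {
        return isEnabled("EYE_GESTURES_WINK_ENABLED")
            && isEnabled("EYE_GESTURES_WINK_CALIBRATION_SUCCESS");
    }

    public static void main(String[] args) {
        EyeGestureFlags f = new EyeGestureFlags();
        f.set("EYE_GESTURES_WINK_ENABLED", true);
        System.out.println(f.winkPhotoActive()); // feature on, but not calibrated yet

        f.set("EYE_GESTURES_WINK_CALIBRATION_SUCCESS", true);
        System.out.println(f.winkPhotoActive()); // now active
    }
}
```

The calibration flag hints at why this matters: telling a deliberate wink from an ordinary blink likely requires tuning to each wearer's eye, so the capture path would stay off until that step succeeds.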

None of this really sounds too far-fetched to us. Back at CES 2013, we were able to test out some new eye-tracking tech from a company called Tobii, even going so far as to speculate about the possibility of seeing something similar in Google Glass. Let's not forget that Google was recently awarded a patent for interacting with a device using eye-tracking, and although that dealt specifically with unlocking a device by looking at it, it's easy to see that Google has been toying around with the idea.

It'll be interesting to see how these eye gestures perform in everyday use. Other than spy pics, can anyone think of a useful reason why you'd want to snap a sneaky picture like this? Aside from submitting to PeopleofWalmart.com, my mind is drawing a blank.
