There has always been a bit of excitement around here for hands-free gestures on devices. A lot of that is already possible, the most notable example being eyeSight. But this technology doesn’t need a camera – all it needs is the presence of your hand.
The display technology by Noalia uses electrostatic fields to track your finger’s movement. The demo showed examples of handwriting and a simple on-screen pointer indicating where your finger was being tracked. While I doubt this solution, like eyeSight, is ready for practical use yet, I hope someone can take this technology and put it to good use.
I struggled to think of use cases where I might need to keep my hands off my tablet, but seeing a plate of hot wings pass by reminded me that I often use my phone and tablet while eating. Messy fingers no longer a problem? Yes, please. See the video demo above.
When is this great piece of tech coming out? I can’t wait to get my hands on this…
Or, rather, *not* get your hands on it. :P
Yeah, well, not literally… ;-)
he is so very careful ^_^
Perhaps it could be used to capture the hand motions of sign language, and then software could interpret the meanings? This would be a great tool to bridge the gap between the Deaf and hearing communities. Interpreters are not always available, and writing or typing is a drag.
I am not seeing it… Sure, it is cool, but unless it is greatly improved it appears far slower than touch-based gestures, actually hindering productivity. Just move the finger two inches lower and touch the damn screen.
I am sure companies can do some interesting things with this tech in the future, but that demo certainly did not wow me.
One thing I would LOVE is if they implemented this technology to support “hovering”. What I mean is, in Windows, stuff will show “tooltips” if you hover over it with your mouse cursor; on my phone, however, you either touch it or you don’t. The phone has no idea where your finger is until you touch the screen – until now…?
They could use this technology to automatically display tips when you hover over something, or maybe just highlight what you’re about to tap on. There are a lot of uses – just think of how Windows already utilizes people “hovering” over an item/button/whatever. :-)
Yeah, anything where a tablet would be useful but you have messy hands or fingers: massage therapists, cooks, some artists, butchers, lots of people in the food industry, etc. Many professions would find tablets useful but for now are made impractical by messy hands or environments.