When Apple’s digital assistant, Siri, was launched, I was in the middle of an interesting conversation with a friend about gesture-based navigation. At the time, I had emailed him the following:
I believe the future of digital interaction is not voice. There are too many variables involved (sound clarity, speed, accent, and cultural differences in speech) for it to become a standardised communication platform.
The future has to be gesture-based controls. The current gestures (pinch and swipe) are primitive. Imagine the interactions that would be possible if we used the full range of human hand signals to communicate with a device, a la the Minority Report interface.
That was a little over six months ago. Today, I stumbled onto this article showcasing technology that could make that vision commercially viable. The researchers’ new technology, called Touché, can sense what is touching an object (a human hand or a fork), how it is being touched (pushing, pinching, or grasping), and which body part is doing the touching (hands, elbows, and even the number of fingers). That means a flat surface could recognise whether you are standing, sitting, or tebowing on it.
Here is a video showcasing the technology: