“The perception of system output must be greater than the perceived user input, and within that range there is a sweet spot where [user experience] is delightful and magical.”
I can still recall the first time I saw an Augmented Reality demo. There was a sense of wonderment from the illusion of 3D models living within the video feed. Of course, the real magic was the fact that the application was not only viewing its surrounding environment, but also understanding it. AR has proven to be an incredible tool for enhancing perception of the real world. Despite this, I’ve always felt that the technology was somewhat limited in its application. It is typically implemented as output in the form of visual overlays or filters. But could it also be used for user input? We decided to explore that question by pairing the principles of AR (like real-time marker detection and tracking) with a natural user interface (specifically, touch on a mobile phone) to create an entirely new interactive experience.
The outcome of this exploration is something we’re calling Touch Vision Interface – a tool that enables touch interaction with many different connected surfaces through a mobile phone’s camera view. Translating touch input coordinates onto the captured video feed creates the illusion of directly manipulating a distant surface. As a result, the interaction feels natural and almost invisible.
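The coordinate translation at the heart of this can be modeled as a planar homography: the four tracked corners of the remote surface, as seen in the camera view, define a mapping from camera pixels to the surface’s own coordinate space. Here is a minimal numpy sketch of that idea – the function names, the corner values, and the normalized 0..1 surface space are illustrative assumptions, not the tool’s actual internals.

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Direct Linear Transform: solve for the 3x3 homography H
    mapping four (or more) src points onto their dst points."""
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The null vector of A (last right-singular vector) holds H's entries.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def touch_to_surface(touch_xy, H):
    """Project a touch point (camera-view pixels) into the remote
    surface's coordinate space, dividing out the homogeneous scale."""
    p = H @ np.array([touch_xy[0], touch_xy[1], 1.0])
    return (p[0] / p[2], p[1] / p[2])

# Four tracked corners of the remote display in the camera view
# (pixels), paired with the same corners in the display's 0..1 space.
camera_corners = [(120, 80), (520, 95), (540, 390), (100, 370)]
surface_corners = [(0, 0), (1, 0), (1, 1), (0, 1)]
H = estimate_homography(camera_corners, surface_corners)

# A touch near the middle of the camera view lands somewhere in the
# middle of the remote surface.
u, v = touch_to_surface((320, 235), H)
```

In practice the corner positions would come from real-time marker tracking and be re-estimated every frame, so the mapping stays valid as the phone moves.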
Future applications of this technology are compelling. The barrier of cross-device communication is lowered, whether in the living room or in large open spaces. Brands could crowd-source more easily with billboard polls. Group participation on large installations could feel more natural. The possibilities become even more exciting when considering the most compelling aspect of the tool – the ability to interact with multiple surfaces without interruption. No need to switch devices through a secondary UI – simply touch your target. You could imagine a wall of digital billboards that users seamlessly paint across with a single gesture.
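“Simply touch your target” implies a routing step: each touch goes to whichever tracked surface currently contains it in the camera view, so a gesture can flow across displays without re-pairing. A small sketch of that idea, under assumed data shapes (a registry of surface ids mapped to their camera-view corner quads – none of this is the tool’s real API):

```python
def point_in_quad(pt, quad):
    """True if pt lies inside the convex quad (corners in order),
    by checking that edge cross products all share one sign."""
    signs = []
    for i in range(4):
        (x1, y1), (x2, y2) = quad[i], quad[(i + 1) % 4]
        cross = (x2 - x1) * (pt[1] - y1) - (y2 - y1) * (pt[0] - x1)
        signs.append(cross >= 0)
    return all(signs) or not any(signs)

def route_touch(touch_xy, surfaces):
    """Return the id of the first tracked surface whose camera-view
    quad contains the touch, or None if the touch misses them all."""
    for surface_id, quad in surfaces.items():
        if point_in_quad(touch_xy, quad):
            return surface_id
    return None

# Two billboards side by side in the camera view (pixel quads).
surfaces = {
    "billboard-left": [(0, 0), (300, 0), (300, 400), (0, 400)],
    "billboard-right": [(320, 0), (640, 0), (640, 400), (320, 400)],
}
```

A drag gesture would simply call `route_touch` per touch event; as the finger crosses from one quad into the next, the stream of events retargets itself, which is what makes painting across a wall of billboards feel like one continuous stroke.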
Other applications could enhance the collaborative creative process. Imagine a music creation experience where each screen becomes an instrument. Pitch and tone could be changed by moving your finger across the instrument’s surface, allowing for natural gesture controls. Rather than being paired with an instrument, a band member simply plays what they’re “looking” at.
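As a toy illustration of that pitch control, a finger’s horizontal position on an instrument surface (normalized to 0.0–1.0) could be quantized to MIDI notes in a scale. The pentatonic scale, the `root` default, and the function name are all hypothetical choices for the sketch:

```python
PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets within one octave

def position_to_midi(x, root=60, octaves=2):
    """Quantize a normalized finger position x (0.0-1.0) to the
    nearest scale degree across the given octave range."""
    steps = len(PENTATONIC) * octaves
    i = min(int(x * steps), steps - 1)  # clamp x == 1.0 to the top note
    octave, degree = divmod(i, len(PENTATONIC))
    return root + 12 * octave + PENTATONIC[degree]
```

Feeding the surface-space coordinate from the touch mapping straight into a function like this is what would make the gesture feel like playing the distant screen itself.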
There are flaws with this technology, surface discovery and pairing being the most obvious. But along with the problems, it’s easy to see how this idea could be extended – moving past simple planar interaction and into the manipulation of real-world objects. We’re excited to see how far we can take it.