Have you ever worn a wearable headset? Have you tried Google Glass or any of its myriad competitors? After the initial (and deserved) sense of wonder at seeing digital and virtual content overlaid on the real world wore off, have you found yourself strangely frustrated just trying to use the device itself?
You wouldn’t be alone – smartphone companies (Apple chief among them) have labored diligently to irrevocably addict you to touchscreens and touch interfaces. To be fair, touchscreens have largely replaced mobile keyboards and deserve much of the credit for the meteoric rise and massive adoption of smart devices. But insidiously secreted away amid marketing language and shiny rectangles is the sentiment that gestures like “pinch-to-zoom” and “swiping” are only natural – if not pure instinct.
Imagine an iPhone – without a touchscreen. Imagine a tablet on which no amount of swiping or pinching will let you manipulate its contents. This is the reality of wearable computing and augmented reality devices – they’ve removed the necessity of touch. But then how do you use an application more than passively? How do you navigate to a different screen?
Wearable augmented reality devices rely on vision to display content. There are already forays into voice navigation (along with infuriating buttons and swiping motions on the glasses themselves), and some clever companies are using Bluetooth Low Energy to pair companion smartphones or new devices like the Enimai “Nod” companion “ring” to activate in-app features. So what then – projectors, or 3-D cameras for “finger tracking”? It’s hard to imagine a future where everyone is wearing AR glasses while obnoxiously yelling commands and waving their hands around in front of their faces, or furiously trying to dial phone numbers on their hands.
Okay – so it’s not that difficult.
But what if we could bypass all of that? What if we could use camera technology to get even cleverer with reality interaction? Enter Thermal Touch – a technology that will enable interaction with nearly any object or surface.
Thermal Touch – Turning your whole world into a touchscreen
Thermal Touch is a radical new approach to graphical user interfaces (GUIs) for wearable headsets. It uses infrared cameras to register and track the minute thermal imprints left by the heat signature of a finger. Touch your desk and you leave an imperceptible (and impermanent) heat map each time your finger meets the surface. By combining a thermal camera with a normal camera, and developing AR tracking in conjunction with thermal heat tracking, Metaio can now turn nearly anything into a touchscreen.
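As an illustration only (this is a hypothetical sketch, not Metaio’s actual algorithm), the residual-heat idea can be expressed in a few lines: subtract a baseline thermal frame from the current one, threshold the temperature rise, and take the centroid of the warm blob as the touch point. All names and values below are assumptions.

```python
import numpy as np

def detect_touch(baseline, frame, delta_c=1.5):
    """Return the (row, col) centroid of a residual warm spot, or None.

    baseline: thermal image (deg C) of the surface before the touch
    frame:    thermal image (deg C) captured after the finger lifts
    delta_c:  minimum temperature rise treated as a touch imprint
    """
    warm = (frame - baseline) > delta_c          # pixels warmed by the finger
    if not warm.any():
        return None
    rows, cols = np.nonzero(warm)
    return float(rows.mean()), float(cols.mean())  # centroid of the imprint

# Simulated 64x64 thermal frames: a 22 C desk with a ~5 C fingertip
# imprint centered at pixel (40, 20)
baseline = np.full((64, 64), 22.0)
frame = baseline.copy()
frame[38:43, 18:23] += 5.0                       # residual heat from a touch

print(detect_touch(baseline, frame))             # → (40.0, 20.0)
```

A real implementation would of course have to cope with sensor noise, drifting surface temperatures, and multiple blobs, but the core signal is exactly this temperature delta.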
Trak Lord, Head of Metaio US Marketing, sat down with Daniel Kurz, lead Metaio R&D Engineer and creator of the Thermal Touch prototype, to talk about the future of human-computer interaction.
It was a happy coincidence that we got our hands on a thermographic camera and played around with it in the lab. Our R&D team had already been tasked with developing natural and intuitive ways of interacting with Augmented Reality applications when using head-mounted displays. After measuring the temperature of my coffee mug and my display, and after discovering interesting temperature patterns in my face, I noticed that wherever I left my hand resting on the desk, residual heat would become apparent in the thermal image. Brief experiments with different objects showed that this is not a unique property of my desk – most objects exhibit warm spots after you touch them. The camera module further included a visible light camera, which makes it possible to recognize and track objects in its field of view. Putting two and two together, the idea arose that combining touch detection in the thermal image with detection and tracking of the touched objects in the visible image would enable a natural way to interact with those objects and the digital information associated with them – particularly for wearable headsets.
Can you describe how you built the prototype?
Our mobile prototype is based on a tablet PC to which we attached a combined thermal and visible light camera module. The fixture is simply a joist hanger I bought at the nearest do-it-yourself store. Our proof-of-concept software implementation is based on the Metaio SDK and therefore features the latest tracking capabilities for dealing with both planar and three-dimensional objects. It further provides the functionality to render virtual objects registered with the tracked objects. We had to extend the Metaio SDK to support capturing images from the thermographic camera, and I developed a prototypical touch detection algorithm. All in all it wasn’t really that much work, because most of the pieces already existed in our SDK. The last step was creating some exemplary applications to demonstrate the versatile opportunities this technology offers in different use cases.
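To give a feel for how the two cameras work together (again, a hypothetical sketch rather than Metaio’s code): once the visible-light tracker supplies a planar pose – here assumed to be a 3×3 homography – and the thermal camera reports a touch pixel in a registered image, mapping the touch onto the tracked object is a single homogeneous transform. The function name and matrix values below are invented for illustration.

```python
import numpy as np

def touch_to_object(h_obj_from_cam, touch_px):
    """Map a touch location (pixels in the camera image) onto the plane
    of a tracked object, given a 3x3 homography from the tracker.

    Assumes the thermal and visible cameras are registered, so a warm
    spot in the thermal image corresponds to the same pixel in the
    visible image the tracker uses.
    """
    x, y = touch_px
    p = h_obj_from_cam @ np.array([x, y, 1.0])   # homogeneous transform
    return float(p[0] / p[2]), float(p[1] / p[2])

# Hypothetical homography: the tracker reports the touched surface at
# half scale, offset by (5, 3) object units from the image origin.
H = np.array([[0.5, 0.0, -5.0],
              [0.0, 0.5, -3.0],
              [0.0, 0.0, 1.0]])

print(touch_to_object(H, (120.0, 80.0)))         # → (55.0, 37.0)
```

The object-plane coordinates can then be compared against the known layout of the surface – say, the buttons printed on a magazine page – to decide which virtual control was pressed.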
Ideally, what will Thermal Touch look like in the future? How many years are we from embedded infrared cameras?
This new way of interacting with Augmented Reality is clearly meant for wearable computers and head-mounted displays. These devices are becoming increasingly important, not only in the context of Augmented Reality, and because they have no touchscreens and leave the hands free, our technology is a perfect fit. We keep working on improving our prototype in terms of robustness and latency, and we are looking into how this fundamental approach can enable more advanced interaction techniques. For example, touching an object with different fingers might have different effects. Of course, it will take a couple of years until the first head-mounted devices include a thermographic camera. But the clear trend is that these cameras are becoming available in small form factors and at affordable prices. A mobile phone add-on enabling mobile thermal imaging will become available this year, and this is only the beginning. Once wearables are really being used ubiquitously, their hardware should be ready for Thermal Touch.
Though it may be years in the future, embedding infrared cameras into wearable computing is not beyond the realm of possibility, especially in an industry that is still iterating on form factor and hardware, let alone the ideal graphical user interface.