InsideAR Speaker Spotlight: Alissia Iljaitsch presenting BMW INNOVATION LIVE for Google Glass

September 15, 2014

InsideAR Munich 2014

Welcome to the fifth edition of our InsideAR Speaker Spotlight series.

Alissia Iljaitsch, Executive Director Vectorform


For our fifth edition of the InsideAR Speaker Spotlight we are proud to present Alissia Iljaitsch, Executive Director at Vectorform. Alissia’s career has taken her to the headquarters of leading automotive brands, and at Vectorform she works closely with the design and development teams tailoring digital experiences for companies such as Volkswagen AG, BMW AG and Microsoft Deutschland.

In the Speaker Spotlight she will talk about her latest project with Google Glass and her experiences with Augmented Reality. Let’s have a closer look!

Alissia, what are you going to present at InsideAR 2014?

Alissia Iljaitsch: I am very proud to be able to present our BMW INNOVATION LIVE augmented reality case study for Google Glass. The object recognition of the BMW i8 was realized with the Metaio SDK. My presentation will offer a look behind the scenes of the design, development and project management process of this highly innovative experience.

Have you ever attended InsideAR before and what was the most exciting part for you at previous events?

AI: It will be my second time attending InsideAR here in Munich. In August 2013 Vectorform started development for Google Glass, and I was inspired by the opportunities the Metaio SDK offered for the new hardware. After the event, our engineers and user experience specialists started evaluating the possibilities that would allow us to create a world-class augmented reality experience for our clients. The BMW i8 product presentation in combination with Google Glass was the perfect fit for demonstrating the capabilities of the new device to a public audience.


What is the next challenge in AR that you would like to see overcome?

AI: With the introduction of data glasses, augmented reality has the chance to take a big leap ahead, since the physical barrier of the mobile device is eliminated and the experience feels more natural. However, it will become increasingly important that the user experience design is perfectly adjusted to the user's environment, to prevent a gimmicky look and feel of the superimposed content. In my talk at the InsideAR conference I will present Vectorform's design principle of the Natural User Environment (NUE), a concept that focuses on factoring the environment into wearable application design.

What is your personal vision of the future of AR? What application do you wish you had today?

AI: The latency of object recognition is still a caveat. Users love an instant, quick response, but the speed of object recognition depends on many external factors such as lighting conditions, battery life and the processing power of the device. As device capabilities improve, we see huge steps towards providing the best experience for the end user.

The concepts for superimposing additional content are infinite, ranging from simple map applications for orientation to massive gaming scenarios.

Vectorform is currently developing the “Nargoo” application for the spatial orientation of dementia patients. I believe that such augmented reality applications will make a true impact on society in the future; the medical space in particular offers endless possibilities.

 

Thanks Alissia! We are looking forward to welcoming you at InsideAR Munich! Want to see Vectorform’s Google Glass demo? Watch the video below!


 InsideAR Munich takes place on October 29th and 30th, 2014 at the Kleine Olympiahalle in Munich. Learn more at http://www.insidear2014.com/

Need a ticket? Early Bird registration ends on September 30. Register here: http://www.insidear2014.com/register

 Interested in becoming a speaker or exhibitor? Get in touch with our team via: insidear@metaio.com.

Want to join us as a journalist or become a media partner? Send us an email: press@metaio.com.


Join our next webinar: “Engine Creative and Tesco Discover – AR Publishing for Brands”

September 3, 2014

Using Augmented Reality for print products is nothing new. However, many brands have recently begun to recognize that AR technology can indeed bring additional value to catalogues, brochures and product packaging. Experimenting over the last several years, they have realized that the technology is now stable, fast and affordable, even for companies with smaller marketing budgets. With augmented catalogues and well-thought-out AR applications, the technology can cover a wide range of possibilities: from delivering additional product information to customers, to directly integrated online stores. Moreover, brands can easily turn their print material into an interactive showroom.

Engine Creative, a UK-based company, created a great example for Tesco, a multinational grocery and general merchandise retailer. The Tesco Discover app enhances several Tesco print products via Augmented Reality and brings them to life. Engine Creative’s Chris Dun and Matt Key will join our next webinar this Thursday to discuss the Tesco Discover platform in relation to their continued development with Metaio’s workflow and features. They will approach the topic from both a technical and a creative viewpoint, looking at the use and development of AR for brands. As examples, we will cover four AR campaigns from 2014 that have been developed to drive in-store shopper engagement, on packaging and across Tesco publications.

Join us for the webinar this Thursday, September 4, at 8am PDT / 5pm CET, and register here: Metaio Developer Portal.


InsideAR Speaker Spotlight: Ronald Azuma

September 1, 2014

InsideAR Munich 2014

Welcome to the third edition of our InsideAR Speaker Spotlight series.

Meet Dr. Ronald Azuma!


For our third edition of the InsideAR Speaker Spotlight, we are happy to introduce Ronald Azuma of Intel Labs, where he leads a group that designs and prototypes novel experiences exploiting upcoming technologies. Known for defining the term “Augmented Reality,” Ronald also wrote the research paper “A Survey of Augmented Reality,” the single most referenced publication in the field of Augmented Reality and one of the “50 Influential Papers” listed by MIT Press. You can check out his website, but let’s see what he has to say about Metaio’s InsideAR and Augmented Reality:

What are you going to present at InsideAR 2014?

Ronald Azuma: I will talk about Leviathan, a series of AR demonstrations that Intel showed at CES 2014, with the support of Metaio, to inspire visitors about the potential for AR to enable new forms of storytelling. I will also discuss other strategies to make AR storytelling compelling.

Have you ever attended InsideAR before and what was the most exciting part for you at previous events?

RA: I attended InsideAR in 2012. One of the most exciting parts of this event is seeing detailed presentations about commercial usages of AR where the presenters provided details about the benefits and return on investment of using this technology in industrial and professional usages. I have rarely seen this information presented in other venues.

What is the next challenge in AR that you would like to see overcome?

RA: Many AR systems know very little about the surrounding real environment. Many know only the relative location of the AR display with respect to a marker or image, enabling augmentations only on or around that marker or image. What is needed is increased sensing, understanding, and incorporation of the surrounding real world, and making use of that information in our AR applications. One example of progress here is the incorporation of depth sensors into platforms. The Intel RealSense SDK will run on PCs and tablets that have depth sensors built in, and will incorporate Metaio’s 3D tracking capabilities to enable new types of AR experiences that track and sense complex 3D environments, without having to build models of such environments ahead of time.

What is your personal vision of the future of AR? What application do you wish you had today?

RA: AR will not achieve its full potential until it becomes the basis of new forms of location-based media, enabling new forms of storytelling and immersive experiences that are different from movies, TV, and video games today. I would love to see new, compelling commercial applications that use AR as a medium for novel types of storytelling experiences. Achieving that will require developing new enabling technologies, new forms of experiences, and new business models.

Catch more of Ronald Azuma’s views on Augmented Reality and storytelling in his presentation “Augmented Reality Storytelling: Leviathan and Beyond,” where he will explore Augmented Reality as the future of storytelling and offer a behind-the-scenes look into the Leviathan project.

That’s it from us for this week – be sure to visit last week’s spotlight and if you’d like to watch some of the presentations from last year’s InsideAR event, check out our InsideAR YouTube playlist.

 




Xenium Digital: Augmented Reality made in India

July 30, 2014

Xenium Digital, a Metaio Certified Developer from India, is constantly researching and developing new ways that Augmented Reality can be used to enhance the consumer experience in the Indian market. Thus, it is no surprise to see many well-known clients from industries such as medicine, automotive, real estate and entertainment team up with Xenium to create on-ground AR activations and mobile apps for their end consumers.

AR Therapy for Cancer Patients

The interactive “AR Therapy for Cancer Patients” app is a perfect solution for medical professionals promoting their anti-cancer drugs at medical conferences. This dedicated AR app was designed with the Metaio SDK for the iOS platform and is targeted at highly qualified doctors. The idea was to communicate the story behind the medicine in a better way and to provide patients with a simpler way of understanding how these anti-cancer drugs would affect them.

The app uses an ID marker to augment a mannequin into a virtual patient and uses case studies to illustrate the different diseases. 3D models of the drugs are then presented to demonstrate their efficacy at suppressing the disease.

From 3-D oil in action to TIGERTRAC

With regard to Augmented Reality in the automotive industry, an AR app using the Metaio AR platform was developed for Castrol. The app exclusively targets dealers and sales representatives to help them promote Castrol’s recently launched product, the Magnatec Start-Stop Engine Oil.

The AR app shows the new 3D oil in action. First, with the help of 3D animation, the app brings the idea behind Castrol’s new label design to life. When the label of the oil bottle is scanned with an Android-powered device, the app uses Augmented Reality to show the special molecules the oil is made of. It then follows up with a video of the product’s brand story. The app is designed to work perfectly on high-end devices.

Xenium Digital, along with Lakshya Media, also created a live Augmented Reality activation for TVS Tyres. The objective was to promote their latest product, the AGRI RADIAL TYRE – TIGERTRAC, at the REIFEN 2014 fair in an innovative way.

The idea was to create an AR activation featuring a Royal Bengal Tiger walking among the tires and audience members. The tiger’s movements around the AGRI RADIAL TYRE were well choreographed so as to provide realistic interaction with the visitors. Visitors could also take pictures with the tiger, which were later shared with them. Created with the state-of-the-art Metaio SDK for AR and enhanced with realistic graphics and audio, the activation effectively captivated the audience.


Lekar Hum Deewana Dil

With the release date of the movie “Lekar Hum Deewana Dil” quickly approaching, Eros International – with the expertise of Xenium Digital – wanted to promote it in the most interactive way possible. Their objective was to create a unique AR engagement platform for the audience to experience special moments with the actors. The end result was an on-ground activation – created with the Metaio SDK – called Groove with Armaan and Deeksha. Visitors to Mumbai malls were able to show off their dance skills alongside the movie’s lead actors, Armaan and Deeksha, to the song “Khalifa”.

Watch the video here:


Vectorform shows visionary sales application for Google Glass

July 24, 2014


Hardly any technology in recent times has been as controversial to society as Google Glass. Many have opinions, yet few have actually had the opportunity to experience the device first-hand. When it comes to “true” Augmented Reality, what is Glass really capable of? Vectorform, one of the leading agencies for technology and design, has for some time now been hard at work developing applications for wearable devices such as data glasses.

In association with Mediaplus and Serviceplan, Vectorform has become the first company in Germany to develop an AR vehicle exploration tool. Launching on the BMW i8 hybrid supercar, the Augmented Reality technology powered by the Metaio SDK, combined with Google Glass, gives users a first-hand look at the BMW i8 like never before:

“While social acceptance of new technologies often lags behind, companies need to seize the opportunity early in order to remain competitive. Google Glass opens up new approaches and perspectives for marketing, service, production and sales,” said Alissia Iljaitsch, Executive Director Vectorform EMEA. “Automobile manufacturers are often the pioneers of new technologies. In the sale of modern vehicles, the revolutionary glasses provide exactly the right framework to bring specific features into focus and to emphasize the innovation of the vehicles.”


Identify internal and external values with Google Glass

When promoting a vehicle to new buyers, companies often only have a short window of attention to present new and unique features. As such, innovative communication methods are always valuable. Google Glass combines the actual car exploration experience with the digital world in order to offer prospective buyers a completely new way of seeing the i8.

Here’s the application at a glance: distributed around the vehicle are various “touch points” where Google Glass recognizes the contours of the car using Metaio 3D tracking technology. The first touch point demonstrates the new laser headlights by projecting virtual light beams right out of the car’s lamps. Subsequent touch points illuminate the inner workings of the chassis or illustrate drive-train components with an “X-ray” view. In addition, the aerodynamic shape of the vehicle is demonstrated using a virtual wind current that smoothly curves and swirls around the car’s bodywork. This Augmented Reality experience accompanies the buyer throughout the customer journey and delivers an innovative sales tool for an equally innovative vehicle.

Content management as an important element

During the development of the Google Glass app, the implementation of an appropriate content management system played an essential role. It allows content (such as images, videos and texts) to be updated and maintained easily, directly by the company itself. This ensures the application brings added value on an ongoing basis, rather than being a one-off demonstration.

We would like to congratulate Vectorform and their partners on this excellent application. We are very proud that our software encourages developers and creative agencies like them to create such great Augmented Reality experiences!



Doing Magic with Augmented Reality… for real!

July 17, 2014

A guest post by Andreea Raducan

If you could do magic, what would be your favorite trick? Well, some might say: “let’s make money!” So too did Simon Pierro with his “How to get rich…!” magic trick, which he is now sharing with everyone in an amazing video proving that it is truly possible!

What is the mystery behind the “AR Money”?

The key ingredients are: a brilliant mind, some Augmented Reality “magic”, plus a sprinkle of “real” magic. Of these three elements, the AR part is perhaps the easiest one. Since Simon is already equipped with the other two, all he had to do was contact Metaio for support with the Augmented Reality part. It all started in 2010, when he first discovered AR and its potential. At a time when AR was known only to a few, Simon found Metaio: a team he describes as “very flexible, creative and open to new ideas”, with extensive knowledge and experience in Augmented Reality. Together, they built the “Christmas show”. “We had lots of fun”, he said, and “stayed in contact for a long time”. According to him, “Metaio is very reliable”, both in terms of technology and as partners. This is why he decided to contact Metaio again in 2014 – to “be creative together”.

Where is the boundary between Augmented Reality and the real magic?

You probably noticed the hand-drawn dollar bills in the video. The role of AR in this trick is to digitally recognize (or “track”) these bills and “transform” them into realistic-looking ones on the display of the device. One remarkable feature of this tracking process is the stability and robustness of the rendering: the digital banknotes precisely follow the movement of the physical ones beneath the tablet.

However, the “real” magic happens afterwards, when the physical, hand-drawn banknotes are transformed into real physical ones. Brilliant, isn’t it?


So how does he manage to do this, you might wonder?

Well, as much as I wanted to find out myself, I could not convince him to reveal his secret. What I did manage to discover is how ideas come to his mind. Contrary to what some might think, “inspiration doesn’t just come out of the blue”, Simon says. “You need to concentrate on what you do and look for opportunities around you, never stop thinking! Then, when you have an idea, you just have to believe in it and work hard to make it happen. You need to make an effort to find that idea and then to bring it to life; it will not happen magically.”

Thus did the “How to get rich…!” idea become reality. When Simon discovered AR, he became “absolutely fascinated” by the technology. “The factor that AR and magic have in common is that they both make things appear and disappear.” The question for him was how to bring these two together and make people believe in both. He was convinced that such a combination would open endless possibilities. He did find a way, and what an incredible way it is! Simon just loves to be creative and work with interesting people.


Simon also loves the freedom of pursuing his ideas, turning visions into reality, while interacting with so many people. He loves seeing the impact his work creates – the reactions he draws from people. Seeing how they relate to his work, “with tears in their eyes”. “Moreover, it is so interesting to see how people from different countries express their emotions and impressions differently!”

I asked Simon what he is doing with all the money. He laughed, saying “they all go with my illusion”. When asked about his future ambitions, Simon says: “I don’t have plans for the next 10 years. Things are changing so much! For example, 5 years ago I had no clue about Augmented Reality. Who knows what the future will bring? You should really stay flexible and open.”

For the time being, we are about to witness one of his dreams come true: performing live in front of a large public audience. This is something he has wished for a long time and which is going to happen, as he will be performing his magic tricks live in Mannheim (October 28) and Munich (October 29). If you want to see the magic for yourself, this is your chance! Until then, discover more on his website.


Thermal Touch: The Future of Wearable User Interfaces

June 3, 2014

Have you ever worn a wearable headset? Have you tried Google Glass or any of its myriad competitors? After the initial (and deserved) sense of wonder and awe wore off from perceiving digital and virtual content overlaid on the real world, have you found yourself strangely frustrated at just using the device itself?

You wouldn’t be alone: smartphone companies (Apple chief among them) have labored diligently to irrevocably addict you to touchscreens and touch interfaces. Not unkindly, touchscreens have largely replaced mobile keyboards and are largely to thank for the meteoric rise and massive adoption of smart devices. But insidiously secreted away amid marketing language and shiny rectangles is the sentiment that gestures like “pinch-to-zoom” and “swiping” are only natural, if not pure instinct.

Imagine an iPhone without a touchscreen. Imagine a tablet where no amount of swiping or pinching will allow you to manipulate its contents. This is the reality of wearable computing and augmented reality devices: they’ve removed the necessity of touch. But then how do you use an application more than passively? How do you navigate to a different screen?

Wearable augmented reality devices rely on vision to display content. There are already forays into voice navigation (along with infuriating buttons and swiping motions on the glasses themselves); some clever companies are utilizing Bluetooth Low Energy to pair companion smartphones or new devices like the Enimai “Nod” companion ring to activate in-app features. So what then: projectors, 3D cameras for “finger tracking”? It’s hard to imagine a future where everyone is wearing AR glasses while obnoxiously yelling commands and waving their hands around in front of their faces, or furiously trying to dial phone numbers on their hands.

Okay, so it’s not that difficult.

But what if we could bypass all of that? What if we could use camera technology to get even cleverer with reality interaction? Enter Thermal Touch: a technology that will enable interaction with nearly any object or surface.

Thermal Touch – Turning your whole world into a touchscreen

Thermal Touch is a radical new approach to wearable headset graphical user interfaces (GUIs). It utilizes infrared cameras to register and track the minute thermal imprints left by the heat signature of a finger. Touch your desk: you’re leaving imperceptible (and impermanent) heat maps each time your finger touches the surface. By combining a thermal camera with a normal camera, and developing AR tracking in conjunction with thermal heat tracking, Metaio can now turn almost anything into a touchscreen.
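To make the core idea concrete, a touch that leaves a residual heat spot can be found by comparing the current thermal frame against a cold baseline frame, thresholding the difference, and taking the centroid of the warm pixels. This is only a minimal illustrative sketch, not Metaio's actual algorithm; the frame data, resolution and threshold below are all hypothetical:

```python
# Sketch of thermal touch detection: locate the residual-heat spot left by a
# fingertip by comparing a thermal frame against a cold baseline frame.
# All names and values here are illustrative, not Metaio's implementation.

def detect_touch(baseline, frame, threshold=2.0):
    """Return the (row, col) centroid of pixels noticeably warmer than the
    baseline, or None if no warm spot is found.
    Frames are 2D lists of temperatures in degrees Celsius."""
    warm = [(r, c)
            for r, row in enumerate(frame)
            for c, temp in enumerate(row)
            if temp - baseline[r][c] > threshold]
    if not warm:
        return None
    rows = sum(r for r, _ in warm) / len(warm)
    cols = sum(c for _, c in warm) / len(warm)
    return (rows, cols)

# Example: a 4x4 surface at ~22 C with residual heat around (1, 2)
baseline = [[22.0] * 4 for _ in range(4)]
frame = [row[:] for row in baseline]
frame[1][2] = 27.5   # fingertip imprint
frame[1][3] = 25.0   # heat spread to a neighbouring pixel

print(detect_touch(baseline, frame))  # -> (1.0, 2.5)
```

In practice the hard part is exactly what the interview below discusses: distinguishing a deliberate touch from other warm objects, and doing it robustly and with low latency on a moving camera.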

Trak Lord, Head of Metaio US Marketing, sat down with Daniel Kurz, lead Metaio R&D Engineer and creator of the Thermal Touch prototype, to talk about the future of human-computer interaction.

Where did the idea for Thermal Touch originate?

It was a happy coincidence that we got our hands on a thermographic camera and played around with it in the lab. Our R&D team had already been tasked with developing natural and intuitive ways of interacting with Augmented Reality applications when using head-mounted displays. After measuring the temperature of my coffee mug and my display, and after discovering interesting temperature patterns in my face, I noticed that wherever I left my hand resting on the desk, residual heat would become apparent in the thermal image. Brief experiments with different objects showed that this is not a unique property of my desk; most objects exhibit warm spots after you touch them. The camera module further included a visible light camera, which allows recognizing and tracking objects in its field of view. Putting one and one together, this is how the idea arose that combining touch detection in the thermal image with detection and tracking of the touched objects in the visible image would enable a natural way to interact with those objects and the digital information associated with them, particularly for wearable headsets.

Can you describe how you built the prototype?

Our mobile prototype is based on a tablet PC to which we attached a combined thermal and visible light camera module. The fixture is simply a joist hanger I bought at the nearest do-it-yourself store. Our proof-of-concept software implementation is based on the Metaio SDK and therefore features the latest tracking capabilities for dealing with both planar and three-dimensional objects. It further provides the functionality to render virtual objects registered with the tracked objects. We had to extend the Metaio SDK to support capturing images from the thermographic camera, and I developed a prototypical touch detection algorithm. All in all it wasn’t really that much work, because most pieces already existed in our SDK. The last thing to do was to create some exemplary applications to demonstrate the versatile opportunities this technology offers in different use cases.
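The pipeline described above couples two results: a touch position detected in the thermal image, and the tracked pose of the touched object in the visible image. For a planar object, tracking yields a homography that maps image points onto the object's surface, so the touch can be tested against virtual button regions defined on that surface. The sketch below illustrates only that last mapping step; the homography values and button layout are hypothetical, not part of the Metaio SDK:

```python
# Sketch: map a touch detected in camera coordinates onto a tracked planar
# surface via a 3x3 homography, then decide which virtual button (if any)
# was hit. H and the button regions are hypothetical illustration values.

def apply_homography(H, x, y):
    """Map an image point (x, y) to surface coordinates using homography H."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (u / w, v / w)  # perspective divide

def hit_button(H, touch, buttons):
    """Return the name of the button region containing the mapped touch,
    or None. buttons: {name: (xmin, ymin, xmax, ymax)} on the surface."""
    u, v = apply_homography(H, *touch)
    for name, (x0, y0, x1, y1) in buttons.items():
        if x0 <= u <= x1 and y0 <= v <= y1:
            return name
    return None

# Identity homography for illustration (image coords == surface coords)
H = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
buttons = {"play": (0, 0, 50, 50), "info": (60, 0, 110, 50)}

print(hit_button(H, (75, 20), buttons))  # -> info
```

In a real system the homography would come from the tracker each frame, so the buttons stay anchored to the object even as the camera moves.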

Ideally, what will Thermal Touch look like in the future? How many years are we from embedded infrared cameras?

This new way of interacting with Augmented Reality is clearly meant for wearable computers and head-mounted displays. These devices are becoming increasingly important, not only in the context of Augmented Reality, and as they do not have touchscreens and leave the hands free, our technology is a perfect fit. We keep working on improving our prototype in terms of robustness and latency, and we are looking into how this fundamental approach can enable more advanced interaction techniques. For example, touching an object with different fingers might have different effects. Of course, it will take a couple of years until the first head-mounted devices include a thermographic camera. But the current trend clearly is that these cameras are becoming available in a small form factor and at an affordable price. A mobile phone add-on enabling mobile thermal imaging will become available this year, and this is only the beginning. Once wearables are really used ubiquitously, their hardware should be ready for Thermal Touch.

Though it may be years away, embedding infrared cameras into wearable computing is not beyond the realm of possibility, especially in an industry that is still iterating on form factor and hardware, let alone the ideal graphical user interface.

