This week we once again offered an Augmented Reality museum walk-through together with the Bavarian National Museum in Munich, and we were overwhelmed by all the attention we received afterwards on various blogs (such as the Huffington Post Germany) and on social networks. This attention shows that Augmented Reality is a genuinely interesting topic, both for the museums themselves and for their visitors. We therefore decided to give you a more technical insight into our museum project and asked our developers to talk about the development of our application. Here it is:
Hello, my name is Kevin and I am a member of the creative team here at Metaio. I worked together with my colleagues Alexei and Nicolai to create the assets and code for the DLD Bavarian National Museum AR experience.
Built in a very short time, the Bavarian National Museum application was a great opportunity to create a valuable cultural experience with AR technology. It aimed to create a balanced user experience between physical and digital content that could inform and entertain the average museum visitor. We focused on five pre-selected exhibition pieces and unveiled them as part of the DLD Conference in Munich.
The Metaio Toolbox: Easily creating 3-D tracking maps
The first step in developing the project was to visit the museum itself in order to come up with ideas and generate our 3-D tracking maps with the Metaio Toolbox. Many of the assets would be created off-site, so establishing an accurate reference that we could use back in the office was important. This was easy to do with the Metaio Toolbox, and within two or three attempts we had a satisfactory 3-D tracking model that we could later use in the Metaio Creator. We could also load our point clouds into Autodesk products for designing more complicated 3-D content by extracting them as OBJ files from the Metaio Creator.
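For readers curious what such an export looks like: an OBJ point-cloud file is essentially a plain-text list of `v x y z` vertex lines, which is why it imports cleanly into Autodesk tools. As a rough illustration (this is generic OBJ handling, not Metaio code), the positions could be pulled out like this:

```javascript
// Minimal sketch: extract vertex positions from an OBJ point-cloud export.
// A point-cloud OBJ is essentially a list of "v x y z" records; everything
// here is illustrative, not part of the Metaio toolchain.
function parseObjPoints(objText) {
  const points = [];
  for (const line of objText.split("\n")) {
    const parts = line.trim().split(/\s+/);
    // "v" records carry vertex positions; skip comments and other records
    if (parts[0] === "v" && parts.length >= 4) {
      points.push({
        x: parseFloat(parts[1]),
        y: parseFloat(parts[2]),
        z: parseFloat(parts[3]),
      });
    }
  }
  return points;
}

// Example: a tiny OBJ fragment with two points
const sample = "# exported point cloud\nv 0.1 0.2 0.3\nv -1.0 2.5 0.0\n";
console.log(parseObjPoints(sample).length); // → 2
```

Having the cloud in such a simple format is what makes it practical to use as a spatial reference when modeling content off-site.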
During our discussions with the museum's curators, there was concern that the new digital content would draw attention away from the physical artifacts, so we set out to create a design that would keep a respectful balance between the museum's cultural artifacts and the new digital information we were introducing into the environment. To achieve this, we kept AR content to the sides of the physical objects, used discreet white 3-D lines as indicators, and gave our buttons and texts semi-transparent backgrounds.
The AR pieces in detail:
In the Mary Magdalene scenario we presented the user with an audio explanation of the piece, some general background information, and a photographic overlay showing the sculpture’s former home (a church altar, color-corrected to better match the lighting scheme of the room).
In the Judith scenario we connected pieces of explanatory text to the model with 3-D white lines. Supported by a particularly strong 3-D map, the experience provided a great sense of depth and space to the user without distracting from the physical object. In order to ensure that the lines were a pleasant shape and length we imported the reference point cloud into Maya before constructing the 3-D lines.
The Munich city model was a real challenge for us because the lighting conditions in the room were very difficult: due to the sensitive state of some of the historical pieces, strong lights were not allowed in that particular room. This meant that getting a good 3-D map and lining up content with the physical model involved a lot of trial and error. But we managed, and in the end visitors could see a map of today’s Munich overlaid on the model.
The Moor’s head cup came with three nice reference images of the interior and base of the cup, which were not visible to visitors. To display them without taking attention away from the physical model, we created a thumbnail effect that shrank and grew the images when the user tapped on them. This was done by overlaying the images onto a 3-D object and adding a simple on-click animation in the Metaio Creator, along with an additional piece of code to allow for a secondary on-click animation.
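The extra piece of code essentially just alternates between the two animations on successive taps. The sketch below captures that toggle logic in plain JavaScript; the animation names and the `playAnimation` callback are illustrative stand-ins for whatever the AR runtime actually exposes, not the exact code used in the project:

```javascript
// Sketch of the two-stage tap behaviour: the first tap plays a "grow"
// animation, the next plays "shrink", and so on. Names are hypothetical
// stand-ins, not the project's actual identifiers.
function makeThumbnailToggle(playAnimation) {
  let expanded = false;
  return function onTap() {
    // Alternate between the primary and secondary on-click animations
    const name = expanded ? "shrink" : "grow";
    expanded = !expanded;
    playAnimation(name);
    return name;
  };
}

// Usage: wire the returned handler to the image object's tap event
const played = [];
const onTap = makeThumbnailToggle((name) => played.push(name));
onTap(); // first tap → "grow"
onTap(); // second tap → "shrink"
console.log(played); // → ["grow", "shrink"]
```

Keeping the toggle state outside the animation tool is what lets a single Creator-authored on-click animation be extended into a grow/shrink pair with only a few lines of script.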
In the case of the Flying Mercury we displayed large images of other artworks by the same artist, “floating” around the sculpture.
Balancing AR and non-AR content
AR tablet experiences have a short viewing time compared to other media because of the energy required to navigate the physical space, and in a museum many different people enjoy the exhibition at different paces. To create a more fulfilling experience, we supported each AR scenario with a non-AR content section: something people could easily switch to while sitting down and relaxing. This non-AR section contained text, audio and video and was accessible through a button at the bottom of the AR viewing screen. It was built as an offline webpage that was then integrated into the AR experience.
A developer’s point of view
From a developer’s point of view, this AR scenario was very forgiving. Its value derives from the simple, unobtrusive way it visualizes additional, well-designed content; the design and arrangement of the content, together with the stable tracking, already contribute a great deal to the experience. My colleagues finalized the concepts and designed the assets, then combined and positioned everything with the help of the Metaio Creator. The development effort was therefore quite manageable.
In the end, AREL technology enables you to easily create slick and effective user interfaces for your AR scenarios.