Intelligent computing meets augmented reality: Brutus guides you through Ohio State's campus

May 14, 2013

As promised last week on our junaio Blog, we decided to provide you with more information on the incredible application 'Find Brutus', created by Bradley A. Henry, Software Developer/Engineer at Ohio State University. The app helps students explore the Ohio State campus on their Android devices. Named after the school's mascot, Brutus Buckeye, the augmented reality application cleverly uses voice recognition, geolocation, artificial intelligence and intelligent tutoring systems to provide its service to the campus community.

In an interview given exclusively to metaio, Bradley A. Henry speaks about the idea behind the app, the technical details and the ongoing work.

What actually is ‘Find Brutus’?

In technical terms: Find Brutus is an Intelligent Mixed Reality (IMR) application using a Virtual Tour Agent (VTA – patent pending) framework. An Intelligent Mixed Reality application combines Augmented Reality (AR), voice recognition, geolocation, Artificial Intelligence (AI) and intelligent tutoring systems (ITS) into a mobile-accessible user experience.

In broader terms: Find Brutus is a mobile virtual tour guide for incoming students and visitors to The Ohio State University, built as a graduate research project. A user explores campus through pre-defined target locations. As the user approaches a location, a 3D Brutus appears in the mobile view to indicate that they are nearing the target. Once the user identifies the location through their mobile device, they are prompted with a few questions about the target, along with hints. After answering each question, the user moves on to the next target. Version 2 is where the VTA gets really exciting; see below.
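
To make that flow concrete, here is a minimal sketch in Java of the geofence-driven tour loop described above: approach a pre-defined target, show the 3D Brutus marker, ask the quiz questions, then advance to the next target. The TourTarget and QuizQuestion classes and the 30-meter trigger radius are illustrative assumptions, and the actual rendering of Brutus through the Metaio SDK is not shown.

    // Sketch of the geofence-driven tour loop; names and the radius are
    // illustrative assumptions, and the AR rendering of Brutus is omitted.
    import android.location.Location;
    import java.util.List;

    public class TourGuide {

        static final float TRIGGER_RADIUS_METERS = 30f; // assumed proximity threshold

        static class QuizQuestion {
            final String prompt, hint, answer;
            QuizQuestion(String prompt, String hint, String answer) {
                this.prompt = prompt; this.hint = hint; this.answer = answer;
            }
        }

        static class TourTarget {
            final String name;
            final double latitude, longitude;
            final List<QuizQuestion> questions;
            TourTarget(String name, double latitude, double longitude,
                       List<QuizQuestion> questions) {
                this.name = name; this.latitude = latitude;
                this.longitude = longitude; this.questions = questions;
            }
        }

        private final List<TourTarget> targets; // pre-defined campus locations, in tour order
        private int currentIndex = 0;

        TourGuide(List<TourTarget> targets) { this.targets = targets; }

        /** Call from a LocationListener whenever the device position updates. */
        public void onLocationUpdate(Location here) {
            if (currentIndex >= targets.size()) return; // tour finished
            TourTarget target = targets.get(currentIndex);
            float[] distance = new float[1];
            Location.distanceBetween(here.getLatitude(), here.getLongitude(),
                    target.latitude, target.longitude, distance);
            if (distance[0] <= TRIGGER_RADIUS_METERS) {
                showBrutusAt(target);  // 3D Brutus appears in the camera view as the indicator
                askQuestions(target);  // quiz the user; answering unlocks the next target
            }
        }

        private void showBrutusAt(TourTarget target) {
            // Placeholder: the production app anchors the licensed 3D Brutus Buckeye
            // model at the target's geolocation through the AR SDK.
        }

        private void askQuestions(TourTarget target) {
            // Placeholder: present target.questions (with hints) in the UI;
            // once all are answered, move on to the next target.
            currentIndex++;
        }
    }

In the real app, onLocationUpdate would be fed by Android's LocationManager, and the quiz UI would gate the call that advances the tour to the next target.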

Is it already a working application and in which context did you create it?

Yes, the beta version is complete. We used the free version of the Metaio SDK for Android. We will begin working on the iOS application over the summer, as well as a version for Google Glass, because the most exciting news is that this project is also part of the Google Glass Explorer pilot. We will be using Google Glass with the application to study student interactions and cognitive processing. The design of this version is to study the affect of an incoming student to the university. The goal is to increase students' knowledge of the university campus and its resources while acclimating them to their surroundings. If successful, this should improve students' experience in their first year attending The Ohio State University.

How many people were involved in the development of the application?

Find Brutus is the framework for my PhD thesis, but nevertheless a lot of players are involved: it is truly a community of engineers working on this application. I have counted over 80 people who have touched this project to some degree, currently spanning more than 8 colleges and 6 departments at the university. To me, this is what it looks like when organizations and people work together.

The work also included the College of Engineering Capstone project, which just won the CETI best-in-class project award. This was a pretty extraordinary surprise, considering the level of competition. I have been blessed to work with some of the most amazing individuals, and I was overwhelmed that the project was selected.

Virtual Tour Guiding has been a dream of metaio for a long time now, and we are determined to bring the concept of the "Augmented City" into real life. In which fields do you see real value for VTG?

The long-term objective is to use the framework as a mechanism for navigating locations and buildings, and as an educational device that includes simulations working within real-world environments. For example, it could educate professionals or students, such as nurses or doctors, in their work environment. Here is how this would work: in a fully equipped emergency room, a learner wearing a pair of Google Glasses would be required to resolve simulated, problem-based scenarios that interact with the environment. A doctor could interact with a virtual nurse while performing surgical procedures.

Can you think of other examples of using AR and Virtual Tour Guiding in an educational environment?

Another example could be the Thompson Library Foundation Stones Tour (an idea submitted at the OSU AR Hackathon): a student would view the foundation stones through their mobile device. The stones would provide information, such as origin, ethnicity and world regions, along with text and voice translation capability. The student could tour the stones, seek information on specific origins, request material such as publications, videos and research available through the OSU library, or tour the library itself. An additional concept submitted by the University Archives is a historical view of the campus through time. Using AR, a user could explore the campus in any given decade with Glass. For example, you could be on the Oval wearing Glass and prompt the VTA to view the campus in 1850. As your line of sight moves, images and information about the campus in 1850 would appear in the glasses.

What future projects are already lined up?

We are putting on a one-day display at the Columbus Museum of Art; fellow students, the Ohio Film Commission and Columbus Fashion Week are also participating. We are currently discussing a fall project, and in addition a live DJ event with augmented reality later in the year. I work with a very creative group of people.

Thank you, Bradley, for the interview!

Website of EduTechnological

Version 1 capabilities:

  • Mobile accessible
  • Augmented reality
  • Geo-location notifications
  • Includes the first 3D Brutus Buckeye approved through OSU licensing.

Version 2 capabilities:

  • Voice activation (Siri-type functions)
  • AI/Intelligent computer interactions (Collaboration with the University of Memphis FedEx Institute of Technology)
  • Geo-location direction service (ask for specific outdoor directions from your current location; a rough sketch follows this list)
  • First scanned 3D Human Agent
  • Web Accessible Agents (Collaboration with the University of Memphis FedEx Institute of Technology)
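
As a rough illustration of the planned geo-location direction service above, here is a minimal sketch assuming the answer is simply a compass bearing and walking distance computed with Android's Location utilities. The class and method names are hypothetical; this is not the project's actual implementation.

    // Hypothetical sketch of a geo-location direction service: answer
    // "how far, and in which direction, is the target from here?"
    import android.location.Location;

    public class DirectionService {

        /** Returns a short, spoken-style direction for a named target. */
        public static String directionsTo(Location here, double targetLat,
                                          double targetLon, String targetName) {
            float[] results = new float[2];
            // results[0] = distance in meters, results[1] = initial bearing in degrees
            Location.distanceBetween(here.getLatitude(), here.getLongitude(),
                    targetLat, targetLon, results);
            float meters = results[0];
            float bearing = (results[1] + 360f) % 360f; // normalize to 0..360
            String[] compass = {"north", "northeast", "east", "southeast",
                                "south", "southwest", "west", "northwest"};
            String heading = compass[Math.round(bearing / 45f) % 8];
            return String.format("%s is about %.0f meters to the %s of you.",
                    targetName, meters, heading);
        }
    }

A voice front end (the Siri-type functions above) could then pass the recognized target name and its coordinates to this service and speak the result back to the user.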
