How to Build An Augmented World

Escaping from reality has been a dream for centuries.

Storytelling, publishing, cinema, video games and more all serve this same objective. In previous posts, we’ve discussed digital’s latest contribution to this goal: Augmented Reality (AR). Here, we’ll take a look at some of the technical opportunities and challenges of this new development medium.

Augmented reality?

The Augmented Reality process starts in the real world: a camera films the scene and captures data. After analyzing this information, a computer adds virtual elements and displays the result on your screen. The new, augmented video is called a mixed world: a space between reality and the virtual world.
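
To make this concrete, here is a minimal sketch of that loop in Python with OpenCV: grab a frame from the camera, blend a virtual element into it, and display the mixed result. The overlay file, its fixed placement and the camera index are assumptions for illustration; a real application would position the element using tracking data.

```python
import cv2

# Minimal sketch of the AR loop: capture the real world, add a virtual
# element, display the mixed result. "virtual_element.png" and its fixed
# placement are hypothetical; a real app would place it from tracking data.
overlay = cv2.imread("virtual_element.png")
if overlay is None:
    raise SystemExit("put a small virtual_element.png next to this script")
overlay = cv2.resize(overlay, (160, 120))

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = overlay.shape[:2]
    roi = frame[0:h, 0:w]
    # Blend the virtual element into one corner of the real image.
    frame[0:h, 0:w] = cv2.addWeighted(roi, 0.5, overlay, 0.5, 0)
    cv2.imshow("mixed world", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```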


This technology can be used on many platforms: on websites (Transformers Experience, Weet-bix, etc.) and in iPhone applications (Metro Paris Subway, Yelp). Even printed books are experimenting with the technology: some really nice work has been done with the Magic Book. Though the technology driving it is 10 years old, AR has only just begun to be incorporated into commercial projects. By mixing our tangible, everyday world with a virtual one, AR brings useful creative possibilities for everyone.

Building an AR World

With four other classmates at Les Gobelins, I worked on an AR project called Mook-e. The project was an immersive interactive installation where the user discovers a secret world inhabited by a strange moon creature.  Apologies for the videos being in French, but you can still get the gist of how the system works.

Rather than be content with a visual illusion, we also wanted to add a tactile element, so the user could physically touch and handle the Mook-e. To accomplish this, we used a textured sphere for the Mook-e creature, as you can see in the demo video.

To make it all work, we had to deal with lighting, tracking, time, and the user’s path. Since the user is free to move about inside the installation, we had to carefully trigger sequences and events.
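
As a rough illustration (not our actual installation code), the triggering logic can be as simple as firing each sequence the first time the tracked user position enters a predefined zone. The zone names and coordinates below are made up.

```python
from dataclasses import dataclass

@dataclass
class Zone:
    """A rectangular floor area that fires its sequence once."""
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    fired: bool = False

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

# Hypothetical zones inside the installation.
zones = [
    Zone("carpet", 0.0, 2.0, 0.0, 2.0),   # start the intro sequence
    Zone("table", 2.0, 3.5, 1.0, 2.5),    # wake up the Mook-e
]

def on_user_position(x: float, y: float) -> None:
    """Called once per frame with the user's tracked position (metres)."""
    for zone in zones:
        if not zone.fired and zone.contains(x, y):
            zone.fired = True
            print(f"triggering sequence: {zone.name}")

# Example: the user walks from the entrance onto the carpet.
on_user_position(0.5, 0.5)
```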

Information you can track

The main issue when creating an AR application is deciding what objects the computer will track. Is it the user’s location, as in Metro Paris Subway? Is it a page, as in the Magic Book? Or is it a face, as in the Transformers Experience? Each answer implies different solutions and raises different questions. For example, each of these operates in one of two detection modes: passive or active.

Passive mode

The detection step is how the computer finds what it’s looking for in each image: faces, patterns or pictures. A powerful computer and a good camera are important to keep the tracking smooth and the delay between the camera’s image and the final display as short as possible. There are several solutions already available to developers:

  • FLARToolkit: Mostly used for websites, this Flash library is based on ARToolKit (the C library). It uses a black square for detection and a pattern inside it to distinguish between markers (a similar square-marker approach is sketched after this list). Unfortunately, this technology loses track of objects very easily.
  • D’Fusion: Total Immersion’s software, which works with real pictures, faces and 3D objects. To track an object, it creates a recognizable signature based on interest points, so even if you hide part of the object, tracking still works. Currently, their software is used mostly in interactive installations, but they’re bringing it to the web with various projects like the previously mentioned Transformers Experience.

  • PTAM: The process seems to be the same as D’Fusion’s, with one important difference: the software is free. So don’t hesitate to try it!
  • That’s all? Not at all! Companies like Sony are already providing libraries (Vision Libraries) to add AR to games using the EyeToy.
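
To give a feel for passive-mode detection, here is a rough sketch of the black-square approach using OpenCV’s ArUco module (shipped with opencv-contrib-python) rather than any of the libraries above; the marker dictionary and camera index are assumptions, and the commercial tools differ in their details.

```python
import cv2

# Detect square fiducial markers in the webcam feed; a real AR app would
# render 3D content on top of each detected marker instead of just
# drawing its outline. Requires a recent OpenCV with the ArUco module.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is not None:
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("passive tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```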

Active mode

Compared to passive tracking, active tracking is faster, more powerful and allows you to build more complex augmented reality. It works with sensors: GPS, a compass and the like. That’s largely why the iPhone is seeing such a surge of AR applications: Apple provides everything necessary to locate a user in space. All you need to do is use their geo-coordinates to display what you want. Active AR is mostly used in mobile devices, amusement parks and interactive installations. Unless you already have the interfaces and code samples, using the active mode definitely takes more time and costs more.
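
As a rough illustration of that geo-based placement (not taken from any of the apps mentioned above), the sketch below turns the user’s GPS position and compass heading into a horizontal screen position for a point of interest; the coordinates, field of view and screen width are made-up values.

```python
import math

def bearing_to_poi(user_lat, user_lon, poi_lat, poi_lon):
    """Initial great-circle bearing (degrees from north) from the user
    to a point of interest."""
    phi1, phi2 = math.radians(user_lat), math.radians(poi_lat)
    dlon = math.radians(poi_lon - user_lon)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def screen_x(bearing, heading, fov_deg=60, screen_width=320):
    """Horizontal pixel position of the POI, assuming a simple camera
    with the given horizontal field of view; None if it is off-screen."""
    offset = (bearing - heading + 180) % 360 - 180   # -180..180 degrees
    if abs(offset) > fov_deg / 2:
        return None
    return int(screen_width / 2 + offset / (fov_deg / 2) * (screen_width / 2))

# Example: a point of interest due east of a user standing in Paris,
# with the phone's compass reporting that the camera faces east (90 degrees).
b = bearing_to_poi(48.8566, 2.3522, 48.8566, 2.3622)
print(screen_x(b, heading=90))   # roughly the middle of the screen
```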

Possibilities

Augmented reality can be used almost everywhere; all you need to do is exercise your creative muscle. In laboratories to display information, in amusement parks, in books, in video games: this technology has utility everywhere, marketing included!

How we Implemented Mook-e

We used Total Immersion’s technology. D’Fusion provided enough functionality to build a real story around the user’s virtual meeting with the Mook-e. As previously mentioned, the D’Fusion software can recognize a marker from only a portion of it, so this choice allowed us to track huge elements such as a carpet, a table and paintings on the wall, as well as smaller ones like our textured sphere (the Mook-e). To reinforce the user’s experience, we used glasses with one LCD screen for each eye and an HD camera on the front, so the user keeps their own first-person view of the scene. That way, users didn’t have to watch a separate screen, creating a more immersive experience.


User feedback

We discovered that users experiencing the installation become really involved in the virtual world and don’t pay much attention to the realism. For example, nobody noticed that turning the Mook-e sometimes hid parts of their fingers. On the other hand, the experience loses realism whenever the software bogs down and the user loses the sense of directly experiencing the scene. Delay should be avoided as much as possible by optimizing your 3D objects (removing shaders, reducing the polygon count, simplifying textures).
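
For the “reduce the polygons” part, one way to prototype that today is automatic mesh decimation; below is a hedged sketch using the Open3D library (not the tooling we used on Mook-e), with a hypothetical model file and target triangle count.

```python
import open3d as o3d

# Reduce the polygon count of a 3D model before using it in the AR scene.
mesh = o3d.io.read_triangle_mesh("mook-e.obj")          # hypothetical asset
print(f"original: {len(mesh.triangles)} triangles")

simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=5000)
print(f"simplified: {len(simplified.triangles)} triangles")

o3d.io.write_triangle_mesh("mook-e_low_poly.obj", simplified)
```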

What’s next?

More exciting technologies are on the way, especially tangible holograms: an evolved form of AR in which you can touch objects that only exist inside the virtual world. To implement this level of tactile feedback, developers use a system based on focused ultrasound: the sound waves produce pressure on your skin, making you feel something that isn’t there. Few such systems exist in the world; the best known is the “Airborne Ultrasound Tactile Display” created at the University of Tokyo.

Other projects to check out!

Uses of augmented reality are exploding, with better implementations appearing every day. The video game industry is clearly in the race. Here are two projects worth checking out, one by Sony, soon in stores, and the other by Microsoft:

Sony’s EyePet
Microsoft’s Project Natal