Out of Body

Behind the Scenes

Mthokozisi "Hap" Sibanda

Background

This experience is a play on the phenomenon of the out-of-body experience. You might have seen this play out in movies featuring the Marvel character Dr. Strange. In this AR experience, you open your mouth to expel your ghost, an interaction inspired by a scene from the popular anime Dragon Ball Z. The 3D assets are prepared in the 3D software Blender, and the AR implementation is done in the AR creation software Meta Spark.

Dr. Strange pushing out the spirit of Spider-Man in the movie: Spider-Man: No Way Home
Gotenks breathing out a ghost in the anime: Dragon Ball Z

Preparing Assets

In this experience, we want to show a ghostly version of the user's face. So, in Blender, we create a ghost using a 3D head model as a base.

Ghost model next to 3D face mesh reference

Once we have our ghost model, we need to be able to animate it. So we place deformation bones onto our ghost and use them to create two animations: the ghost flying out of the mouth, and the ghost settling into place.

These animations are created such that they can be played sequentially in our AR project.

By using an accurately sized face model as a reference, we can clearly see how the animations will look in the AR project. Once we are satisfied with the model and animations, we export our ghost as a 3D object file.

Animation of the ghost flying out of the mouth in Blender (the blue highlighted objects are the deformation bones)

AR Implementation

In Meta Spark, we import the ghost and set up our scene so that the animations play as we mocked them out in Blender. We then split the user experience into two phases:

Intro Phase

When the user first opens their mouth, we begin by playing the animation of the ghost flying out. During this phase, the ghost model's position is anchored to the user's head. We also gradually fade out the camera feed and replace it with a background image.
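The camera-feed fade can be expressed as a simple ramp on the background image's opacity over the intro. A sketch, assuming a hypothetical `backgroundOpacity` helper driven by the time since the mouth first opened:

```typescript
// Linear fade from camera feed to background image (hypothetical helper).
// t: seconds since the mouth first opened; duration: fade length in seconds.
// Returns 0 for pure camera feed, 1 for the background fully visible.
function backgroundOpacity(t: number, duration: number): number {
  return Math.min(Math.max(t / duration, 0), 1);
}
```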

Main Phase

Once the flying animation completes, we unanchor the ghost's position from the user's head. We then play the settling animation while also moving the ghost to the center of the user's screen. After this animation completes, we allow the user to control the ghost's on-screen movement by using their own head as a joystick.
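The head-as-joystick control amounts to mapping head rotation to an on-screen velocity. Below is one way to sketch that mapping in TypeScript; the dead zone and speed values are assumptions for illustration, not the project's tuned numbers.

```typescript
// Head-as-joystick sketch: map head rotation (radians) to ghost velocity.
// deadZone ignores tiny rotations so the ghost can rest while the head is
// roughly neutral; deflection is capped at 0.5 rad for full speed.
function headToVelocity(
  yaw: number,       // left/right head rotation
  pitch: number,     // up/down head rotation
  deadZone = 0.05,   // assumed dead-zone threshold
  speed = 300        // assumed pixels per second at full deflection
): { vx: number; vy: number } {
  const axis = (a: number): number =>
    Math.abs(a) < deadZone
      ? 0
      : Math.sign(a) * Math.min(Math.abs(a), 0.5) * 2 * speed;
  return { vx: axis(yaw), vy: axis(pitch) };
}
```

Each frame, the ghost's position would be advanced by this velocity scaled by the frame's delta time.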

Taking time to move the ghost to the center of the screen is important: depending on the user's head position and orientation relative to their phone, the ghost can fly off-screen during the Intro Phase, so we want to make sure the user can see where the ghost is before they're allowed to control it.
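Moving the ghost toward the center can be done with a per-frame linear interpolation step, which works regardless of where the ghost ended up after the intro. A minimal sketch (the function name is hypothetical):

```typescript
// Per-frame step toward the screen center: cover a fraction `alpha`
// of the remaining distance each frame (0 = no movement, 1 = snap to center).
function stepTowardCenter(
  pos: { x: number; y: number },
  center: { x: number; y: number },
  alpha: number
): { x: number; y: number } {
  return {
    x: pos.x + (center.x - pos.x) * alpha,
    y: pos.y + (center.y - pos.y) * alpha,
  };
}
```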

Intro Phase transitioning into the Main Phase (notice how the ghost is initially anchored to the user's face)

Finally, we implement an optional mini-game that the user can play during the Main Phase. This gives the user something to do while controlling the ghost. However, some users might prefer to just move the ghost around aimlessly. So, we implement an option to let the user choose which version of the experience they want to use.

Optional collection mini-game being played

Flow diagram of application logic
Try the experience yourself