Arasaka Malfunction is an AR face experience inspired by the video game Cyberpunk 2077. It was created for Meta's Cyberware hackathon. The experience lets you control a cybernetic visor and two floating drones, all of which can shoot lasers. However, your tools don't always work like they're supposed to...
The visor worn by the user is a 3D model that was provided for the hackathon. The model comes with "deformation bones" that can be used to animate the visor. When we apply this visor to the face, we map these bones to the user's head. So when the user moves their head, the bones follow the head movement and the visor deforms in a convincing way.
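The bone-to-head mapping can be sketched as plain code. This is an illustrative sketch only, not actual Meta Spark code (which would use the Spark AR scripting API or visual patches); the bone names and influence weights are hypothetical, not from the real hackathon asset.

```javascript
// Hypothetical sketch: drive the visor's deformation bones from the
// tracked head rotation. Bone names and weights are made up for
// illustration; the real asset defines its own rig.
function deformVisor(headRotation, bones) {
  // headRotation: { pitch, yaw, roll } in radians, from the face tracker.
  return bones.map((bone) => ({
    name: bone.name,
    // Each bone follows the head rotation, scaled by its own influence
    // weight, so some bones deform more than others as the head moves.
    rotation: {
      pitch: headRotation.pitch * bone.weight,
      yaw: headRotation.yaw * bone.weight,
      roll: headRotation.roll * bone.weight,
    },
  }));
}

// Example: a strongly-influenced hinge bone and a lightly-influenced rim bone.
const bones = [
  { name: 'visor_hinge', weight: 1.0 },
  { name: 'visor_rim', weight: 0.4 },
];
const posed = deformVisor({ pitch: 0.2, yaw: -0.1, roll: 0.0 }, bones);
```

Re-evaluating this mapping every frame is what makes the deformation track the user's movement rather than playing a fixed clip.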
For additional assets, we use a custom-made drone model, modelled and textured in the 3D software Blender. In Blender, we also create an animation of the drone malfunctioning and "bake" it into the drone's 3D object file. We then import the file into our AR creation software, Meta Spark, where we program a hovering animation for the drones and make them copy the user's head movement.
Both the hover animations and the visor deformation are examples of procedural animation. Unlike the drone malfunction animation we did in Blender, these procedural animations are not "baked" into the 3D objects themselves, but are instead controlled and determined at runtime using algorithms.
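A minimal sketch of what this kind of runtime animation looks like, written as plain code rather than actual Meta Spark patches or scripting; the function names and constants are assumptions for illustration.

```javascript
// Procedural hover: a vertical bob computed from the current time with a
// sine wave, instead of a keyframe clip baked into the model.
function hoverOffset(timeSeconds, amplitude = 0.05, frequency = 0.5) {
  // amplitude in scene units, frequency in cycles per second.
  return amplitude * Math.sin(2 * Math.PI * frequency * timeSeconds);
}

// Procedural head-follow: each frame, nudge the drone's yaw a fraction of
// the way toward the tracked head yaw, giving a smooth, lagging follow.
function followHead(droneYaw, headYaw, smoothing = 0.1) {
  return droneYaw + (headYaw - droneYaw) * smoothing;
}
```

Because both values are recomputed every frame from live inputs (time and head pose), the motion adapts to the user instead of replaying fixed keyframes, which is the defining trait of procedural animation.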
Next, we set up logic so that when the user taps the screen, a loading bar appears across the user's visor. The bar serves as a UI element that blends seamlessly into the scene, and it tells the user both that something is about to happen and when it will happen. When the bar finishes loading, one of two things happens: either the equipment fires its lasers as intended, or it glitches and malfunctions.
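The tap-to-fire logic can be sketched as plain code. This is a hypothetical outline, not actual Meta Spark scripting; the function names, the fixed duration, and the 50/50 random split are assumptions for illustration.

```javascript
// Loading bar: map elapsed time since the tap to a fill fraction in [0, 1].
function loadingBarProgress(elapsedSeconds, durationSeconds) {
  return Math.min(elapsedSeconds / durationSeconds, 1);
}

// When the bar completes, pick one of the two outcomes. `roll` is a random
// number in [0, 1), e.g. from Math.random(); the 50/50 split is assumed.
function resolveOutcome(roll) {
  return roll < 0.5 ? 'lasers-fire' : 'malfunction';
}
```

Keeping the outcome selection separate from the timing logic makes it easy to rebalance how often the malfunction happens without touching the loading bar.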
These visual effects are created using a combination of special effects programming (i.e., graphics shaders) and animated particles (using Meta Spark's particle effects system).
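To illustrate the shader side, here is a toy glitch effect written as plain code rather than an actual Meta Spark shader patch: the red and blue channels sample the texture at horizontally shifted coordinates, faking the chromatic-aberration look of a malfunctioning display. The formula and constants are assumptions for illustration.

```javascript
// Toy glitch "shader": given texture coordinates (u, v) and the current
// time, compute per-channel sample coordinates. The red and blue channels
// are offset in opposite directions by a time-varying amount.
function glitchUV(u, v, timeSeconds, strength = 0.02) {
  const offset = strength * Math.sin(40 * v + 10 * timeSeconds);
  return { redU: u + offset, greenU: u, blueU: u - offset, v };
}
```

In a real shader this function would run per pixel on the GPU; the particle effects layer on top of it for sparks and debris.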
Finally, we implement application logic for the following user flow: