A Mixed Reality (MR) interface system for Meta Quest featuring three-dimensional interactions and effects.
Mixed Reality
Spatial UI System
Objective
Create the first internal framework for Kodansha VR Lab to enable interfaces that leverage three-dimensional interactions and assets.
Client: Kodansha VR Lab
Scope
Create an engaging spatial UI that leverages the Mixed Reality (MR) capabilities of the Meta Quest. Guide the engineering team to build the system from the ground up and implement it in production content.
Role: Concept, Prototyping, UI/UX, Visual Design
Outcome
A versatile UI system for integrating 3D assets and VFX, providing a cohesive interface aligned with the overall visual direction of the production project.
Starting Point
Current Extended Reality (XR) UIs
Established UIs often draw from traditional 2D designs, resulting in a "flat" appearance. Users in 3D environments interact with static panels, buttons, and menus that could exist on a conventional screen. This approach lacks immersion and intuitiveness.
Pain Points
Poor visual feedback
Lack of satisfying visual feedback during interactions, leaving users feeling disconnected. No depth-based or dynamic feedback.
Constraints of interaction
Point-and-click mechanics do not allow natural, intuitive gestures like grabbing or pushing.
Limited spatial awareness
2D interfaces create a disconnection between the immersive experience and the UI, making it feel out of place and clunky.
Low engagement
A lack of playfulness runs counter to the sense of discovery that characterizes experiential and exploratory content, particularly gaming.
Objectives
Immersive interactivity
Create an intuitive, immersive XR interface that leverages dynamic visual feedback, natural 3D gestures, and playful, exploratory interactions.
Visual coherence
Seamlessly integrate the UI into a format that leverages the Quest 3's spatial capabilities by blending the passthrough camera with a fully immersive environment, allowing characters to interact with the player and their space. The central theme revolves around defeating the monsters.
The experience utilized Meta Quest 3's spatial data for MR interaction
A monster bursts through the player's wall from an outside world during the experience
In the experience, bad-tempered monsters are augmented onto the player's room walls. The main character assists the player in defeating these monsters.
Interaction and Visual Design Ideation
With the interactivity centered around the monster character, I investigated different types of interactions through a full spectrum of iterations, from sketches to quick 3D models and prototyping in the real-time engine Unity.
Reference Color Scheme
Chinese Purple
Monster Body
#6A199E
Tangerine
Slime Color
#EF8300
Waterspout
Character & Props highlight
#A0F2F1
Rich Lilac
Prop Base Color
#C26EC4
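For reference, the palette above can be captured as a simple lookup table, e.g. for design tooling or for feeding shader parameters. The names, roles, and hex values come from the scheme above; the structure and helper function are only an illustrative sketch, not part of the production system.

```python
# Reference color scheme for the monster-themed UI.
# Hex values as listed in the palette above.
REFERENCE_COLORS = {
    "Chinese Purple": {"role": "Monster Body", "hex": "#6A199E"},
    "Tangerine": {"role": "Slime Color", "hex": "#EF8300"},
    "Waterspout": {"role": "Character & Props highlight", "hex": "#A0F2F1"},
    "Rich Lilac": {"role": "Prop Base Color", "hex": "#C26EC4"},
}

def hex_to_rgb(hex_color: str) -> tuple[int, int, int]:
    """Convert a #RRGGBB string to an (r, g, b) tuple of 0-255 ints."""
    h = hex_color.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in range(0, 6, 2))
```

For example, `hex_to_rgb(REFERENCE_COLORS["Tangerine"]["hex"])` yields `(239, 131, 0)`.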
Direction A
Morphing Blob
A sticky, orange mass, inspired by the splashes from which the monster emerges. This slime-like form morphs from a single blob into all menus, dynamically separating and thus guiding the eye through the space.
Dynamic elements: the constant separation and reuniting of elements creates a fluid interface.
Guiding the eye: morphing into different states intuitively guides the eye through space.
Direction B
Exploding Buttons
A button that represents a simplified version of the monster. Inspired by the shooting sequence, the buttons explode on click and trigger a particle effect. A custom effect was created for the UI and later also used during the in-game shooting sequence.
Spatial feedback: particle and material effects enhance playfulness, adding a dynamic and fun element to the UI.
Direction C
Projected Hologram
Hologram projection: I explored various possibilities for projecting a menu using one of the in-game props, integrating it seamlessly into the gameplay environment.
User Flow
Onboarding
We explored various interactive systems and ultimately settled on a four-step onboarding flow. I additionally developed explanatory illustrations based on a model room to enhance the clarity of the explanatory text in the UI.
The room-scale application works with the user's self-scanned spatial data. To guarantee accessibility even for first-time users, we decided to implement a multi-step onboarding process, taking care to keep the number of required steps as low as possible.
Room base model
“Carefully position your play area”
“For your safety, please move to an obstacle-free area of at least 2m.
Keep the area inside the circle clear.”
“Let’s practice shooting”
“These guns will help you to fight enemies during the experience. Use your index finger to pull the trigger and shoot all targets.”
“Where should Mks dance?”
“Use the rabbit marker to locate the area where Mks will be dancing. Point to the ground or an elevated surface, then pinch to place.”
“Now place the MikaMika Door”
“Use the MikaMika Door marker to locate the area where Mikasa will enter the room. Point to a wall, then pinch to place.”
Flow Diagram
The onboarding elements, nested within the Main Menu, are displayed only during the first playthrough. To streamline the experience for returning players, we omit them in subsequent sessions, shortening the flow for frequent users.
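The gating logic for returning players can be sketched roughly as follows. The step names follow the four onboarding steps above; the function name, flag, and list structure are assumptions for illustration, not the production Unity implementation.

```python
# Sketch: assemble the session flow, skipping onboarding for returning players.
ONBOARDING_STEPS = [
    "Position your play area",
    "Practice shooting",
    "Place the dance area marker",
    "Place the MikaMika Door marker",
]

def build_session_flow(first_playthrough: bool) -> list[str]:
    """Onboarding is shown only on the first playthrough; returning
    players go straight from the Main Menu into the experience."""
    flow = ["Main Menu"]
    if first_playthrough:
        flow += ONBOARDING_STEPS  # nested within the Main Menu
    flow.append("Experience")
    return flow
```

A first run walks through all six entries; subsequent sessions collapse to just the Main Menu and the experience itself.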