Mixed Reality

Spatial UI System

A Mixed Reality (MR) interface system for Meta Quest featuring three-dimensional interactions and effects.

Objective

Create the first internal framework for Kodansha VR Lab to enable interfaces that leverage three-dimensional interactions and assets.

Client: Kodansha VR Lab

Scope

Create an engaging spatial UI that leverages Mixed Reality (MR) capabilities. Guide the engineering team to build the system from the ground up and implement it in production content.

Role: Concept, Prototyping, UI/UX, Visual Design

Outcome

A versatile UI system for integrating 3D assets and VFX, providing a cohesive interface aligned with the overall visual direction of the production project.

Starting Point

Current Extended Reality (XR) UIs

Established UIs often draw from traditional 2D designs, resulting in a "flat" appearance. Users in 3D environments interact with static panels, buttons, and menus that could exist on a conventional screen. This approach lacks immersion and intuitiveness.

Pain Points

Poor visual feedback

Interactions lack satisfying visual feedback, with no depth-based or dynamic response, leaving users feeling disconnected.

Constraints of interaction

Point-and-click mechanics do not allow natural, intuitive gestures like grabbing or pushing.

Limited spatial awareness

2D interfaces create a disconnection between the immersive experience and the UI, making it feel out of place and clunky.

Low engagement

A lack of playfulness works against the sense of discovery that characterizes experiential and exploratory content, particularly gaming.

Objectives

Immersive interactivity

Create an intuitive, immersive XR interface that leverages dynamic visual feedback, natural 3D gestures, and playful, exploratory interactions.

Visual coherence

Seamlessly integrate the UI into a format that leverages the Quest 3's spatial capabilities by blending the passthrough camera with a fully immersive environment, allowing characters to interact with the player and their space. The central theme revolves around defeating the monsters.

The experience utilized Meta Quest 3's spatial data for MR interaction

A monster bursts through the player's wall from an outside world during the experience

In the experience, bad-tempered monsters are augmented onto the player's room walls. The main character assists the player in defeating these monsters.

Interaction and Visual Design Ideation

With the interactivity centered around the monster character, I investigated different types of interactions through a full spectrum of iterations, from sketches to quick 3D models and prototypes in the real-time engine Unity.

Reference Color Scheme

Chinese Purple (#6A199E): Monster Body

Tangerine (#EF8300): Slime Color

Waterspout (#A0F2F1): Character & Props Highlight

Rich Lilac (#C26EC4): Prop Base Color

User Flow

Onboarding

We explored various interactive systems and ultimately settled on a four-step onboarding flow. I additionally developed explanatory illustrations based on a model room to enhance the clarity of the explanatory text in the UI.
The roomscale application works with the user's self-scanned spatial data. To guarantee accessibility even for first-time users, we decided to implement a multi-step onboarding process, taking care to keep the number of required steps as low as possible.

Room base model

Carefully position your play area

“For your safety, please move to an obstacle-free area of at least 2m.
Keep the area inside the circle clear.”

Let’s practice shooting

“These guns will help you to fight enemies during the experience. Use your index finger to pull the trigger and shoot all targets.”

Where should Mks dance?

“Use the rabbit marker to locate the area where Mks will be dancing. Point to the ground or an elevated surface, then pinch to place.”

Now place the MikaMika Door

“Use the MikaMika Door marker to locate the area where Mikasa will enter the room. Point to a wall, then pinch to place.”

Flow Diagram

The onboarding elements are displayed only during the first playthrough. Because they are nested within the Main Menu, they can be omitted in subsequent sessions, shortening the flow for returning players.
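The first-run gating described above can be sketched in an engine-agnostic way. This is a minimal illustration in Python, assuming a persisted flag (in the actual Unity project this would likely be a C# script using something like PlayerPrefs); the step names and function names are illustrative, not the production API.

```python
# Illustrative sketch: onboarding steps appear only on the first
# playthrough; returning players get a shortened flow.

ONBOARDING_STEPS = [
    "Position your play area",
    "Practice shooting",
    "Place the dance marker",
    "Place the MikaMika Door",
]

def build_session_flow(prefs: dict) -> list:
    """Return this session's menu flow, skipping onboarding on repeats."""
    flow = ["Main Menu"]
    if not prefs.get("onboarding_complete", False):
        flow += ONBOARDING_STEPS
        prefs["onboarding_complete"] = True  # persisted after first run
    flow.append("Start Experience")
    return flow
```

On the first call with empty preferences the flow contains all four onboarding steps; on subsequent calls it collapses to the Main Menu and the experience itself.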

Outcome

Components

The technical framework includes three asset types: button geometries, background/explanation images, and custom textures for UI particle effects.

Two main Unity scripts manage these elements: the Button script, which handles Quest hand tracking and triggers dissolves and particle effects on interaction, and the Menu script, which navigates between menus.
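The two-script split can be illustrated with a small engine-agnostic sketch. The production code is Unity C# driven by Quest hand tracking; the Python below only models the separation of concerns, and all class and method names are assumptions for illustration.

```python
# Sketch of the two responsibilities: a Button that plays its effects
# when pressed and then invokes a callback, and a Menu that tracks
# navigation between screens.

class Button:
    def __init__(self, label, on_press):
        self.label = label
        self.on_press = on_press
        self.effects_played = []

    def press(self):
        # In the real system, a hand-tracked press triggers a dissolve
        # shader and a particle burst before the navigation callback.
        self.effects_played += ["dissolve", "particles"]
        self.on_press()

class Menu:
    def __init__(self):
        self.stack = ["Main Menu"]

    def open(self, screen):
        self.stack.append(screen)

    def back(self):
        if len(self.stack) > 1:
            self.stack.pop()

    @property
    def current(self):
        return self.stack[-1]
```

Wiring a button's callback to the menu keeps visual feedback and navigation decoupled: `Button("Settings", lambda: menu.open("Settings"))` plays its effects and then pushes the Settings screen.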

Core Features

Custom Geometry Buttons

Enhancing user interaction by providing visually engaging, contextually relevant controls that seamlessly integrate into the 3D environment

Interactive Particle FX

Dynamic particle effects respond to user actions, creating a visually engaging and immersive experience within the 3D environment

Interactive Tutorial

The tutorial seamlessly integrates gameplay into the UI: users wield a gun to shoot monsters, blending the gameplay experience with the interface

Wrist Menu

Users access the in-game menu via an augmented wrist button, allowing for touchable controls. This integration enhances accessibility and streamlines interactions within the 3D environment.

Outlook

We created a spatial UI system with the core functions of integrating 3D geometries and reactive particle effects, providing a powerful toolkit for interactive user experiences. Its adaptability makes it ideal for internal use in future projects, where dynamic spatial interactions and immersive feedback are key to enhancing user engagement.
This toolkit has been adopted into our internal development pipeline, enabling the exploration of interactive, spatial UIs for future content.