MindScape AR – MVP Prototype Reflection (Task 3)

Project: MindScape AR – Campus Mental Health Support

Student: Wang Zilong (ID: 0361141)

Module: Experiential Design


1. Introduction: Designing for Emotional Safety

The core ambition of MindScape AR was never just to build an AR application, but to craft a "mental wellness experience" that prioritizes emotional safety over technical spectacle. For Task 3, my objective was to develop a Minimum Viable Product (MVP) that could validate whether a simple, AR-based meditation flow could effectively induce calm without overwhelming the user.


Unlike traditional games or utility apps, where "engagement" often means fast, high-frequency interaction, the goal here was the opposite: emotional clarity. The MVP needed to prove that augmented reality could support a "slow interaction pacing" that reduces cognitive load rather than adding to it.



2. Defining the MVP Scope: The "Less is More" Strategy

To ensure the prototype remained feasible within the Week 6-10 timeframe, I intentionally limited the scope to the core interaction loop. I realized that for a mental health tool, feature bloat could be detrimental. Therefore, the MVP focused exclusively on:


  1. Surface Detection: Reliable grounding of digital objects in physical space.

  2. Anchoring: Placing a visual "meditation anchor" (e.g., a glowing orb or tree).

  3. Input: Letting users externalize their feelings via an "Emotion Selection" interface.

  4. Feedback: Translating that input into a bio-rhythmic breathing exercise.

This constrained scope allowed me to focus on the quality of the silence and the motion, ensuring the prototype demonstrated "feasibility without overloading the system or the user".
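
To make that loop concrete, here is a minimal sketch of the four stages as a simple state machine in a Unity C# script. The MindScapeFlow class and its stage names are my own illustrative placeholders, not the actual project code.

```csharp
using UnityEngine;

// Illustrative sketch of the MVP's four-stage interaction loop.
// The class and stage names are placeholders, not the shipped code.
public class MindScapeFlow : MonoBehaviour
{
    enum Stage { DetectSurface, PlaceAnchor, SelectEmotion, Breathe }

    Stage stage = Stage.DetectSurface;

    // Each subsystem calls this when its step completes, e.g. a
    // surface hit-test callback or an emotion-picker button press.
    public void AdvanceStage()
    {
        if (stage != Stage.Breathe)   // Breathe is the terminal state
            stage = stage + 1;
        Debug.Log($"MindScape stage -> {stage}");
    }
}
```

Keeping the flow this linear was deliberate: there is exactly one thing for the user to do at each stage, which is the whole point of the "less is more" scope.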



3. Technical Journey & Challenges (Unity + Vuforia)

Developing the MVP in Unity using Vuforia presented specific technical hurdles that directly impacted the user experience.

The Ground Plane Challenge

The foundation of the experience relies on Vuforia's Ground Plane detection. Early testing revealed a critical weakness: the stability of the AR anchor was highly sensitive to environmental factors.


  • The Issue: In low-light environments (common for students studying late in dorms) or on reflective surfaces, tracking became unstable. This "jitter" broke the immersion, turning a calming moment into a frustrating troubleshooting exercise.

  • The Solution: Rather than just tweaking code, I approached this as a UX problem. I implemented clear on-screen instructional UI that guides users toward better lighting and angles (a rough sketch of the tracking-status check follows this list). This "guidance" became part of the ritual, slowing the user down before the meditation began.
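
As a rough illustration of the approach, the sketch below toggles a guidance panel based on the anchor's tracking status. It assumes Vuforia Engine 10's ObserverBehaviour callback; the TrackingGuidance class and guidancePanel object are hypothetical, and the real instructional UI would carry the actual lighting and angle hints.

```csharp
using UnityEngine;
using Vuforia;

// Sketch: show an instructional panel whenever tracking degrades.
// Assumes Vuforia Engine 10+'s ObserverBehaviour status callback;
// "guidancePanel" is a hypothetical UI GameObject holding the hints
// ("find a brighter spot", "avoid reflective desks", etc.).
public class TrackingGuidance : MonoBehaviour
{
    [SerializeField] ObserverBehaviour observer;   // the ground-plane anchor
    [SerializeField] GameObject guidancePanel;     // on-screen instructions

    void OnEnable()  => observer.OnTargetStatusChanged += HandleStatus;
    void OnDisable() => observer.OnTargetStatusChanged -= HandleStatus;

    void HandleStatus(ObserverBehaviour behaviour, TargetStatus status)
    {
        // TRACKED means a stable pose; anything weaker triggers guidance.
        bool stable = status.Status == Status.TRACKED;
        guidancePanel.SetActive(!stable);
    }
}
```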

The Synchronization Problem

Another challenge was ensuring that the breathing animations synchronized perfectly with the color transitions across different mobile devices. Performance drops caused timing inconsistencies, which ruined the rhythmic "inhale/exhale" flow.


  • The Fix: I replaced the complex animation logic with simplified animation curves and gradient-based color interpolation, as sketched below. This optimization not only reduced the performance overhead but also ensured a buttery-smooth visual transition that felt organic rather than mechanical.
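
To show the shape of that fix, here is a minimal sketch of a frame-rate-independent breathing loop driven by a single AnimationCurve and Gradient, assuming a glowing-orb anchor with a standard material. The BreathingAnchor class name, the four-second cycle, and the 0.8–1.2 scale range are illustrative placeholders, not the project's exact values.

```csharp
using UnityEngine;

// Sketch: frame-rate-independent breathing driven by one normalized clock.
// Scale (inhale/exhale) and color read the same 0..1 phase, so they
// cannot drift apart when performance dips. Values are illustrative.
public class BreathingAnchor : MonoBehaviour
{
    [SerializeField] float cycleSeconds = 4f;          // ~resting breath pace
    [SerializeField] AnimationCurve breathCurve =      // eased swell in/out
        AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);
    [SerializeField] Gradient calmGradient;            // e.g. green -> blue
    [SerializeField] Renderer orbRenderer;

    void Update()
    {
        // PingPong yields 0 -> 1 -> 0: one full inhale/exhale per cycle.
        float phase = Mathf.PingPong(Time.time * 2f / cycleSeconds, 1f);
        float swell = breathCurve.Evaluate(phase);

        transform.localScale = Vector3.one * Mathf.Lerp(0.8f, 1.2f, swell);
        orbRenderer.material.color = calmGradient.Evaluate(swell);
    }
}
```

Because the scale and the color both read the same normalized phase from the wall clock, rather than accumulating per-frame deltas, a dropped frame can never push the two out of sync.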


4. Design Psychology & Iteration

The most profound insights came from iterating on the "feel" of the application.

Designing for "Mei Ling" & "Daniel"

Keeping my user personas in mind—specifically Mei Ling (the overwhelmed achiever) and Daniel (the isolated international student)—guided my aesthetic choices.


  • Color Palette: I used green (#2D8659) for growth and healing and blue (#1E5A8E) for trust and stability. These weren't random choices; they were selected to subconsciously signal safety to users in high-stress states.

  • Pacing: Initial tests showed that my animation speeds were too fast; users found them distracting rather than soothing. I had to slow the animation curves significantly to match a resting human breath rate (roughly 12 to 20 breaths per minute, i.e., a cycle of three to five seconds), reinforcing the goal of "designing for emotional comfort rather than technical complexity".

From Complexity to Calm

The transition from a raw idea to the MVP taught me that subtlety is difficult. Translating a user's selected emotion (e.g., "Anxious" at intensity 8/10) into a visual response required careful tuning. The system adapts the breathing rhythm and ambient colors based on this input, creating a personalized loop that validates the user's feelings; a rough sketch of one possible mapping follows below.
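
For illustration only, here is a minimal sketch of how an intensity score might scale the breathing period and shift the palette, assuming a 1–10 intensity scale. The EmotionTuning class, the SetEmotion entry point, and the four-to-six-second range are my own placeholders rather than the shipped logic.

```csharp
using UnityEngine;

// Sketch: map the emotion-selection input onto the breathing exercise.
// All constants and the SetEmotion entry point are illustrative
// assumptions, not the project's actual tuning.
public class EmotionTuning : MonoBehaviour
{
    // Palette from the design section: growth/healing green, trust/stability blue.
    static readonly Color Growth = new Color32(0x2D, 0x86, 0x59, 0xFF); // #2D8659
    static readonly Color Trust  = new Color32(0x1E, 0x5A, 0x8E, 0xFF); // #1E5A8E

    // Values a breathing component (like the one sketched earlier) could read.
    public float CycleSeconds { get; private set; } = 4f;
    public Color AmbientTint  { get; private set; } = Growth;

    // Called by the emotion-selection UI, e.g. SetEmotion("Anxious", 8).
    public void SetEmotion(string label, int intensity)
    {
        float t = Mathf.Clamp01(intensity / 10f);

        // Higher reported intensity -> a longer, slower cycle to guide
        // the user down, and a palette shifted toward the stabilizing blue.
        CycleSeconds = Mathf.Lerp(4f, 6f, t);
        AmbientTint  = Color.Lerp(Growth, Trust, t);
    }
}
```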



5. Key Learning Outcomes & Future Direction

This project has been a pivotal lesson in MVP thinking—understanding that a prototype is not about "features finished" but about "hypotheses validated".


What I Learned:

  1. Environmental Empathy: I now understand how real-world conditions (lighting, surface texture) dictate the success of AR apps.

  2. Emotional UX: I learned that in wellness design, the interface must whisper, not shout. Every micro-interaction affects the user's sense of safety.

Future Roadmap:

Moving toward the final submission, I plan to:

  • Enhance Robustness: Further fine-tune Vuforia parameters to handle "messy" real-world student environments.

  • Deepen Personalization: Introduce more nuanced emotional states and varied breathing patterns to cater to different types of anxiety.

  • Refine Visuals: Polish the AR assets to increase immersion while strictly maintaining the minimalist, calming tone established in this MVP.


Project Links:

  • Figma Prototype: https://www.figma.com/make/vMGHB3gQWl79oNSgqcZZXh/AR-Meditation-Environment-Builder

  • Video Walkthrough: https://www.youtube.com/watch?v=WQ3zje_J_i0&feature=youtu.be

  • Unity/APK: https://drive.google.com/drive/folders/1dQtpOn0IN5j-WmB537c0UjPBA0YhQlrF?usp=sharing
