Aura app interface mockups showing emotionally adaptive meditation companion
UX Case Study · 2026

Aura

Emotionally Adaptive AI Meditation Companion

Meditation apps play the same session for every user — regardless of how they feel right now. Aura changes that, adapting its guidance, pacing, and voice in real time to the user's current emotional state.

AI Interaction Design · Voice UX · Behavioral Design · System Thinking
Role
UX / Product Designer — Solo Concept Project
Scope
Research, Interaction Architecture, AI Behavior Logic, Experience Design
Tools
Figma, AI-assisted research, System modeling
Constraints
Concept project — no live product, consent-first data model, non-clinical scope
58% · Abandon by Day 30
2% · Real-World Adherence
90s · Critical Exit Window
The Problem

Meditation apps speak.
They never listen.

Every major app plays the same pre-scripted session regardless of how you feel when you open it. Users don't quit because they lack discipline — they quit because within 60–90 seconds, the mismatch between the script and how they actually feel becomes obvious.

The Problem — interaction model comparison showing one-way delivery vs emotional state as input
Research Insights

Three findings that shaped
every decision.

139+ studies. The same finding every time — when the system adapts to you, you stay.

Research insights — 58% abandonment is systemic, personalization gap is real, adaptive systems show +25% engagement
System Model

How Aura decides what to do —
and remembers what worked.

9 states. 18 transitions. 3 recovery patterns. A responsive interaction loop that adapts pacing, tone, and silence based on user activation level.

System model — state flow diagram showing Idle through Welcome, Check-in, AI Reads You, Entry, Session, Close, and Reflect states
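
To make the loop concrete, here is a minimal TypeScript sketch of the state spine and one recovery pattern, using the state names from the diagram. The 90-second threshold mirrors the critical exit window above; the signal fields and guard logic are illustrative assumptions for the concept, not a specification of Aura's final behavior.

```typescript
// Minimal sketch of the Aura interaction loop, assuming the state names
// shown in the flow diagram. Guards and field names are placeholders.

type AuraState =
  | "Idle" | "Welcome" | "CheckIn" | "AIReadsYou"
  | "Entry" | "Session" | "Close" | "Reflect";

interface Signals {
  activation: number;       // 0–10 check-in scale (assumed)
  secondsInSession: number; // how long the current session has run
  userLeft: boolean;        // user backgrounded or closed the app
}

// Happy-path order of the loop; the diagram's remaining transitions are
// guards layered on top of this spine.
const spine: AuraState[] = [
  "Idle", "Welcome", "CheckIn", "AIReadsYou", "Entry", "Session", "Close", "Reflect",
];

function next(state: AuraState, s: Signals): AuraState {
  // Recovery pattern: an exit inside the critical 90-second window routes
  // to a gentle close instead of silently dropping back to Idle.
  if (state === "Session" && s.userLeft && s.secondsInSession < 90) {
    return "Close";
  }
  const i = spine.indexOf(state);
  return spine[(i + 1) % spine.length];
}
```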
Design Decisions

Five decisions.
Each one has a tradeoff.

Design decisions prioritized emotional attunement, adaptive pacing, and failure recovery.

Five key design decisions with tradeoffs — body-first voice prompts, designed silence, passive signal reporting, AI reasoning transparency, and failure state recovery
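
A short sketch of how three of these decisions (body-first prompts, designed silence, AI reasoning transparency) might surface in the session engine. The stress scale, pacing values, and copy below are placeholder assumptions, not final content.

```typescript
// Sketch only: how a check-in could shape the opening guidance step.

interface CheckIn {
  stress: number;          // 1–10 self-report (scale assumed from the mockups)
  bodySignals?: string[];  // optional passive signals the user consented to share
}

interface GuidanceStep {
  prompt: string;
  silenceSeconds: number;  // silence is an explicit, designed value, not dead air
  reasoning: string;       // surfaced to the user so the adaptation is never a black box
}

function buildOpening(checkIn: CheckIn): GuidanceStep {
  const high = checkIn.stress >= 8;
  return {
    prompt: high
      ? "Notice where your feet meet the floor."   // body-first prompt for high stress
      : "Settle in, and let your breath find its own pace.",
    silenceSeconds: high ? 5 : 15,                  // shorter silences when activation is high
    reasoning: high
      ? "You reported high stress, so we're starting with grounding and shorter pauses."
      : "You seem settled, so we're leaving more space between prompts.",
  };
}
```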
Process — Lo-Fi Wireframes

Structure first.
Content second.

Three critical interaction moments mapped before moving to visual design.

Lo-fi wireframes — Flow 1: Check-in, Flow 2: Session Screen (high stress mode), Flow 3: Return After Gap
Interface

Three moments where Aura
responds differently.

Check-in with AI reasoning. Ground-first mode for high stress. Gentle return after a break — no streak penalty.

High-fidelity interface mockups — Check-in + AI Reasoning, Ground First mode for stress 8-10, Return After a Break with past session memory
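
The return-after-a-break moment can be sketched the same way: the greeting draws on past session memory instead of a streak counter. Field names and wording here are hypothetical.

```typescript
// Sketch of the "return after a break" greeting. No streak framing, no guilt.

interface SessionMemory {
  lastSessionDate: Date;
  lastThingThatWorked: string; // e.g. "the longer exhale", saved during Reflect
}

function returnGreeting(memory: SessionMemory, today: Date): string {
  const daysAway = Math.floor(
    (today.getTime() - memory.lastSessionDate.getTime()) / 86_400_000
  );
  if (daysAway <= 1) {
    return "Welcome back.";
  }
  // Acknowledge the gap, then recall what helped last time.
  return `Good to see you again. Last time, ${memory.lastThingThatWorked} helped. Want to start there?`;
}
```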
Metrics

Hypotheses,
not promises.

Every design decision maps to a measurable outcome. This is how I'd validate the concept.

Metrics framework — early exit rate target under 15%, 30-day retention target over 25% (7x industry baseline), session completion, emotional shift, and silence comfort metrics
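
One way to keep these hypotheses honest is to treat the framework as data, pairing each decision with its metric and target. The targets below repeat the figures named above; the decision-to-metric pairings are my assumptions about which lever moves which number.

```typescript
// Validation framework as data: one row per hypothesis.

interface Hypothesis {
  decision: string;  // design decision being tested
  metric: string;    // observable outcome
  target: string;    // pass/fail threshold, where one is stated
}

const hypotheses: Hypothesis[] = [
  { decision: "Body-first voice prompts",     metric: "Early exit rate (first 90s)",   target: "< 15%" },
  { decision: "Adaptive check-in and pacing", metric: "30-day retention",              target: "> 25% (7x industry baseline)" },
  { decision: "Failure-state recovery",       metric: "Session completion",            target: "to be benchmarked" },
  { decision: "AI reasoning transparency",    metric: "Self-reported emotional shift", target: "to be benchmarked" },
  { decision: "Designed silence",             metric: "Silence comfort",               target: "to be benchmarked" },
];
```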
Reflection

The insight that changed
the project.

This project explores the shift from static UX flows to adaptive AI interaction systems.

Reflection — The gap between visual designer and product designer is a thinking gap, not a tools gap
"The gap between visual designer and product designer is a thinking gap — not a tools gap."
— Abiola, Verum Artifex Studios

Interested in working
together?

Start with a discovery call or reach out directly.

Book a Discovery Call