Neurofeedback Immersive Meditation for Hyatt Hotel
4.5★ pilot-tested immersive meditation app designed for Hyatt Hotels.

Hyatt was reimagining luxury wellness, not as something fixed, but as something responsive and personal. They weren’t just asking how to deliver calm, but how to listen to it. In collaboration with Hyatt teams in Shanghai and Hong Kong, I designed a neuro-responsive meditation experience driven by real-time EEG brainwave data. As the system senses states like anxiety, focus, or fatigue, it gently adapts, shifting pace, rhythm, and feedback in real time. The result wasn’t a meditation track. It was a quiet, responsive interaction, designed to move with the guest, not simply play at them.
Client & Duration

Hyatt Hotel (Shanghai & Hong Kong) | Nov 2024 – Apr 2025 (6 months)
Scheduled expansion to Hyatt Seoul and Tokyo in 2025.

Type

Freelance work, Immersive AI-powered meditation service,
Biometric interactive experience, Interaction Design

Team

1 Product Designer, 1 Director, 2 Developers (H5 / iPadOS),
3 Product Developers, 2 Marketers
What I Did

As a Product Designer,
  1. Led end-to-end product design for an AI-driven, brain-responsive interactive user experience.
  2. Designed interactive onboarding flows tailored to each guest’s cognitive state.
  3. Visualized brainwave data through intuitive, generative graphics.
  4. Iteratively tested and refined the interaction design across EEG hardware, AI-driven behaviors, and responsive UI.
  5. Aligned with Hyatt’s design language to ensure brand consistency and tone.

CHALLENGE


Static spaces but dynamic minds. The meditation disconnect

Meditation has always been personal. Yet in luxury hotels worldwide, wellness spaces all looked remarkably similar: pristine rooms with ambient music, dimmed lighting, and guided meditations played on loop. These experiences were beautifully designed but completely static. This one-way experience created a fundamental disconnect between the guest’s internal journey and their external surroundings. Hyatt recognized this disconnect, too. As we observed guests in traditional meditation spaces, the challenge crystallized: “Can we design calmness, not as a preset, but as a dialogue?” The true challenge wasn’t technical implementation, but creating something that felt natural, a system that doesn’t just play a track to guests, but plays with them.

APPROACH & SOLUTION


Designing the meditation experience as a real-time conversation with guests

Stillness is a feeling. But to design it, we needed data, rhythm, and empathy. I started by 
asking how the mind actually shifts during meditation. Using EEG sensors, we listened to real-time brain activity and mapped those invisible changes into visual, sonic, and environmental shifts.

The result was a meditation flow that adapts as the session unfolds. Visuals deepen when focus rises, pacing slows when stress peaks, and AI modulates each session to meet the guest's current cognitive state. Rather than creating a single meditation sequence, I designed a responsive system, one that senses the mind and choreographs the experience accordingly.
1) iPadOS Version: A premium in-room experience available in Hyatt’s suite rooms.
It integrates with the EEG headset, allowing real-time brainwave input to personalize the meditation 
based on each guest’s cognitive state.

2) H5 Version: Designed for broader accessibility across Hyatt’s global wellness centers.
Guests scan a QR code to access the experience on any mobile device, with a fully responsive layout across smartphone types.
It offers curated meditation sessions selected by AI based on the guest’s chosen mindset.
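The adaptive behavior described above, visuals deepening with focus and pacing slowing with stress, can be sketched in a few lines. This is a hypothetical illustration, not the production iPadOS/H5 code; the parameter names, ranges, and coefficients are my assumptions:

```python
from dataclasses import dataclass

@dataclass
class SessionParams:
    pace: float          # breathing-cue tempo, cycles per minute (assumed unit)
    visual_depth: float  # 0.0 (flat) .. 1.0 (deeply immersive)

def adapt(stress: float, focus: float) -> SessionParams:
    """Map normalized EEG-derived scores (0..1) to session parameters.

    Hypothetical rules mirroring the case study: pacing slows as
    stress peaks, visuals deepen as focus rises.
    """
    pace = 6.0 - 3.0 * stress           # slow from 6 down to 3 cycles/min
    visual_depth = 0.3 + 0.7 * focus    # deepen with rising focus
    return SessionParams(pace=pace, visual_depth=visual_depth)
```

A continuous loop would re-run a mapping like this on each EEG reading, so the session drifts with the guest rather than following a fixed script.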


IDEATION & ITERATION


1. Designing two flows. One for deep immersion, one for global access

A. EEG-integrated flow  (iPadOS · Hyatt suite rooms)

For the EEG-based version, the challenge was to deliver real-time feedback without disrupting the very stillness we aimed to design. I explored ways to translate invisible brain signals into a UI that feels ambient, not analytical. These early wireframes mapped onboarding, signal syncing, and meditation paths that adapt in real time. Throughout, I focused on trust, clarity, and softness, so the tech would feel like presence, not intrusion.
Early iteration: Signal syncing interface and onboarding layout
Refined iteration: Ambient visuals and confidence-based recommendation flow
B. QR-based mobile flow  (H5 · Global wellness centers)

This version was designed for broader accessibility, no EEG headset required. I approached it with mobile-first UX principles, ensuring a smooth and lightweight flow for users scanning QR codes throughout Hyatt’s wellness spaces. Instead of brainwaves, users select their current mindset, and AI curates a session to match. The focus was speed, calm, and a sense of agency, no tech friction, just entry into mindfulness.
Early iteration: Wireframe for mindset selection
Refined iteration: Wireframe for AI curation based on mindset selection
2. Brainwave visualizer: a visual language for the mind

Beyond UX, I sketched a visualizer to reflect each brainwave state, not to chart all five brainwave bands, but to let guests feel them. Calm, I reasoned, is not something you read. So instead of thin analytic lines, I sketched with shifting colors and motion: each wave translated into a pulse of light, a shift in hue, a rhythm in space. I mapped how color would move, change, and pulse, creating a system where mental states could be sensed intuitively, like mood lighting for the mind. It was about making the invisible visible, and designing trust through feeling.

Initial sketch for turning brainwave data into feeling with flowing color and movement
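As a rough sketch of that mapping idea, assuming normalized band powers and an HSV color model (neither of which comes from the project itself, they are placeholders for illustration), the “mood lighting for the mind” concept might look like:

```python
import colorsys

def band_to_visual(alpha: float, beta: float, theta: float) -> dict:
    """Translate relative band powers (each 0..1) into an ambient visual.

    Illustrative only: a calm (alpha-dominant) mind drifts toward cool
    blue hues with a slow pulse; an alert (beta-dominant) mind toward
    warmer hues with a faster pulse.
    """
    total = (alpha + beta + theta) or 1.0   # avoid division by zero
    b = beta / total                        # share of alert activity
    hue = 0.6 - 0.5 * b                     # 0.6 (blue) toward 0.1 (warm)
    pulse_hz = 0.2 + 1.0 * b                # faster pulsing with arousal
    r, g, bl = colorsys.hsv_to_rgb(hue, 0.5, 0.9)
    return {"rgb": (round(r, 2), round(g, 2), round(bl, 2)),
            "pulse_hz": round(pulse_hz, 2)}
```

The point of a mapping like this is that guests never see a number, only color and rhythm changing with them.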

KEY FEATURES

Real-time EEG & AI integration for personalized meditation
A. Real-time brainwave analysis by AI with interactive UI 

Before each meditation session, the system conducts a brainwave assessment using the EEG headset. Based on the live data, AI generates a short explanation describing the user’s current cognitive state, such as “mentally fatigued” or “deeply focused,” and recommends a personalized session to restore balance.
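A minimal sketch of how live scores could map to such labels and recommendations. The two state labels come from the case study; the scoring inputs, thresholds, and session names are hypothetical stand-ins for the AI’s output:

```python
def assess(fatigue: float, focus: float) -> tuple[str, str]:
    """Return a (state reflection, recommended session) pair from
    normalized EEG-derived scores (0..1).

    Thresholds and session names are illustrative. The language is
    phrased as reflection ("you seem"), not diagnosis.
    """
    if fatigue > 0.7:
        return ("You seem mentally fatigued", "Restorative breathing")
    if focus > 0.7:
        return ("You seem deeply focused", "Flow-state soundscape")
    return ("Your mind seems in transition", "Gentle grounding")
```

In the real flow this classification feeds the warm, non-judgmental copy described below the results screen.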
1) Headset connection interface
I designed the syncing screen to feel calm and deliberate. The centered headphone icon offers a quiet sense of stability,
anchoring the user before anything begins. Feedback is soft but clear, building trust at the very first touchpoint.
2) Sensing process interface with real-time visualizer
During the sensing phase, I replaced metrics with motion. So, the background visual responds to brain activity in real-time.
 This visualizer wasn’t meant to inform, but to feel like presence, turning invisible neural rhythms into an ambient layer of self-awareness.
3) Assessment of personal results & mindset selection interface
I treated the AI’s feedback not as a system output, but as a personal reflection. Language was tuned for clarity and warmth, designed to feel like it’s noticing, not judging.
The layout gently guides without pressure, allowing users to arrive at their state rather than declare it.
B. AI-curated sessions based on user-selected mindset

In this version, AI plays the role of a contextual curator, guiding users based on their selected mindset rather than biometric input. After scanning a QR code placed throughout Hyatt’s wellness spaces, users are asked to select their current mental state (e.g., excited, relaxed, stressed, tired). The experience is lightweight and accessible, optimized for any mobile device.
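The mindset-to-session step can be sketched as a simple lookup. The four mindsets come from the flow above; the session names and fallback are my placeholders, standing in for what the live service resolves with AI curation:

```python
# Hypothetical session catalog; the real service curates these with AI.
SESSIONS = {
    "excited": "Cooling-down body scan",
    "relaxed": "Deepening stillness",
    "stressed": "Slow-breath release",
    "tired": "Restorative rest",
}

def curate(mindset: str) -> str:
    """Return a curated session for a self-selected mindset,
    with a gentle default for anything unrecognized."""
    return SESSIONS.get(mindset.strip().lower(), "Gentle grounding")
```

Keeping the entry point this small is what makes the QR flow feel frictionless: one tap, one transition, and the session begins.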
1) Smooth transition interfaces between AI’s curated response and the selection
Each choice triggers a softly animated transition into a personalized session. The interface is light, and the UI feels fluid and emotionally expressive.
Designed mobile-first, maintaining clarity, responsiveness, and a sense of calm at every touchpoint.
FINAL DESIGN


The final design brings together real-time biometrics with visual systems designed to mirror mental state across both platforms.

My goal was simple: make it feel natural enough to fade into the flow, and thoughtful enough to matter. Every detail was shaped with care to reduce friction, avoid noise, and create a space that feels genuinely usable, not overwhelming, not clinical. From syncing the EEG headset to selecting a mindset, each screen was crafted to support focus and trust, especially in moments when users might feel distracted or emotionally vulnerable. I didn’t just design screens. I designed transitions, tone, and meditation rhythm so that the experience felt supportive and easy to stay with.
iPadOS Final UI - AI generated result screen based on EEG analysis
H5 Final UI - AI curated sessions based on user-selected mindset





WHAT I LEARNED

I felt designing for the mind is super delicate. Even small things, like the tone of a sentence or the rhythm of a transition, can affect how safe a user feels. One tester said, “I don’t want an app telling me I’m anxious.” That made me rethink how technology should speak. It wasn’t just about being accurate, but it was about being respectful. I learned that clarity isn’t just aesthetic, it’s how immersed in emotion. And with AI, it’s not about showing intelligence. It’s about knowing when to step back. So the system worked best when it felt more like a suggestion and gentle guide than a decision.