
MUSO - An XR creative music synthesizer that turns everyday objects into instruments
Category:
Mixed Reality / AR/VR / Unity Development / Gesture Control
Team:
Hattie Li, Zoe Zhang, Monica Zhang
Timeline:
Mar-Apr 2023
Tool:
Figma · Unity · Visual Studio Code
My Role:
Interaction Design · XR prototyping · User Testing · Character Design
Muso Key Features
MUSO's music-creation experience can be broken down into four steps:
Design Process - Ideation
How did I come up with this idea?

What stops you from learning music?
My first piano lesson at six felt cold and distant—both the keys and the sound. Years later, I realized many friends had felt the same.
User Research - Interview
Starting from my personal experience, I conducted user interviews and found that this is a widespread phenomenon. Through the interviews, I wanted to identify the other factors that can become barriers to learning music theory.
How might we make music learning fun, accessible, and creative?
A scenario came to mind immediately: a kid who has always dreamed of being a rock star picks up a broom, pretends it is a guitar, and imagines himself on stage. From this scenario, MUSO was born.
Prototyping
I built the demo using Oculus Quest 2 and Unity. While VR unlocks incredible creative freedom, it can feel disconnected from the real world. So we used Passthrough to blend digital UI with real objects—letting users interact through touch, while staying immersed in a new sensory layer.
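As a reference for this setup, here is a minimal sketch of how Passthrough can be enabled in a Unity scene with the Oculus Integration SDK. The OVRPassthroughLayer component is part of that SDK; the wrapper script itself is illustrative, not our shipped code.

using UnityEngine;

// Minimal sketch: enable Quest Passthrough so digital UI renders on top of
// the real environment. Assumes an OVRCameraRig in the scene and Insight
// Passthrough enabled in the OVRManager settings.
public class PassthroughSetup : MonoBehaviour
{
    void Start()
    {
        // Add a Passthrough layer that composites the camera feed
        // behind all virtual content.
        var layer = gameObject.AddComponent<OVRPassthroughLayer>();
        layer.overlayType = OVROverlay.OverlayType.Underlay;
    }
}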


Micro-Interactions
MUSO encourages discovery through play: users are invited to explore their surroundings and spot the basic geometric primitives, such as squares, circles, and pill forms, hidden in everyday objects.
Real-world objects differ in shape, color, and the ways we interact with them. MUSO converts each object into a different musical instrument according to its shape and size.
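To illustrate this mapping, a small Unity component could assign an instrument per shape and scale pitch by object size. The shape categories and clip names below are hypothetical stand-ins, not the project's actual code.

using UnityEngine;

// Illustrative sketch: pick an instrument sound from the detected shape,
// then scale pitch by size so bigger objects sound lower, like real instruments.
public enum ShapeType { Square, Circle, Pill }

public class InstrumentMapper : MonoBehaviour
{
    // Assumed clips, assigned in the Inspector.
    public AudioClip drumClip, bellClip, shakerClip;

    public void Assign(ShapeType shape, float size, AudioSource source)
    {
        switch (shape)
        {
            case ShapeType.Square: source.clip = drumClip;   break;
            case ShapeType.Circle: source.clip = bellClip;   break;
            case ShapeType.Pill:   source.clip = shakerClip; break;
        }
        // Larger size lowers the pitch; clamp keeps it in a musical range.
        source.pitch = Mathf.Clamp(1.5f - size, 0.5f, 1.5f);
    }
}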



Validation and Iteration
To validate our design, we tested the demo with 8 users from diverse backgrounds, including musicians and creatives. Their feedback helped us uncover several key issues in our demo.
🎵 Challenge 1
How can we make shapes trigger sound like real instruments?
To simulate realistic interaction, we mapped the object's acceleration to the sound trigger mechanism. For example, when a user shakes the ellipse shape, the program captures its acceleration and plays a sound accordingly.
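A minimal Unity sketch of this trigger: estimate the tracked object's acceleration from frame-to-frame velocity changes and play a sound when it crosses a threshold. The threshold and volume scaling here are hand-tuned assumptions.

using UnityEngine;

// Sketch of the acceleration-based sound trigger described above.
[RequireComponent(typeof(AudioSource))]
public class ShakeTrigger : MonoBehaviour
{
    public float accelerationThreshold = 8f; // m/s^2, tuned by hand (assumption)

    AudioSource audioSource;
    Vector3 lastPosition, lastVelocity;

    void Start()
    {
        audioSource = GetComponent<AudioSource>();
        lastPosition = transform.position;
    }

    void Update()
    {
        // Approximate velocity and acceleration from the tracked transform.
        Vector3 velocity = (transform.position - lastPosition) / Time.deltaTime;
        Vector3 acceleration = (velocity - lastVelocity) / Time.deltaTime;

        // Harder shakes play louder sounds, approximating a real shaker.
        if (acceleration.magnitude > accelerationThreshold && !audioSource.isPlaying)
        {
            audioSource.volume = Mathf.Clamp01(acceleration.magnitude / 20f);
            audioSource.Play();
        }

        lastPosition = transform.position;
        lastVelocity = velocity;
    }
}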


✨ Challenge 2
How can we add responsive visual effects for each user interaction?
We designed and coded dynamic visual feedback for each shape. For circles, every touch triggers alternating pink and blue ripples that grow and fade out, giving each interaction animated feedback.
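One way to implement such a ripple in Unity is sketched below, assuming a hypothetical ripple prefab with a SpriteRenderer; a coroutine drives the growth and fade.

using System.Collections;
using UnityEngine;

// Sketch: each touch spawns a ripple that grows and fades out,
// alternating between pink and blue.
public class RippleFeedback : MonoBehaviour
{
    public SpriteRenderer ripplePrefab; // assumed prefab (assumption)
    public float duration = 0.6f;

    bool usePink = true;

    public void OnTouched(Vector3 touchPoint)
    {
        var ripple = Instantiate(ripplePrefab, touchPoint, Quaternion.identity);
        ripple.color = usePink ? new Color(1f, 0.4f, 0.7f)   // pink
                               : new Color(0.4f, 0.6f, 1f);  // blue
        usePink = !usePink; // alternate colors on each touch
        StartCoroutine(GrowAndFade(ripple));
    }

    IEnumerator GrowAndFade(SpriteRenderer ripple)
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            float k = t / duration;
            ripple.transform.localScale = Vector3.one * Mathf.Lerp(0.1f, 1f, k); // grow
            var c = ripple.color;
            c.a = 1f - k; // fade out
            ripple.color = c;
            yield return null;
        }
        Destroy(ripple.gameObject);
    }
}
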
👋 Challenge 3
How can we map different gestures to one shape?
By integrating the Oculus SDK, we tracked finger movements and defined multiple gestures. Each gesture is linked to a distinct sound, allowing one shape to support diverse, meaningful interactions.
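As a sketch of this gesture-to-sound mapping, the Oculus Integration's OVRHand component exposes per-finger pinch detection; the clip assignments below are illustrative.

using UnityEngine;

// Sketch: map several hand gestures to distinct sounds on one shape
// using the Oculus hand-tracking API (OVRHand).
public class GestureSounds : MonoBehaviour
{
    public OVRHand hand;                 // hand-tracking component from the OVRCameraRig
    public AudioClip indexPinchClip, middlePinchClip; // assumed clips
    public AudioSource source;

    bool wasPinchingIndex, wasPinchingMiddle;

    void Update()
    {
        bool indexPinch  = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        bool middlePinch = hand.GetFingerIsPinching(OVRHand.HandFinger.Middle);

        // Trigger on the rising edge so each pinch plays its sound once.
        if (indexPinch && !wasPinchingIndex)   source.PlayOneShot(indexPinchClip);
        if (middlePinch && !wasPinchingMiddle) source.PlayOneShot(middlePinchClip);

        wasPinchingIndex  = indexPinch;
        wasPinchingMiddle = middlePinch;
    }
}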

Character Design
Awards & Recognitions