Perilous Passage
2025
Made in Unity
This game was inspired by Philippe Petit's tightrope walk between the Twin Towers. The two key focuses of this project were…
1 ) Designing an environment/world that implicitly taught the player the rules and goals of the game.
To do this, I leveraged the player's knowledge of how the real world works, designing a world in which that knowledge would lead to correct assumptions about how my virtual world works.
For example, to teach the player that falling off is bad, I made the towers very tall and even added lava at the bottom. In real life, a fall like that would mean certain death. Since my virtual world resembles reality—even if it is a little absurdist—it is natural to assume that in this world, a cat could not survive a fall from a skyscraper into a pit of lava.
To teach the player about the wind, I added a wider platform at the beginning so they could observe its consequences without being punished.
To give the player motivation and a goal to cross the gap, I put an escape helicopter at the end and added some policemen chasing them to get them moving.
2 ) Capturing the difficulty and precision of the real-world act of walking a tightrope.
Walking a tightrope is a complex, full-body task, so I had to stylize the act of tightrope walking while still maintaining some degree of difficulty and precision. I leaned into the intense focus and quick reaction time required to maintain balance, implementing a wind feature that the player must constantly account for and react to, or else they fall off. I chose a simple WASD control scheme so that the movement itself was as intuitive as possible; that wasn't supposed to be the difficult part.
You play as a cat carrying stolen money, trying to escape from the cops. You need to cross this deadly chasm to reach the escape helicopter. It would be a trivial task if it were not for the violent winds that zip through the world, trying to fling you from the bridge and into certain death below.
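Below is a minimal sketch of how a wind mechanic like the one described above might be implemented in Unity. It is illustrative only; the class, field names, and force values are my assumptions, not the project's actual code.

using UnityEngine;

// Illustrative sketch: a gusting sideways wind the player must counteract with WASD.
// All names and values here are assumptions, not the project's actual implementation.
public class WindGust : MonoBehaviour
{
    public Rigidbody playerBody;          // the cat's rigidbody
    public float maxWindForce = 6f;       // peak sideways push
    public float gustChangeInterval = 2f; // how often the gust changes

    private float currentWind;
    private float timer;

    void FixedUpdate()
    {
        timer -= Time.fixedDeltaTime;
        if (timer <= 0f)
        {
            // Pick a new gust strength and direction the player has to react to.
            currentWind = Random.Range(-maxWindForce, maxWindForce);
            timer = gustChangeInterval;
        }

        // Constant sideways push; the player leans against it to stay on the rope.
        playerBody.AddForce(Vector3.right * currentWind, ForceMode.Acceleration);
    }
}

The key design point is that the gust changes faster than the player can relax, which forces the continuous attention and correction described above.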
Kitten Around
2025
Made in Unity for Oculus VR
What can a VR game do that a traditional video game cannot?
My answer: physically embody a virtual character, perhaps… a cat?
I designed the gameplay scenario to encourage the player to play like a cat. You bat around a yarn ball that moves frantically, demanding your full attention (dilated pupils and all)! When watching people play the game, I noticed they stood more alert, knees slightly bent as if they were ready to start sprinting. Their movements became snappy and precise, hopping from side to side, arms jutting out to catch a yarn ball on the run. It was not only a joy to play but a joy to watch people behave like cats, and the game quickly attracted a crowd of spectators at our showcase. People took turns, tried to beat each other’s high scores, and began to “ooh” and “ahh” at every game over and every new high score.
Rules: Your goal is to get a high score. To score a point, the ball must be in the correctly colored quadrant when the timer hits zero. However, a strong wind is constantly trying to push your ball in various directions. If the ball is not in the quadrant when the timer hits zero, it bounces upward. If the ball falls off the table, it's game over.
(In addition, if you try to hold the ball in place with both paws, it will slip and fly high up into the air. So be careful!)
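A rough sketch of how the scoring check described in the rules above could work in Unity. The names, timings, and penalty force are assumptions for illustration, not the game's actual code.

using UnityEngine;

// Illustrative sketch: when the round timer hits zero, award a point only if the
// ball sits in the target quadrant; otherwise bounce it upward as a penalty.
public class QuadrantScorer : MonoBehaviour
{
    public Rigidbody ball;
    public Transform tableCenter;
    public float roundLength = 5f;   // seconds per round (assumed value)
    public int targetQuadrant;       // 0..3, shown to the player by color
    public int score;

    private float timer;

    void Start()
    {
        timer = roundLength;
    }

    void Update()
    {
        timer -= Time.deltaTime;
        if (timer > 0f) return;

        // Which quadrant of the table is the ball in right now?
        Vector3 offset = ball.position - tableCenter.position;
        int quadrant = (offset.x >= 0f ? 0 : 1) + (offset.z >= 0f ? 0 : 2);

        if (quadrant == targetQuadrant)
            score++;                                             // point scored
        else
            ball.AddForce(Vector3.up * 5f, ForceMode.Impulse);   // penalty bounce

        targetQuadrant = Random.Range(0, 4);                     // next colored quadrant
        timer = roundLength;
    }
}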
Immersive Worldmaking
2025
Made in Unity using Steam Audio
I designed a world that I’d want to live in: a world where humans could breathe underwater.
I’ve always loved being underwater. It’s cool and calming. The sounds of the outside world fade away, becoming muffled and distant, giving way to a kind of peace that is hard to come by in a large city (where I grew up). Senses become dulled, and attention turns inward. I tried to capture this underwater escape in this soundscape.
Using Steam Audio
The features that mattered most for this project were:
1 ) The Doppler effect, for the moving cars.
2 ) Sound-source directionality. For example, the duck is actually made of two sounds; the quacking fires upwards, and the splashing fires downwards.
3 ) Occlusion. This prevented the car sounds from reaching the audio listener while the listener was underwater. In addition, when you go underneath the bridge, the footsteps become a little muffled and quieter because they are transmitting through the wood.
Note: The underwater muffle effect is not made using Steam Audio, because I wanted sounds generated inside the water to be muffled as well, not just sounds transmitted through the surface of the water.
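One common way to get a listener-wide muffle like this in Unity is to drive an AudioLowPassFilter on the AudioListener whenever the listener is below the water surface. The sketch below shows that approach; the component setup, cutoff values, and water-height check are my assumptions, not necessarily how this project does it.

using UnityEngine;

// Illustrative sketch: muffle everything the listener hears while underwater by
// lowering the cutoff of a low-pass filter attached to the AudioListener object.
[RequireComponent(typeof(AudioLowPassFilter))]
public class UnderwaterMuffle : MonoBehaviour
{
    public float waterSurfaceY = 0f;       // world-space height of the water plane (assumed)
    public float underwaterCutoff = 600f;  // Hz, heavily muffled
    public float normalCutoff = 22000f;    // effectively no filtering

    private AudioLowPassFilter lowPass;

    void Start()
    {
        lowPass = GetComponent<AudioLowPassFilter>();
    }

    void Update()
    {
        bool underwater = transform.position.y < waterSurfaceY;
        lowPass.cutoffFrequency = underwater ? underwaterCutoff : normalCutoff;
    }
}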
Immersive Production
2025
Made using Unity & Reaper
We made an immersive 180° video of the performance of “Special Someone,” a duet for French horn and bassoon. When viewing, you have control over the volume of each source. You can also activate a stream of particles from each hand to mimic conducting.
Production
We used a first-order ambisonic mic to capture the room and added spot mics to each instrument so that we could later blend in some of the close, detailed sound as point sources in Unity.
Post-Production
My main role in the project was editing and mixing the raw mono and ambisonic audio so it was ready to be implemented in the Unity project. I used dearVR’s AMBI MICRO plugin so that I could monitor the ambisonic recordings binaurally while mixing in Reaper, and then again once I was done, to export the audio in B-Format for Unity.
Made with Eva Choi, Jiawen Mao, Zengdong Peng, and Ashton Touzeau.
Ella Hebrard - Bassoon
Lily Sears - French Horn
VIME
2025
Made in Unity for Oculus VR
Have you ever wanted to craft your own soothing, three-dimensional, interactive ambience?
We turned a city park scene into an interactive musical instrument. The player can position various rainclouds around the scene to control which objects they rain on. Depending on the material of the object (stone, concrete, or a bush), each raindrop makes a different sound on impact. In this way, you can design your own custom rain ambience.
In addition, you can control the wind direction of the rain to create real-time variation.
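A minimal sketch of how the material-dependent droplet sounds could be wired up in Unity. The tags, clip fields, and general structure are assumptions for illustration, not our actual implementation.

using UnityEngine;

// Illustrative sketch: on impact, the droplet looks up a clip by the surface's tag
// and plays it at the contact point, then destroys itself.
public class RainDroplet : MonoBehaviour
{
    public AudioClip stoneClip;
    public AudioClip concreteClip;
    public AudioClip bushClip;
    public AudioClip defaultClip;

    void OnCollisionEnter(Collision collision)
    {
        AudioClip clip = collision.collider.tag switch
        {
            "Stone"    => stoneClip,
            "Concrete" => concreteClip,
            "Bush"     => bushClip,
            _          => defaultClip,
        };

        // One-shot at the impact point, then remove the droplet.
        AudioSource.PlayClipAtPoint(clip, collision.GetContact(0).point);
        Destroy(gameObject);
    }
}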
Made with Eva Choi and Zengdong Peng
My role was sound design. I recorded the raindrop sounds by dripping water onto different materials in my apartment.
Evaluation
The idea is cool, and I could imagine more sandbox games leaning into soundscape customization. In Minecraft, for example, instead of every waterfall producing the same running-water sound, the sound could scale with the waterfall’s height and size. If you were designing a home, you could have a small water feature that sounds like a trickling pond, while a massive fortress could have an imposing waterfall that overwhelms villains when they try to invade.
But our execution was flawed. Simulating each individual raindrop did not necessarily add up to a convincing rain-like soundscape (not to mention the insane number of voices we had to use, haha). I think a more promising approach would be to blend between pre-recorded rain sounds (drizzle -> light -> normal -> downpour) based on an intensity parameter, or simply on how many clouds end up in close proximity.
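A sketch of the blending approach I describe above: a few pre-recorded rain loops playing in parallel, with their volumes crossfaded by a single intensity parameter (which could be derived from how many clouds are nearby). Names and values are assumptions.

using UnityEngine;

// Illustrative sketch: crossfade between looping rain layers
// (drizzle -> light -> normal -> downpour) based on a 0..1 intensity value.
public class RainIntensityBlend : MonoBehaviour
{
    public AudioSource[] rainLayers;        // ordered: drizzle, light, normal, downpour
    [Range(0f, 1f)] public float intensity;

    void Update()
    {
        // Map intensity onto the layer array and crossfade adjacent loops.
        float scaled = intensity * (rainLayers.Length - 1);
        for (int i = 0; i < rainLayers.Length; i++)
        {
            // Full volume when intensity sits on this layer, fading out one layer away.
            rainLayers[i].volume = Mathf.Clamp01(1f - Mathf.Abs(scaled - i));
        }
    }
}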