Ray, in its current form, is a game prototype made in the span of 3 months. An official demo is being worked on as we speak, with the goal of bringing this game to its full potential!
If you wish to support this project or simply give feedback, do not hesitate to contact our team at contact.raygame@gmail.com!
Let's take a closer look at the sound design of this project, from conception to integration.

Ray was made in Unreal Engine 5.5 by a team of around 10 people. I worked alone on the sound design and music composition, but I had the precious help of my team's feedback and ideas.
The entire sound creation process was done in Ableton Live 11, using my Zoom H5 recorder and my MV7 mic for material when needed. For the integration, everything happened from within the game engine, with GitHub as our collaboration medium.
The video below is a collection of examples of the way Ray's sound design behaves in the game engine.
From the beginning of the game's production, I kept track of my ideas and progress in my moodboards and sound lists. Those documents, of course, also served as a communication tool, where my teammates could leave comments and feedback in dedicated spaces, or use them as a naming and classification reference in work meetings.
Being able to discuss not only a sound's quality, but also its use, its length, its metadata, and its in-game behaviour with a common vocabulary facilitated communication between the production team, the dev team, and me.
Below: The sound progression and listing spreadsheet.
This project was the first made with Unreal Engine for many of us, including me. Blueprints, Sound Cues, Sound Classes… I learnt the terminology and the workflow step by step, also getting used to the nodes within the Sound Cue editor.
Below: The Sound Cue for the waterfalls, using distance as a crossfade parameter between the close-up sound and the faraway sound.
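To illustrate the idea outside the engine, here is a minimal sketch of the distance-based crossfade logic in Python. The distance thresholds are made-up placeholder values, not the ones used in the actual Sound Cue:

```python
def waterfall_crossfade(distance, near_end=500.0, far_start=2000.0):
    """Return (near_volume, far_volume) weights for a distance-based
    crossfade between the close-up and faraway waterfall layers.
    Distances are in Unreal units (cm); the thresholds are illustrative."""
    if distance <= near_end:
        return 1.0, 0.0  # only the close-up layer plays
    if distance >= far_start:
        return 0.0, 1.0  # only the distant layer plays
    # linear blend inside the transition band
    t = (distance - near_end) / (far_start - near_end)
    return 1.0 - t, t
```

Halfway through the transition band, `waterfall_crossfade(1250.0)` returns equal weights of 0.5 for each layer, so the two recordings blend seamlessly as the player approaches.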

The music in particular was adapted at the composition stage for the sake of integration, the integration process itself becoming part of composing for the game.
Over those 3 months, I got familiar with the game engine, sometimes getting help from our talented developers when I needed to integrate sound into a level's Blueprint.
One of the most complex examples was the way the adaptive footsteps were made.
Below: The Blueprint called BP_FootstepAnimNotify

This Blueprint's code creates an invisible detector ray under the protagonist's feet, analysing the Material Properties it passes through.
Those Material Properties were assigned to 5 different categories:
- Grass/Earth
- Wood
- Rock
- Ice
- Plant platform
This Blueprint is made to be placed on the protagonist's walk cycle timeline. When it is triggered, it calls the Sound Cue that corresponds to the detected material, each Sound Cue properly randomising the sample and the pitch variation.
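The footstep logic described above can be sketched in plain Python. The surface categories come from the list above, but the sample pool sizes, pitch range, and fallback category are illustrative assumptions, not values from the actual project:

```python
import random

# Surface categories from the project, mapped to the number of samples
# in each pool. The pool sizes here are placeholder values.
FOOTSTEP_CUES = {
    "Grass/Earth": 6,
    "Wood": 6,
    "Rock": 6,
    "Ice": 6,
    "Plant platform": 6,
}

def play_footstep(surface, rng=random):
    """Pick a randomised sample index and pitch for the detected surface,
    mimicking the Random and Modulator behaviour of a Sound Cue."""
    if surface not in FOOTSTEP_CUES:
        surface = "Rock"  # assumed fallback for unrecognised materials
    sample = rng.randrange(FOOTSTEP_CUES[surface])
    pitch = rng.uniform(0.95, 1.05)  # slight pitch variation per step
    return surface, sample, pitch
```

Each time the walk cycle's notify fires, it would call something like `play_footstep()` with whatever material the downward ray reported, so consecutive steps never sound identical.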
The adventure is far from over, since our work on the demo is bringing in new concepts, new sounds, new ideas to explore, and with it, of course, the opportunity to polish the sound integration and mixing even further.
