MaskShrine_Banner.png

Music Integration

I was in charge of working with our composer and sound designer to integrate all progression-based music cues and track changes throughout the game, in both Unreal and Wwise.

My direct responsibilities included working with our engineers to develop the tools necessary for this, communicating directly with our composer to realize his vision, and handling all Unreal implementation of music cues and volumes within each level. These cues were also integrated in Wwise and debugged to ensure that Unreal and Wwise worked together seamlessly.

To develop the tools necessary for implementation, our composer would relay his requirements to me, and I would bring them to the engineers so we could brainstorm how to achieve each effect in Unreal. Throughout this back-and-forth, I would return the resulting tools to the composer to verify they met his expectations. Together, we created an Airtable to track all of our music changes, with over 300 rows of musical transitions, stingers, and tracks.

In every level throughout the game, I would:

  • Create music volumes to encompass each area of gameplay

  • Add global variables that would change alongside gameplay progression to affect the music

  • Create other music volumes that would drive music levels via player distance

  • Add triggerable stingers based on location and progression

  • Create the Wwise music containers that matched our gameplay progression, and build the structure for the composer to slot his music into

  • Adjust Wwise transitions, RTPCs, and music timing to create our desired effect

  • Profile the game in Wwise and Unreal for bugfixing

  • Fix bugs so that all music progression persisted correctly through the save/load of regular gameplay progression
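The distance-driven volumes and progression variables above can be sketched as a small model. This is an illustrative Python sketch only, not the actual implementation (which lived in Unreal and Wwise, where the equivalent calls are `AK::SoundEngine::SetRTPCValue` and `AK::SoundEngine::SetState`); the class name `MusicVolume`, the radii, and the progression state names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class MusicVolume:
    """A trigger volume that drives a music RTPC by player distance."""
    center: tuple         # world position (x, y, z)
    inner_radius: float   # full music intensity inside this radius
    outer_radius: float   # silence beyond this radius

    def rtpc_from_distance(self, player_pos):
        """Map player distance to a 0-100 RTPC value (100 = closest)."""
        dist = sum((a - b) ** 2 for a, b in zip(player_pos, self.center)) ** 0.5
        if dist <= self.inner_radius:
            return 100.0
        if dist >= self.outer_radius:
            return 0.0
        # Linear falloff between the two radii
        t = (dist - self.inner_radius) / (self.outer_radius - self.inner_radius)
        return 100.0 * (1.0 - t)

# Hypothetical progression states, mirrored by a Wwise State Group that
# selects which music container plays.
PROGRESSION_STATES = ["Intro", "MidQuest", "Finale"]

def on_progression_changed(new_state):
    """Stand-in for setting the Wwise state when gameplay progresses."""
    assert new_state in PROGRESSION_STATES
    return ("MusicProgression", new_state)

vol = MusicVolume(center=(0, 0, 0), inner_radius=500, outer_radius=2000)
print(vol.rtpc_from_distance((0, 300, 0)))    # inside inner radius -> 100.0
print(vol.rtpc_from_distance((0, 1250, 0)))   # halfway through falloff -> 50.0
print(on_progression_changed("MidQuest"))
```

In the real integration, the distance value would be fed to Wwise each tick as an RTPC, and the state change would trigger the appropriate container transition authored in the Wwise music hierarchy.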

 

Music Maps created as a visual guide for the composer and me to plan the space.

 

An example of the music system working alongside gameplay.