
INTERMEDIATE INTERACTIVE AUDIO
I have created copyright-free interactive audio for my Intermediate Interactive Audio project, integrating Wwise and Unity for the "Courtyard" videogame level. The audio has been sourced from several places: my own expanding sound library, sounds provided by the university module leaders, and sounds I recorded myself specifically for this project, using a Zoom H5 Handy Recorder and a Rode NT1-A, which were then added to my library. Most sounds were edited beforehand in Pro Tools and then imported into Wwise, where they were edited further if necessary. All clips have been linked to objects in the Unity scene, either newly added objects or ones already present. Some objects required scripts, such as the FPS Controller or the day-light cycle, which were edited in Visual Studio.
In Wwise, the sounds cover the different layers of a game soundtrack, including ambience, foley, character effects, SFX, and dialogue, giving every object and event a sound that suits the setting: an abandoned dome or temple in the middle of the desert, a few years from now. All the clips were chosen to be coherent with this time and place. Finally, all sounds were mixed in Wwise, using the mixing desk and Soundcaster sessions provided by the software itself.
Various containers have been used to group all the audio clips and tie everything together. Random containers were used frequently to introduce randomization, create a more natural environment, and prevent a repetitive setting. This randomization appears in several places. In the ambient animal sounds, stretches of silence were added so as not to overwhelm the scene. In the character's footsteps, randomization means a different sound plays each time the player's foot touches the ground, and the sounds change depending on which surface the character walks on. This surface switching works through Switch states and scripts added in Unity, which are explained in more detail in the technical notes. To place sound at a specific location, events are posted from specific objects, so the character triggers them while exploring or approaching different areas of the temple. Examples include fluorescent lights, small waves, tension heartbeats, music, and secondary character voices. To control this interaction between the player and the objects in the scene, colliders have been set up so that when the character comes within a certain distance of an object, the associated sound is triggered.
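A minimal sketch of how the surface-dependent footsteps and the collider-based triggers described above could look in the Unity scripts, using the Wwise Unity integration API. All event, switch, and tag names here (e.g. "Footsteps_Play", "Ground_Material", "Sand") are illustrative assumptions, not the project's actual Wwise identifiers:

```csharp
using UnityEngine;

// Sketch: set the Wwise switch for the surface under the player,
// then post a footstep event. A random container in Wwise then
// picks one of several clips for that surface.
public class FootstepSurface : MonoBehaviour
{
    // Could be called from an animation event on each footfall.
    void OnFootstep()
    {
        // Raycast down to identify the surface beneath the player.
        if (Physics.Raycast(transform.position, Vector3.down, out RaycastHit hit, 2f))
        {
            // Map the surface's tag to a Wwise switch state (assumed names).
            string surface = hit.collider.CompareTag("Sand") ? "Sand" : "Stone";
            AkSoundEngine.SetSwitch("Ground_Material", surface, gameObject);
        }
        AkSoundEngine.PostEvent("Footsteps_Play", gameObject);
    }
}

// Sketch: a collider marked as a trigger posts a Wwise event
// when the player enters its radius (a proximity trigger).
public class ProximitySound : MonoBehaviour
{
    public string wwiseEvent = "Play_FluorescentLight"; // assumed name

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
            AkSoundEngine.PostEvent(wwiseEvent, gameObject);
    }
}
```

In this sketch, the script only posts events and sets switches; which clip actually plays, and how it is randomized, stays entirely inside the Wwise project.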
All of this helps to create a believable environment and aids immersion.
As for the music, both vertical and horizontal techniques have been implemented, and the music changes depending on where the player is. Some tracks play with fewer or more layers depending on the character's distance from a particular object. This has been achieved with States and Real-Time Parameter Controls (RTPCs). The NPCs are mostly tied to the horizontal music tracks.
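The distance-driven layering could be sketched as a small Unity script that feeds the player's distance into a Wwise RTPC each frame; in Wwise, that RTPC would then be mapped onto the volume of the fuller music layers. The RTPC name "Player_Distance" is an assumption for illustration, not the project's actual parameter:

```csharp
using UnityEngine;

// Sketch: drive a Wwise RTPC from the player's distance to this object,
// so the vertical music gains or loses layers as the player approaches.
public class MusicDistanceRTPC : MonoBehaviour
{
    public Transform player;          // assigned in the Inspector
    public float maxDistance = 30f;   // beyond this, the music stays "empty"

    void Update()
    {
        float distance = Vector3.Distance(player.position, transform.position);

        // Clamp to the range the RTPC curve expects in Wwise.
        float value = Mathf.Clamp(distance, 0f, maxDistance);
        AkSoundEngine.SetRTPCValue("Player_Distance", value, gameObject);
    }
}
```

Keeping the mapping curve in Wwise rather than in the script means the mix can be retuned without touching the Unity project.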
The game is structured as a pickup game: the player needs to collect coins for a strange character. We are not sure why, but something happens once the character has picked up all the coins.