arindavis

So, You Want to Know A Little More About Video Game Sound?

Sound design is one of those creative skills that, when done just right, goes completely unnoticed by the end user. If a sound calls attention to itself instead of pulling you deeper into the experience, then the designer isn't doing their job right. That line between immersion and distraction is even blurrier and more tenuous than you, the average game enjoyer, might realize.

Quick disclaimer: Although I have experience in sound design, musical composition for film, and general full stack software development, I have not worked in the game industry. This week's blog is meant as a stepping stone for someone who knows next to nothing about sound design in video game development and wants to be pointed in the right direction.

So where should we start with such an all encompassing topic? The history, of course.

When the first commercial games were being released in the seventies, most video game consoles had less sound-rendering capability than a modern dishwasher. Take, for example, Pong, an all-time classic.

Every distinct sound you hear here, whether it's the ball hitting a player's paddle, bouncing off the wall, or a point being scored, is the same exact sound, simply stretched or squished to produce the desired effect. Even within the strict limitations of the Atari hardware, the developers made it work. This is a great precursor to everything else we are going to talk about.
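If you want to picture what "one sound, stretched or squished" looks like in code, here's a tiny sketch. To be clear, this is not how the original Atari hardware did it (Pong's sounds came straight out of the machine's sync circuitry), and the frequencies and rates below are made up; it just shows how re-pitching a single stored tone can yield several distinct game sounds.

```cpp
// A rough sketch of the Pong trick: one source tone, re-pitched by
// resampling. The tone parameters here are illustrative, not the
// actual Atari hardware values.
#include <cmath>
#include <vector>

// Generate a plain square-wave beep at a given frequency and length.
std::vector<float> makeBeep(float freqHz, float seconds, int sampleRate = 44100)
{
    std::vector<float> out(static_cast<size_t>(seconds * sampleRate));
    for (size_t i = 0; i < out.size(); ++i) {
        float phase = std::fmod(freqHz * i / sampleRate, 1.0f);
        out[i] = phase < 0.5f ? 0.5f : -0.5f;
    }
    return out;
}

// "Stretch or squish" the same beep: a rate > 1 plays it faster and
// higher pitched, a rate < 1 slower and lower, just like changing the
// playback speed of one stored sound.
std::vector<float> repitch(const std::vector<float>& src, float rate)
{
    std::vector<float> out(static_cast<size_t>(src.size() / rate));
    for (size_t i = 0; i < out.size(); ++i)
        out[i] = src[static_cast<size_t>(i * rate)];
    return out;
}

// One beep, three game sounds:
//   auto base   = makeBeep(440.0f, 0.1f);
//   auto paddle = repitch(base, 1.0f);  // the original
//   auto wall   = repitch(base, 0.5f);  // lower, longer "thud"
//   auto score  = repitch(base, 2.0f);  // higher, shorter "blip"
```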

As the industry progressed into the '80s and '90s, the tech got a lot better, especially with the rise of companies like Nintendo, but the same limitation that plagued Atari kept rearing its ugly head: memory. The average size of a game on the Nintendo Entertainment System (1983), often shortened to NES, was a mere 384 KB. And as if there weren't already enough creative constraints, the NES was limited to just five audio channels: two pulse waves, a triangle wave, a noise channel, and a rudimentary sample channel.

All of that considered, here's what Koji Kondo, composer for Super Mario Bros., was able to do with it:

From Wikipedia:

The game's melodies were created with the intention that short segments of music could be endlessly repeated during the same gameplay without causing boredom. Kondo's soundtrack to Super Mario Bros. gained worldwide recognition, and is the most well-known video game score in history. The main theme is iconic in popular culture and has been featured in more than 50 concerts, been a best-selling ringtone, and been remixed or sampled by various musicians.

The lesson we take from these two famously cherry-picked examples? Hardware limitations can often lead to major creative innovations.

Fast forward to the turn of the century. The video game industry is growing at an exponential rate. Releases like Doom (1993), Half-Life (1998), and Halo (2001) have repeatedly broken sales records, and video games are becoming mainstream.

With that rapid rise in popularity comes more standardization across the industry. As expectations continue to climb, game studios find themselves turning to established game engines like Unreal Engine and audio middleware like Wwise.

Which is still the industry standard today!

Wwise acts as a sort of DAW (digital audio workstation) for video game sound designers and composers, with a heavy emphasis on logic and event-based timing in the game world. It hooks into the game engine's event-triggering systems to ensure the right sound plays when it needs to, how it needs to, and in the correct place relative to the player.
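To make that a little more concrete, here's a minimal sketch of what triggering a sound through Wwise's C++ SDK roughly looks like from the game-code side. The event name, game object ID, and surrounding functions are hypothetical stand-ins rather than anything from a real project; the event itself, and what it actually plays, is authored by the sound designer inside Wwise.

```cpp
// Minimal sketch of firing a Wwise event from game code via the C++ SDK.
// "Play_Footstep" and the game object ID are hypothetical; the event is
// authored by the sound designer inside the Wwise project.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

static const AkGameObjectID kPlayerObjectID = 100;

void RegisterPlayerEmitter()
{
    // Every sound emitter is a "game object" that Wwise tracks.
    AK::SoundEngine::RegisterGameObj(kPlayerObjectID, "Player");
}

void OnPlayerFootstep(float x, float y, float z)
{
    // Tell Wwise where the emitter is, so panning and attenuation are
    // computed relative to the listener.
    AkSoundPosition pos;
    pos.SetPosition(x, y, z);
    pos.SetOrientation(0.0f, 0.0f, 1.0f,   // front vector
                       0.0f, 1.0f, 0.0f);  // top vector
    AK::SoundEngine::SetPosition(kPlayerObjectID, pos);

    // Post the event; which sound actually plays, with what variation
    // and mixing, is decided by the designer's setup in Wwise.
    AK::SoundEngine::PostEvent("Play_Footstep", kPlayerObjectID);
}
```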

This tool is used by sound designers and composers alike, as musical timing can be the key to maintaining our suspension of disbelief in a game world. One of my all-time favorite musical cues in a video game can be found in Naughty Dog's The Last of Us:

And although the implementation might seem deceptively simple (play the music when the cute animals appear), there's a lot of work happening behind the scenes to make sure the timing is just right. The player could choose to walk slowly or at a brisk pace, wait around before getting there, or even try to speed through it as fast as possible, and the music still needs to sound organic. No matter what. That means the music itself, including the build-up, the execution of the leitmotif, and the resolution, needs to dynamically change depending on the player's input.

Without the player ever noticing. Seems like a tall order, right?
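In Wwise terms, this kind of adaptive music is usually driven by game syncs: states and real-time parameters (RTPCs) that the interactive music system reacts to. Here's a rough sketch of what the game-side code might look like; the names ("Music_Tension", "Music_State", "Scene_Climax") are hypothetical, and the actual musical behavior, where transitions are allowed and how the build-up and resolution happen, is authored by the composer inside Wwise rather than in code.

```cpp
// Rough sketch: driving an adaptive music system from game code.
// "Music_Tension", "Music_State", and "Scene_Climax" are hypothetical
// names; the real behavior (transition points, build-up, resolution)
// lives in the composer's Wwise project, not here.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

// Called continuously as the player approaches the scripted moment.
void UpdateMusicTension(float distanceToMoment, float maxDistance)
{
    // Map "how close the player is" onto a 0-100 parameter the
    // interactive music can respond to smoothly, however fast or
    // slow the player chooses to get there.
    float tension = 100.0f * (1.0f - distanceToMoment / maxDistance);
    AK::SoundEngine::SetRTPCValue("Music_Tension", tension);
}

// Called once when the scripted moment actually triggers.
void OnSceneMomentReached()
{
    // Switch the music state; Wwise waits for a musically sensible
    // point (a bar, a beat, a custom cue) before transitioning, so
    // the leitmotif lands cleanly no matter when this fires.
    AK::SoundEngine::SetState("Music_State", "Scene_Climax");
}
```

The nice part of this split is that the programmer only exposes "how close is the player" and "did the moment happen," while the composer decides what the music actually does with that information.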

And even with all the fancy advancements we've made in game technology since Pong, memory is still the primary limiting factor for most developers. Modern AAA games often exceed 100 gigs, which is enough space to hold millions of copies of Super Mario Bros. with room to spare. A big contributor to that bloated size is audio, especially in open-world games like Cyberpunk 2077, Red Dead Redemption II, or GTA V, where musical cues, sound objects, and event systems become exponentially more complex as the game world expands outward in all directions.

Like being a software dev, being a sound engineer or composer in the games industry can often be a thankless and tiring job with a few gleaming moments of resounding success. At best, the audience won't even notice you did anything; at worst, you'll completely take them out of the experience. But when you get it right, as we've just explored, you create a seamless magic act that stays with your player long after they put the controller down.
