Master game audio?

Is there an easy way to master game audio? I’d like a way to monitor the overall volume of the mix, avoid clipping, etc. How do pros mix audio in GDevelop?

Thank you! Morgan.

You can play sounds on channels and then change the channel’s volume; that changes the overall volume of every sound playing on that channel.

Example: [image attachment not preserved]
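To sketch the same idea outside GDevelop: conceptually, each sound’s final gain is its own volume multiplied by its channel’s volume. A minimal plain-JavaScript model (the `Mixer` class and its method names are made up for illustration, not GDevelop or HowlerJS API):

```javascript
// Hypothetical sketch of how per-channel volume works conceptually.
// Each sound's final gain = its own volume * the channel's volume.
class Mixer {
  constructor() {
    this.channels = new Map(); // channel id -> { volume, sounds }
  }
  playOnChannel(channel, soundVolume) {
    if (!this.channels.has(channel)) {
      this.channels.set(channel, { volume: 1.0, sounds: [] });
    }
    this.channels.get(channel).sounds.push({ volume: soundVolume });
  }
  setChannelVolume(channel, volume) {
    const ch = this.channels.get(channel);
    if (ch) ch.volume = volume;
  }
  // Effective gain of every sound currently on a channel
  effectiveGains(channel) {
    const ch = this.channels.get(channel);
    return ch ? ch.sounds.map((s) => s.volume * ch.volume) : [];
  }
}

const mix = new Mixer();
mix.playOnChannel(1, 1.0);    // music at full volume
mix.playOnChannel(1, 0.8);    // ambience a bit quieter
mix.setChannelVolume(1, 0.5); // duck the whole channel at once
console.log(mix.effectiveGains(1)); // [ 0.5, 0.4 ]
```

One channel-volume action scales every sound on the channel, which is why channels work well as music/SFX groups.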

Thank you. I understand how channel audio works. What I want to do is mix the final levels of everything and monitor (see) them visually. I’m wondering how people know their game audio is at just the right volume in comparison to other games, etc.

It’s kind of a guess & check sort of thing. Also, when you say “mix” are you wanting to add all the volume levels together and display that level to the user?

As someone who does a lot of troubleshooting around game streaming, I can tell you: No one mixes their game audio based off other games.

Heck, many developers don’t even mix their game audio against their previous entry in the series. Game A’s levels can sit perfectly at -20 to -16 dB in OBS, while the next game lands at -2 and blows out your audience’s ears. It doesn’t matter if it’s a big publisher game or a small indie.

Overall, master your game audio to itself, and give people separate volume sliders for master volume, music, and SFX. That’s the best you can do in most cases.

Could an extension be made that normalizes output audio? Imagine how great that would be - basically one action that ensures your audio is right: not too quiet, not distorted and clipping. Making sure your audio is within acceptable limits is a very important part of game design. In the extreme, you could damage someone’s hearing because they had their headphones set for a more standardized signal level.

As far as I’m aware: Is it possible? Maybe (though EXTREMELY complex). Is it likely? No, because that’s not normally how audio works. You can have analog clipping if your source file is clipping (due to too much gain or noise in the original recording hardware), but digital clipping usually depends on the listener’s hardware. If their volume is set too high and it’s a loud sound, or even if it’s a normal-volume sound and their audio is turned up too far, you will still get clipping.

So again, you master your volume levels internally against your own sources, and that is generally the extent of what you can do. As far as I know, nothing out there can truly auto-level your sources (compression filters and limiters are not instant; they gradually or quickly reduce the volume of a signal, but have no way to gauge whether what they’re reducing or boosting actually fits the rest of the mix).

As of yet, there is not a magic algorithm in production audio (game or otherwise) that can automatically adjust different sources to each other, because the computer does not know what “good” is.

I work in programs like Reaper and Audacity that show your overall volume (with the digital equivalent of a VU meter). You can normalize the audio to 0 dBFS and also avoid clipping.
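For anyone curious what those meters and the normalize button actually compute: peak normalization is just “find the largest sample, scale everything so the peak hits the target”. A rough sketch of the math in plain JavaScript (function names are mine, not any DAW’s API):

```javascript
// dBFS of a peak amplitude p (floats in -1..1) is 20 * log10(p);
// 0 dBFS means the peak sample is exactly full scale (1.0).
function peakDbfs(samples) {
  const peak = Math.max(...samples.map(Math.abs));
  return 20 * Math.log10(peak);
}

// Scale a buffer so its peak lands on a target dBFS level.
function normalizeTo(samples, targetDbfs) {
  const peak = Math.max(...samples.map(Math.abs));
  const targetPeak = Math.pow(10, targetDbfs / 20); // 0 dBFS -> 1.0
  const gain = targetPeak / peak;
  return samples.map((s) => s * gain);
}

const quiet = [0.1, -0.25, 0.2];        // peaks at 0.25 (~ -12 dBFS)
const loud = normalizeTo(quiet, 0);     // scale so the peak hits 0 dBFS
console.log(Math.max(...loud.map(Math.abs))); // 1
```

Note this only guarantees the loudest sample doesn’t clip; it says nothing about perceived loudness, which is what LUFS measures.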

I play lots of games on different systems that seem to have standardized audio volume, because they all sound good when played back to back. I can’t believe it’s just “mix it so it sounds pretty good and hope it works out.” I just want to make sure my audio volume is at a good baseline.

When I used to mix albums my friends and I recorded, we would compare them to professional studio mixes on different speakers. In the studios, they also played mixes on different speakers to make sure they were right.

I would just like to have more control over the final audio in GDevelop.

Maybe I can submit it as a feature request.

I’m totally with you, but again, there are no standard audio mix levels in game dev, and the closest thing to a “master mix” in the sound engine GDevelop uses is to adjust the master volume via events and adjust your source audio files beforehand.

Game studios in general mix against their own audio. And in many cases it literally is “mix it so it sounds how you expect for your game.” Most studios will adjust the gain on the sound clips they’re using to fit their desired output mix. They do not mix to 0 dB (and they wouldn’t want to, because 0 dBFS introduces digital clipping on most PCs until you start talking about 32-bit audio, but that’s a whole different mess).

Many games DO base a desired sound level off of LUFS, but there’s no standard “be at this LUFS level to match all other games.” Some studios internally use -23 LUFS (I believe EA put out a GDC talk on this a few years back), but there can be as much as 15 dB of difference between game publishers, not including indie devs.

Heck, many game consoles have implemented dynamic range compression options at the OS level explicitly because of how much game audio volume varies.

As far as feature requests:
The audio engine itself (HowlerJS) doesn’t track input dB or LUFS (nor output), so it isn’t something the GDevelop devs could implement on their own. The audio engine literally just has a number for volume, a multiplier applied to the original sound file’s volume (0 being 0%, 1 being 100%).

A feature request would either have to go to the HowlerJS library itself, to add some form of “total output dB” parameter, or you’d have to find an open-source JavaScript sound library that supports that type of per-channel audio monitoring in order for something to be displayable.
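For reference, the metering such a library would need to expose is roughly this: in a browser you would pull sample frames off the output (e.g. from a Web Audio `AnalyserNode` via `getFloatTimeDomainData`) and run them through something like the function below. The function itself is just math, and is my sketch, not HowlerJS or GDevelop API:

```javascript
// RMS level of one frame of samples, reported in dBFS.
// In a browser you would fill `frame` from an AnalyserNode's
// getFloatTimeDomainData(); here it's just an array of floats.
function rmsDbfs(frame) {
  let sum = 0;
  for (const s of frame) sum += s * s;
  const rms = Math.sqrt(sum / frame.length);
  return 20 * Math.log10(rms);
}

// A full-scale square wave has RMS 1.0, i.e. 0 dBFS;
// halving the amplitude drops the reading about 6 dB.
console.log(rmsDbfs([1, -1, 1, -1]).toFixed(1));         // "0.0"
console.log(rmsDbfs([0.5, -0.5, 0.5, -0.5]).toFixed(1)); // "-6.0"
```

Even with this, you’d only get a level readout to display; deciding what level is “right” is still on the developer, which is the whole problem discussed above.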

Edit: Also, unfortunately, this isn’t just a game-specific thing. Video content on Twitch has different recommended dB/LUFS levels than YouTube. Everyone thinks they have a better base volume, and it’s maddening.
