Problems/Suggestions : Click here and post in the forum or mail to 5argon@exceed7.com

Native Audio

Lower audio latency via the OS's native audio library.
Requirement : Unity 2017.1 and above. For iOS and Android.

Latest : v1.0 (25/12/2017)

Release Note

Unity adds audio latency

This plugin was developed when I noticed that my iOS game had considerably worse audio latency than other music apps, such as GarageBand, installed on the same device. I confirmed this by creating a basic Unity app which just plays a sound on touch down vs. a basic Xcode iOS app/Android Studio app which also plays a sound on touch down. You can clone the project on GitHub to confirm this by yourself.
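
If you want to recreate the Unity side of that test yourself, it is as basic as this (just an AudioSource with a clip assigned in the Inspector; the names are mine, not from the test project):

```csharp
using UnityEngine;

// Minimal Unity side of the latency test : play a sound on the first touch down.
// Assumes an AudioSource with an AudioClip assigned in the Inspector.
public class TouchToSound : MonoBehaviour
{
    public AudioSource audioSource;

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            audioSource.Play();
        }
    }
}
```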

If you can't feel the difference, I encourage you to do a simple sound wave recording test. Keep your phone at the same distance from your computer's mic and tap the screen with your fingernail. The interval between the peak of the nail's sound and the peak of the response sound should immediately reveal the difference visually. Newer devices might have better latency, but the gap between Unity and a native app should show up on every device.

Calling into the native side directly

Unity has an internal mixing and automatic audio track management system, backed by another layer, FMOD. That's why you can go crazy with a method like AudioSource.PlayOneShot without creating any audio track, and the sound magically overlaps itself as if you owned many AudioSources. You didn't even load the audio and it just works! This is great design for a game engine.
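
To illustrate that convenience, this little script is all it takes; the overlapping and mixing are handled for you (the names here are just for illustration):

```csharp
using UnityEngine;

// One AudioSource, yet repeated PlayOneShot calls overlap freely because
// Unity mixes each shot on an internal channel behind the scenes.
public class OneShotDemo : MonoBehaviour
{
    public AudioSource audioSource;
    public AudioClip hitSound;

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            audioSource.PlayOneShot(hitSound); // no manual track management needed
        }
    }
}
```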

But unfortunately, all of these conveniences add more and more audio latency. For genres of apps and games that need critical audio timing, this is not good. Naturally, the idea for fixing this is to send the audio data over and call the native methods directly, bypassing Unity's audio path entirely.

I researched the fastest native way on each platform and found that for iOS it is to use AVAudioPlayer (Objective-C/Swift) and for Android it is to use AudioTrack (Java). For Android, you might know that there is OpenSL ES, which is even more native, accessible with the NDK and C++. But from my extensive tests I found no latency difference from AudioTrack at all.
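
To give an idea of what "calling into the native side" means, here is a rough sketch of reaching Android's AudioTrack from Unity C# through the JNI bridge. (The plugin itself does this work on the Java side; the constants, formats, and buffer values here are just illustrative, and this of course only runs on an Android device.)

```csharp
using UnityEngine;

// Rough illustration of driving Android's AudioTrack from Unity C# via JNI.
// Android only; the real plugin keeps this logic in Java.
public static class AudioTrackSketch
{
    public static AndroidJavaObject LoadStatic(byte[] pcm16Mono44k)
    {
        const int STREAM_MUSIC = 3;       // AudioManager.STREAM_MUSIC
        const int CHANNEL_OUT_MONO = 4;   // AudioFormat.CHANNEL_OUT_MONO
        const int ENCODING_PCM_16BIT = 2; // AudioFormat.ENCODING_PCM_16BIT
        const int MODE_STATIC = 0;        // AudioTrack.MODE_STATIC

        var track = new AndroidJavaObject("android.media.AudioTrack",
            STREAM_MUSIC, 44100, CHANNEL_OUT_MONO, ENCODING_PCM_16BIT,
            pcm16Mono44k.Length, MODE_STATIC);

        // Upload the whole clip once; MODE_STATIC keeps it in memory for replay.
        track.Call<int>("write", pcm16Mono44k, 0, pcm16Mono44k.Length);
        return track;
    }

    public static void Play(AndroidJavaObject track)
    {
        track.Call("play"); // bypasses Unity's mixer entirely
    }
}
```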

But having to interface with multiple different sets of libraries separately from Unity is a pain, so Native Audio is here to help...

Native Audio is a plugin that helps you easily load and play audio using each platform's fastest native method, from the same common interface in Unity.
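
In practice, the goal is a workflow like this. (A hypothetical sketch of such a common interface; the exact method names in the plugin may differ.)

```csharp
using UnityEngine;

// Hypothetical usage sketch of a common load-then-play interface;
// the real plugin's method names may differ from these.
public class NativeAudioExample : MonoBehaviour
{
    void Start()
    {
        // One call loads the file with AVAudioPlayer on iOS
        // or AudioTrack on Android behind the scenes.
        NativeAudio.Load("hit.wav");
    }

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            NativeAudio.Play("hit.wav"); // plays via the native side, skipping Unity's mixer
        }
    }
}
```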

Evidence backed by an extensive personal research

I developed this plugin as a solution for my own game, which needs every possible little bit of faster feedback. Please watch this 1-take video, which is my final proof that the plugin has a real benefit.

A detailed write-up of this experiment is available here

You can also watch my first attempt, which reached a wrong conclusion, or read the first experiment's notes here.

Native Audio is only one part of the solution

From the research video I concluded that the best audio experience is hampered by BOTH audio latency (solved with Native Audio) and touch latency.

Native Audio helps reduce the audio latency, but you can further reduce the perceived audio latency with iOS Native Touch. (In fact, it helps even more than Native Audio.)

Visit iOS Native Touch

Unfortunately, going native with touch doesn't help on Android, so Native Audio is all I can do to fight the latency problem there.

How much faster can Native Audio help?

Each device is going to have a different latency improvement depending on how it "handles" the things that Unity adds. (If it handles them badly, then we get a bigger gain by bypassing to the native side.) But for starters, let's look at my iPod Touch Gen 5 and Nexus 5.

(Download this Ableton Live project) (Projects used : 1 2 3 4)

This project shows the interval from the peak of the nail hitting the touchscreen to the peak of the response sound wave. Don't pay attention to the absolute interval numbers; this measurement is non-standard, so they are meaningless on their own. (A loopback cable latency test on iOS usually results in something as low as 10 ms.) Instead, we should focus on the difference from the native time.

| | iOS Xcode Native | iOS Unity + Best Latency | iOS Unity + Native Audio | iOS Unity + Native Audio + Native Touch |
| --- | --- | --- | --- | --- |
| Difference from native | - | 45 ms | 33 ms | 16 ms |
| Latency reduced from the previous step | - | +45 ms | -12 ms | -17 ms |
| How much does a particular step help | - | -100% | 26.67% | 37.78% |

On iOS, adding Native Touch helps more than Native Audio. Having both together, we have reduced 64.45% of the total 45 ms difference from the ideal native time. In one of my tests I even managed to reach 100% and match native performance, but on average it is like this. And don't forget that the iPod Touch Gen 5 is quite old; newer devices might have even better native performance due to newer chips, etc.

| | Android Studio Native | Android Unity + Best Latency | Android Unity + Native Audio |
| --- | --- | --- | --- |
| Difference from native | - | 34 ms | 28 ms |
| Latency reduced from the previous step | - | +34 ms | -6 ms |
| How much does a particular step help | - | -100% | 16.67% |

NOTE : If you don't use the "Best Latency" (smallest buffer size) audio setting in Unity before comparing with Native Audio, the improvement will "appear" to be larger.
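
For reference, "Best Latency" in Edit > Project Settings > Audio corresponds to the smallest DSP buffer size. You can also inspect or change the buffer from script; the 256 below is just an example value:

```csharp
using UnityEngine;

// Inspect and shrink Unity's DSP buffer at runtime.
// "Best Latency" in Project Settings > Audio maps to the smallest buffer.
public class DspBufferCheck : MonoBehaviour
{
    void Start()
    {
        AudioConfiguration config = AudioSettings.GetConfiguration();
        Debug.Log("Current DSP buffer size: " + config.dspBufferSize);

        config.dspBufferSize = 256;  // smaller buffer = lower latency, more CPU
        AudioSettings.Reset(config); // note : Reset interrupts all playing audio
    }
}
```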

For Android, the benefit of Native Audio is so pitiful that I almost wanted to drop the support altogether. But since Android is always fighting with its latency, there is a chance that on newer devices or in future OS versions the native method will do a better job than this. (For example, they just added an interesting flag to the native API in Oreo.) And at the same time, there might be moments when Unity is late in taking advantage of some new native feature. In the end, I decided to keep Android support.

Why do you need... ?

Certain kinds of games or apps rely heavily on feedback sound. The keyword is not "audio application" but "feedback sound". For example, if you are making a music player, that is clearly an audio app, but audio latency won't affect the experience at all, because all the interaction you do is press play and listen. It's not as if the entire song is ruined when it starts a bit late. The core experience is the song itself, not the timing.

But what if a feedback sound lags? It is not a concern for non-gameplay elements, like a UI button that makes a sound when you press it, but imagine a drumming game where you have to hit at the correct moment. If you hit perfectly and the game says so, the sound still comes late. If you hit early, the game punishes you, but the sound lands exactly on time. It's this kind of problem.

Click to learn more about 3 classes of musical apps.

3 Classes Of Musical Apps

Sequencer

Applications like a digital audio workstation (DAW) on a mobile phone, or live-performance musical apps like loopers and Launchpad, fall into this category. The app is interactive, but the references for what counts as the "correct" timing are all controllable. Imagine you start a drum loop. Each sound might be delayed depending on the device, but all the delays are equal, resulting in a perfect sequence, albeit with a variable start time. When starting another loop, it is 100% possible for the software to compensate and match the beat that is currently playing, as in the sketch below. This class of application is immune to mobile audio latency.
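
Inside Unity, for example, this kind of compensation is what AudioSource.PlayScheduled on the DSP clock is for. (A minimal sketch; the BPM math and names are illustrative.)

```csharp
using UnityEngine;

// Sequencer-class compensation : schedule a second loop so it lands
// exactly on the next bar boundary of a loop that is already playing.
public class LoopSync : MonoBehaviour
{
    public AudioSource firstLoop;
    public AudioSource newLoop;
    public double bpm = 120.0;
    public int beatsPerBar = 4;

    double loopStartDspTime; // DSP time at which the first loop started

    public void StartFirst()
    {
        loopStartDspTime = AudioSettings.dspTime + 0.1; // small safety lead time
        firstLoop.PlayScheduled(loopStartDspTime);      // sample-accurate start
    }

    public void StartSynced()
    {
        double secondsPerBar = (60.0 / bpm) * beatsPerBar;
        double elapsed = AudioSettings.dspTime - loopStartDspTime;
        // Round up to the next bar boundary so the new loop starts on the beat.
        double nextBar = loopStartDspTime +
            System.Math.Ceiling(elapsed / secondsPerBar) * secondsPerBar;
        newLoop.PlayScheduled(nextBar);
    }
}
```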

Instrument

Apps like GarageBand (in live playing mode) are in this category. The sound has to respond when you touch the screen. Latency can impact the experience, but if you are rehearsing by yourself you might be able to ignore it, since if you play perfectly, the output sounds will all have equal latency and will be perfect, just with a bit of delay.

Music Games

There are many music games on mobile phones, like Cytus, Deemo, Dynamix, VOEZ, Lanota, etc. If there is a sound feedback on hitting a note, this is the hardest class of the latency problem. Unlike the Sequencer class, even though the song is predictable and the game knows all the notes at all points in the song, you cannot predict whether a sound will play or not, since it depends on the player's performance. (Unless the sound is played regardless of hit, miss, or bad judgement, in which case this class can be reduced to the Sequencer class.) It is also harder than the Instrument class, since now we have a backing track playing as a reference and also a visual indicator. If you hit on time according to the visuals or the music, you will get a "Perfect" judgement, but the sound will be off from the backing track. When this happens, even though you already got the Perfect, you will automatically adapt by hitting earlier to make the response sound match the song, in which case you will no longer get the Perfect judgement. In the Instrument class, if you are live jamming with others this might happen too, but there, if you adapt by playing early, you get an accurate sound and are not punished by a judgement like in games.

What I am making is a music game. Even a little bit of latency will be very obvious. Since there is a beat in the song for reference, players will be able to tell right away that they are hearing 2 separate sounds (the beat in the song and the response sound) even if they score a Perfect.
