Problems/Suggestions : Click here and post in the forum or mail to 5argon@exceed7.com

iOS Native Touch

Low-latency iOS touch events. Straight from the OS.
Requirement : Unity 2017.1 and above. For iOS platform only.

Latest : v1.0 (25/12/2017)

Release Note

Did you know your iOS device could do better?

And it's Unity's fault. Out of curiosity, I developed this plugin when I noticed that my iOS game had considerably worse audio latency than other music apps, such as GarageBand, installed on the same device. I confirmed this by creating a basic Unity app which just plays a sound on touch down vs. a basic Xcode iOS app which also plays a sound on touch down. You can clone the project on GitHub to confirm this yourself.

If you can't feel the difference, I encourage you to do a simple sound wave recording test. Keep your phone at the same distance from your computer's mic and tap the screen with your fingernail. The interval between the peak of the nail's sound and the peak of the response sound in the recorded waveform should immediately reveal the difference visually. A newer device might have better latency, but the difference between the Unity app and the native app should be proportionally similar on every device.

It's because of the touch

Naturally, I suspected that Unity had added some audio processing pipeline, and proceeded to develop a Native Audio plugin.

It helps, but more than half of the difference still remains! What else could differ between this simple Unity app and the Xcode one? (Besides the possibility that my Native Audio plugin sucks.) Thinking carefully, my app plays the test sound on touch, and it turns out the real culprit is the touch handling.

I found that instead of polling Input.touches or waiting for a pointer-down event from uGUI's EventSystem, letting the OS tell Unity directly whenever a touch happens is a lot faster. After making another test app which applies both iOS Native Touch and Native Audio, the audio latency of Unity is now comparable to the Xcode one.
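For a rough picture of what "letting the OS tell Unity directly" means in code, here is a minimal C# sketch contrasting the two approaches. This is not this plugin's actual API: the delegate type, the way it would be registered with the native side, and PlayFeedbackSound() are all hypothetical names used only for illustration.

using AOT;
using UnityEngine;

public class TouchComparisonSketch : MonoBehaviour
{
    // 1) The usual Unity way: poll once per frame. Your script only sees the
    //    touch after Unity's own input update for that frame.
    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            if (Input.GetTouch(i).phase == TouchPhase.Began)
                PlayFeedbackSound();
        }
    }

    // 2) The native way: iOS calls straight back into this static method from
    //    its own touch handling (touchesBegan), without waiting for Unity's
    //    input polling. How the delegate gets handed to the native side (for
    //    example via a [DllImport("__Internal")] registration call at startup)
    //    is omitted here.
    delegate void NativeTouchDelegate(float x, float y); // hypothetical

    [MonoPInvokeCallback(typeof(NativeTouchDelegate))]
    static void OnNativeTouch(float x, float y)
    {
        PlayFeedbackSound();
    }

    static void PlayFeedbackSound()
    {
        // Play the response sound here, e.g. with Native Audio.
    }
}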

Evidence backed by extensive personal research

I developed this plugin as a solution for my own game, which needs every little bit of faster feedback it can get. Please watch this 1-take video, which is my final proof that the plugin has a real benefit.

A detailed write-up of this experiment is available here

You can also watch my first attempt, which ended with a wrong conclusion, or read the first experiment's notes here.

Is it THAT much faster? What is the significance?

No! Probably not significant at all for most games. That's why there are hundreds of existing iOS Unity games out there with this touch delay, and people are enjoying them without complaints. Tap, flick, swipe, pinch. They all work properly.

This is by no means an easy drop-in replacement for your touch handling. iOS Native Touch lets iOS report touch events to Unity faster and more directly, in exchange for a more difficult setup.

The way you check for native touch events is not the same as usual. iOS Native Touch will make your life harder in exchange for a little speed gain. If you care about THAT possible speed-up, you know you want it. We are the same! : )

( Download this Ableton Live project ) ( Projects used : 1 2 3 4 )

This is a project showing the nail-peak-to-response sound waves under various circumstances. Don't pay attention to the absolute time interval numbers; this measurement is non-standard, so they are useless on their own. (A loopback-cable latency test on iOS usually results in something as low as 10 ms.)

Configuration                               Difference from native   Change from previous step   How much the step helps
iOS Xcode native app                        -                        -                           -
iOS Unity + Best Latency                    45 ms                    +45 ms                      -100%
iOS Unity + Native Audio                    33 ms                    -12 ms                      26.67%
iOS Unity + Native Audio + Native Touch     16 ms                    -17 ms                      37.78%

But take note of the relative differences. With Unity alone we are 45 ms away from the native Xcode latency. Native Audio cuts 12 ms off that gap (12/45 ≈ 26.67%), and iOS Native Touch cuts another 17 ms (17/45 ≈ 37.78%).

23 ms is approximately the latency a pianist can accept. Together, iOS Native Touch + Native Audio make roughly that much of a difference on an iPod Touch Gen 5.

1. Where it matters : for the fastest touch

Maybe your game is not just simple taps or swipes/flicks, but is built around gesture actions, like a grid-type game that is not a match-3, or one where complex line drawing is the main experience. It's always a good idea to improve the area your player experiences most often, the core gameplay, no matter how small the improvement is. It multiplies and makes a difference.

2. Where it matters : for better audio feedback experience

It sounds funny that a faster touch fixes an audio latency problem. But thinking about it carefully, all the bad audio experiences start from the user's input, be it a button feedback sound, a drumming app, or a music game with sound feedback. The keyword here is "feedback".

By default Unity does a good job of playing audio as soon as you issue it. That's why my Native Audio does not help that much. And even when it doesn't, you can compensate by playing the sound earlier ("audio calibration"), as is the common solution on Android.

But the problem is that audio is usually issued immediately as a result of the player's interaction, which means a touch, unless you are making a maracas-shaking game. You can't predict at which moment the sound will play, because you have to wait for the touch, and probably for more processing of that touch. (Is it at the correct position? At the correct moment? A "Perfect" sound or an "offbeat" sound?) You can't compensate, and the bottleneck you must go through is the touch event.
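To make the "audio calibration" idea above concrete, here is a minimal sketch of compensating a sound whose target time is known in advance. The class, the field names, and the 45 ms value are illustrative assumptions, not part of any plugin. Notice that it only works because the target time is known up front, which is exactly what a touch-triggered feedback sound never gives you.

using UnityEngine;

public class CalibratedScheduler : MonoBehaviour
{
    public AudioSource source;

    // Assumed per-device output latency; a real app would measure it or let
    // the user calibrate it.
    public double outputLatencySeconds = 0.045;

    // Make the sound be *heard* at targetDspTime by issuing it earlier.
    public void PlayHeardAt(double targetDspTime)
    {
        source.PlayScheduled(targetDspTime - outputLatencySeconds);
    }
}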


3 Classes Of Musical Apps

Sequencer

Applications like a digital audio workstation (DAW) on a mobile phone, or live-performance musical apps like Looper or Launchpad, fall into this category. The app is interactive, but the references for the "correct" timing are all under the app's control. Imagine you start a drum loop. Each sound might be delayed depending on the device, but all delays are equal, resulting in a perfect sequence, albeit with a variable start time. When starting another loop, it is 100% possible for the software to compensate and match the beat that is currently playing. This class of application is immune to mobile audio latency.
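As an illustration of that compensation (the names below are made up, not taken from any specific app), a Sequencer-class app can schedule a newly triggered loop onto the next beat boundary instead of starting it "now". Every sound then shares the same constant output delay and stays in sync:

using System;
using UnityEngine;

public class BeatQuantizedLooper : MonoBehaviour
{
    public AudioSource loopSource;
    public double bpm = 120.0;

    double gridStartDspTime;

    void Start()
    {
        // Remember when the beat grid started, on the audio (DSP) clock.
        gridStartDspTime = AudioSettings.dspTime;
    }

    // Called when the user taps a pad. Instead of playing immediately,
    // snap the loop's start to the next beat boundary.
    public void TriggerLoop()
    {
        double secondsPerBeat = 60.0 / bpm;
        double elapsed = AudioSettings.dspTime - gridStartDspTime;
        double nextBeat = Math.Ceiling(elapsed / secondsPerBeat) * secondsPerBeat;
        loopSource.PlayScheduled(gridStartDspTime + nextBeat);
    }
}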

Instrument

Apps like GarageBand (in live playing mode) are in this category. The sound has to respond when you touch the screen. Latency can impact the experience, but if you are rehearsing by yourself you might be able to ignore it, since if you play perfectly, the output sounds all have equal latency and will be perfect with a bit of delay.

Music Games

There are many music games on mobile phones, like Cytus, Deemo, Dynamix, VOEZ, Lanota, etc. If there is a sound feedback on hitting a note, this is the hardest class of the latency problem. Unlike the Sequencer class, even though the song is predictable and the game knows all the notes at every point in the song, you cannot predict whether a sound will play or not, since that depends on the player's performance. (Unless the sound is played regardless of hit, miss, or bad judgement, in which case this class can be reduced to the Sequencer class.) It is also harder than the Instrument class, since now there is a backing track playing as a reference, and a visual indicator as well. If you hit on time according to the visuals or the music, you will get a "Perfect" judgement, but the sound will be off from the backing track. When this happens, even though you already got the Perfect, you will automatically adapt and hit earlier to make the response sound match the song, in which case you will no longer get the Perfect judgement. In the Instrument class this might happen too if you are live-jamming with others, but there, if you adapt and hit early, you get an accurate sound and are not punished by a judgement like in games.

What I am making is a music game. Even a little bit of latency will be very obvious. Since there is a beat in the song for reference, players will be able to tell right away that they are hearing 2 separate sounds (the beat in the song and the response sound), even if they score a Perfect.

Compared to Android, iOS has always been praised for its godly low touch + audio latency, so low that you feel like you are actually touching a real cymbal in GarageBand. Do not let Unity ruin this reputation!

(What about Android?)

I have conducted research on Android as well. While Native Audio helps, surprisingly native touch does not help at all. It's either that Unity does not add any extra touch processing on Android, the device's fault, my fault, or that the overhead of Java calling back into C# offsets the benefit of native touch. But if I can't confirm that it is faster myself, I can't try to sell it.
