The game plays the clap sounds on the first 3 beats of the bar. These are played in time with the backing track.
When the user taps on the screen, a clap sound is played and the game checks whether the tap occurred within an acceptable time window.
Oboe provides the `AudioStream` class and associated objects to allow the sample to output audio data to the audio device. All other objects are provided by the sample.
Each time the `AudioStream` needs more audio data it calls `AudioDataCallback::onAudioReady`. This passes a container array named `audioData` to the `Game` object, which must then fill the array with `numFrames` of audio frames.
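To illustrate this contract (this is not the sample's actual code), a callback that receives `audioData` must write exactly `numFrames * channelCount` samples. Here a hypothetical stereo callback renders silence; the game instead fills the buffer with its mixed audio sources:

```cpp
#include <cstdint>

constexpr int32_t kChannelCount = 2; // assuming a stereo stream

// Illustrative stand-in for the callback described above: the stream
// asks for numFrames frames, i.e. numFrames * kChannelCount samples.
void onAudioReady(float *audioData, int32_t numFrames) {
    int32_t numSamples = numFrames * kChannelCount;
    for (int32_t i = 0; i < numSamples; ++i) {
        audioData[i] = 0.0f; // silence; the real game renders its mix here
    }
}
```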
The sample uses the following optimizations to obtain a low latency audio stream:
The `IRenderableAudio` interface (abstract class) represents objects that can produce frames of audio data. The `Player` and `Mixer` objects both implement this interface.
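A minimal sketch of what such an interface and mixer might look like; the names mirror the sample but the signatures are assumptions, and mono audio is used for brevity:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical sketch of the IRenderableAudio interface.
class IRenderableAudio {
public:
    virtual ~IRenderableAudio() = default;
    // Write numFrames frames of audio into audioData.
    virtual void renderAudio(float *audioData, int32_t numFrames) = 0;
};

// A trivial source outputting a constant value, standing in for Player.
class ConstantSource : public IRenderableAudio {
public:
    explicit ConstantSource(float value) : mValue(value) {}
    void renderAudio(float *audioData, int32_t numFrames) override {
        for (int32_t i = 0; i < numFrames; ++i) audioData[i] = mValue;
    }
private:
    float mValue;
};

// A mixer sums the output of all attached sources.
class Mixer : public IRenderableAudio {
public:
    void addTrack(IRenderableAudio *track) { mTracks.push_back(track); }
    void renderAudio(float *audioData, int32_t numFrames) override {
        std::vector<float> temp(numFrames);
        for (int32_t i = 0; i < numFrames; ++i) audioData[i] = 0.0f;
        for (auto *track : mTracks) {
            track->renderAudio(temp.data(), numFrames);
            for (int32_t i = 0; i < numFrames; ++i) audioData[i] += temp[i];
        }
    }
private:
    std::vector<IRenderableAudio *> mTracks;
};
```

Because `Mixer` itself implements `IRenderableAudio`, the audio callback only needs a single render call on the mixer to produce the combined output.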
Both the clap sound and backing tracks are represented by `Player` objects, which are then mixed together using a `Mixer`.
It is very important that the audio thread (which calls the `onAudioReady` method) is never blocked. Blocking can cause underruns and audio glitches. To avoid blocking we use a `LockFreeQueue` to share information between the audio thread and other threads. The following diagram shows how claps are enqueued by pushing the clap times (in milliseconds) onto the queue, then dequeued when the clap is played.
We also use atomics to ensure that threads see a consistent view of any shared primitives.
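To make the idea concrete, here is a minimal single-producer single-consumer queue built on atomics. This is an illustrative sketch, not the sample's `LockFreeQueue` implementation: the game thread pushes clap times while the audio thread pops them, and neither side ever blocks or allocates.

```cpp
#include <array>
#include <atomic>
#include <cstddef>
#include <cstdint>

// Minimal single-producer single-consumer lock-free ring buffer.
// Capacity must be a power of two.
template <typename T, size_t Capacity>
class SpscQueue {
public:
    // Called from the game/UI thread.
    bool push(const T &item) {
        size_t head = mHead.load(std::memory_order_relaxed);
        size_t next = (head + 1) & (Capacity - 1);
        if (next == mTail.load(std::memory_order_acquire)) return false; // full
        mBuffer[head] = item;
        mHead.store(next, std::memory_order_release);
        return true;
    }

    // Called from the audio thread; never blocks.
    bool pop(T &item) {
        size_t tail = mTail.load(std::memory_order_relaxed);
        if (tail == mHead.load(std::memory_order_acquire)) return false; // empty
        item = mBuffer[tail];
        mTail.store((tail + 1) & (Capacity - 1), std::memory_order_release);
        return true;
    }

private:
    std::array<T, Capacity> mBuffer{};
    std::atomic<size_t> mHead{0};
    std::atomic<size_t> mTail{0};
};
```

The acquire/release pairs ensure the consumer never reads a slot before the producer has finished writing it, which is what makes the queue safe without locks.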
When a tap event arrives on the UI thread it contains only the time (milliseconds since boot) at which the event occurred. We need to work out what the song position was when the tap occurred. To do this we keep track of the song position and the time it was last updated; both values are updated each time the `onAudioReady` method is called. This also enables us to keep the UI in sync with the audio timeline.
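The calculation itself is simple. This hypothetical helper (the names are illustrative, not the sample's) shows the idea: the song has advanced by however much wall-clock time has passed since the audio callback last updated the position.

```cpp
#include <cstdint>

// Hypothetical helper: convert a tap's event time into a song position.
// lastUpdateTimeMs and songPositionAtLastUpdateMs would be refreshed
// inside the audio callback each time it runs.
int64_t songPositionForTap(int64_t tapTimeMs,
                           int64_t lastUpdateTimeMs,
                           int64_t songPositionAtLastUpdateMs) {
    return songPositionAtLastUpdateMs + (tapTimeMs - lastUpdateTimeMs);
}
```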
Once we know when the user tapped in the song, we can calculate whether that tap was successful, i.e. whether it fell within an acceptable time range. This range is known as the "tap window". Once we know the result of the tap, the UI is updated with a color to give the user visual feedback. This is done in `getTapResult`.
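A hedged sketch of what such a check might look like; the enum values and window size here are assumptions, not the sample's actual constants:

```cpp
#include <cstdint>

enum class TapResult { Early, Late, Success };

// Assumed half-width of the tap window, in milliseconds.
constexpr int64_t kWindowHalfWidthMs = 100;

// Hypothetical version of getTapResult: classify a tap relative to the
// center of its tap window.
TapResult getTapResult(int64_t tapTimeMs, int64_t windowCenterMs) {
    if (tapTimeMs < windowCenterMs - kWindowHalfWidthMs) return TapResult::Early;
    if (tapTimeMs > windowCenterMs + kWindowHalfWidthMs) return TapResult::Late;
    return TapResult::Success;
}
```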
Note that once a tap has been received the tap window is removed from the queue - the user only gets one chance to get their tap right!
In order to reduce APK size this game uses MP3 files for its audio assets. These are extracted on game startup in `AAssetDataSource::newFromCompressedAsset`. A yellow screen is shown during this process.
By default the game uses `NDKExtractor` for asset extraction and decoding. Under the hood this uses the NDK Media APIs.
There are some limitations with this approach:
A faster, more versatile solution is to use FFmpeg. To do this, follow the instructions here and use the `ffmpegExtractor` build variant found in `app.gradle`. The extraction will then be done by `FFmpegExtractor`.