Thursday 18 November 2010

Make some noise! (Audio on iOS)

There are two main APIs on iOS for playing programmatically generated audio: audio units and audio queues. Audio units are a lower-level, lower-latency technology with a relatively complex API. Audio queues are higher-level and simpler to use, but have far greater latency.

I have produced a sample project showing the simplest creation of an audio unit and audio queue output so you can compare and contrast the two APIs. Get it on Gitorious here.

The following series of blog entries will describe each technology.

About the project

The project builds a sample iOS application that plays a steady sine wave as long as it is running. If you look at main.mm you'll see a single #define for USE_AUDIO_QUEUE_OUTPUT that can be used to switch between the audio unit and audio queue back ends.
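
The switch itself is just conditional compilation around the creation of the output back end. Roughly (the AudioOutput protocol and class names here are illustrative stand-ins, not necessarily the project's own):

#define USE_AUDIO_QUEUE_OUTPUT 1

// Pick the output back end at compile time.
#if USE_AUDIO_QUEUE_OUTPUT
id<AudioOutput> output = [[AudioQueueOutput alloc] init];
#else
id<AudioOutput> output = [[AudioUnitOutput alloc] init];
#endif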

There are extra things a well-behaved iOS audio app should do that are not (yet) illustrated by this example project. Notably, your app should use the Audio Session API to define what kind of audio session it requires and to handle interruptions to the audio output (e.g. when a phone call is received whilst your app is running).
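
For reference, setting up a session with the C-level Audio Session functions looks roughly like this (a minimal sketch, not part of the sample project):

#include <AudioToolbox/AudioToolbox.h>

// Called by the system when our audio is interrupted (e.g. by a phone
// call) and again when the interruption ends.
static void interruptionListener(void *inClientData, UInt32 inInterruptionState)
{
    if (inInterruptionState == kAudioSessionBeginInterruption)
    {
        // Stop or pause the audio output here.
    }
    else if (inInterruptionState == kAudioSessionEndInterruption)
    {
        // Reclaim the session and restart the audio output here.
        AudioSessionSetActive(true);
    }
}

static void setUpAudioSession(void)
{
    AudioSessionInitialize(NULL, NULL, interruptionListener, NULL);

    // Declare what kind of session we require: this app plays audio.
    UInt32 category = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);

    AudioSessionSetActive(true);
}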

Make some noise!

Before we work out how to make noise with our iOS devices, we need some audio to start with. For this reason, the AudioProducer protocol defines a simple interface from which the audio back ends can grab an audio stream.

It looks like this:
/// Type of audio sample produced by an AudioProducer
typedef SInt16 Sample;

/// Protocol for objects that produce audio
@protocol AudioProducer

@property (nonatomic) float sampleRate;

/// Fills a buffer with "size" samples.
/// The buffer should be filled in with interleaved stereo samples.
- (void) produceSamples:(Sample *)audioBuffer size:(size_t)size;

@end

That is, we'll be peddling signed 16-bit integer samples, and have a single method that can be called to grab the next block of interleaved stereo audio samples.

Given that, I have written a simple SineWave audio generator. It's not the most elegant generator, I'll admit. For performance reasons it uses a resonant filter to approximate a sine wave, rather than calling the maths library's trig functions.

The SineWave interface adopts the AudioProducer protocol, and adds two extra properties - the frequency of the sine wave and the peak (amplitude) of the wave. You can see the interface here and the implementation here.
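
To give a flavour of the technique, here is a minimal sketch of such a resonant-filter oscillator (illustrative only - the real SineWave implementation is in the repository). The recurrence y[n] = k*y[n-1] - y[n-2], with k = 2*cos(2*pi*frequency/sampleRate), generates a sine using one multiply per sample and no trig calls inside the audio loop:

#include <math.h>
#import <Foundation/Foundation.h>

@interface SimpleSine : NSObject <AudioProducer>
{
    double y1, y2;   // the oscillator's previous two output samples
    BOOL primed;
}
@property (nonatomic) float sampleRate;
@property (nonatomic) float frequency;
@property (nonatomic) float peak;   // assumed here to lie in 0..1
@end

@implementation SimpleSine

@synthesize sampleRate, frequency, peak;

- (void) produceSamples:(Sample *)audioBuffer size:(size_t)size
{
    const double w = 2.0 * M_PI * frequency / sampleRate;
    const double k = 2.0 * cos(w);   // the only trig calls: once per buffer

    if (!primed)
    {
        // Seed the recurrence so the first output sample is sin(w).
        y1 = 0.0;
        y2 = -sin(w);
        primed = YES;
    }

    for (size_t i = 0; i < size; i += 2)   // interleaved stereo pairs
    {
        const double y = k * y1 - y2;   // y[n] = k*y[n-1] - y[n-2]
        y2 = y1;
        y1 = y;

        const Sample s = (Sample)(y * peak * 32767.0);
        audioBuffer[i]     = s;   // left
        audioBuffer[i + 1] = s;   // right
    }
}

@end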

Atomicity

Note that since this is a simplistic example I have made all of these properties nonatomic. However, both audio units and audio queues will pull audio from your application in a background thread (not the main user interface thread).

The audio unit uses a very high priority background thread, as it is a very low latency audio pipeline with little buffering. The audio queue thread is not set as high-priority, as it employs a large amount of buffering.

You must bear these threads in mind when writing an audio generator. Ensure that any parameter that can be changed is thread-safe: if the UI is part-way through adjusting values when the audio thread cuts in, no disasters (e.g. nasty audio glitches) should result.

What this looks like in practice is different for each application. But this is an important warning to heed.
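
One common pattern (an illustration of the idea, not the project's code) is to have the UI thread write nothing but a single target value, and have the audio thread read it once per buffer and glide towards it, so a half-updated parameter can never reach the speaker:

#include <libkern/OSAtomic.h>

// Shared between the UI thread (writer) and the audio thread (reader).
// A single aligned 32-bit store/load is atomic on ARM, but note that
// this is a platform assumption, not a C language guarantee.
static volatile float targetFrequency = 440.0f;

// UI thread: publish a new value.
void setTargetFrequency(float f)
{
    targetFrequency = f;
    OSMemoryBarrier();   // make the write visible to the audio thread
}

// Audio thread: call once at the start of each buffer. Gliding towards
// the target, rather than jumping, also avoids audible zipper noise.
void updateFrequency(float *currentFrequency)
{
    const float target = targetFrequency;
    *currentFrequency += (target - *currentFrequency) * 0.1f;
}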

Next time

So now we have some audio to play, next time we'll look at how to use the audio queue APIs to play it.


Friday 12 November 2010

Writing: This Time I've Got It...

The November issue of ACCU's C Vu magazine is out now. It contains the latest instalment in my Becoming a Better Programmer column. This one's called This Time I've Got It... and is a software development parable. As the article strap-line puts it: Pete Goodliffe tells us a story of stress, short-sightedness, and solutions.

The magazine should have landed on ACCU members' doormats already. The PDF version is available for download on the website.

Tuesday 2 November 2010

Sending MIDI through CoreMIDI

The iOS 4.2 GM seed is out. CoreMIDI is coming to the masses. So I've updated my CoreMIDI example to show how to send MIDI out through the USB port. (I've had a lot of people ask me for this.)

Go to the GitHub project page and check out the update. It's a small change, but it shows the simplest way of getting MIDI data out of your application through CoreMIDI.
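
In outline, sending MIDI boils down to packing the bytes into a MIDIPacketList and handing it to MIDISend through an output port. A minimal sketch (not the project's exact code):

#include <CoreMIDI/CoreMIDI.h>

static void sendNoteOn(void)
{
    MIDIClientRef client;
    MIDIPortRef outputPort;
    MIDIClientCreate(CFSTR("MIDI demo"), NULL, NULL, &client);
    MIDIOutputPortCreate(client, CFSTR("Output"), &outputPort);

    // A note-on for middle C, velocity 100, on MIDI channel 1.
    const Byte noteOn[] = { 0x90, 60, 100 };

    Byte buffer[64];
    MIDIPacketList *packetList = (MIDIPacketList *)buffer;
    MIDIPacket *packet = MIDIPacketListInit(packetList);
    packet = MIDIPacketListAdd(packetList, sizeof(buffer), packet,
                               0 /* timestamp: now */,
                               sizeof(noteOn), noteOn);

    // Send the packet list to every attached destination.
    const ItemCount count = MIDIGetNumberOfDestinations();
    for (ItemCount i = 0; i < count; ++i)
        MIDISend(outputPort, MIDIGetDestination(i), packetList);
}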

Interestingly, the later versions of iOS 4.2 - including the GM - are far less forgiving of attached USB control devices. Most of the devices I test with are rejected by the iPad because they draw too much power. These are devices with LEDs, but they are certainly within the USB power specification. That's quite a shame.

As ever, I welcome feedback about this example project. Let me know if you find it useful.