Swift and Core Audio

Like many of you, I’ve been knee-deep in Swift this week. Once you get beyond the hello-world examples and try something a bit more complicated, you start to really learn the language. So, why not go off the deep end and try to work with what is essentially a C library: Core Audio? It turns out that’s a good way to get the types grokked.

First, don’t try to use Core Audio in a Swift playground. I wasted a day trying to do this. It doesn’t work yet. So, create a project.

Update: In Swift 2 you can create an AUGraph in a playground. You just need this at the top of your playground.
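(The snippet itself didn’t survive extraction; assuming the Xcode 7 era XCPlayground API, it would be along these lines:)

    import XCPlayground

    // Keep the playground process alive so the AUGraph has time to render audio.
    XCPlaygroundPage.currentPage.needsIndefiniteExecution = true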

I couldn’t do something as simple as this in the playground:
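(The original snippet is missing here; it was essentially just declaring and creating a graph, something like:)

    import AudioToolbox

    var processingGraph = AUGraph()
    var status = NewAUGraph(&processingGraph)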

So, OK, I put that in a Swift class as an instance variable and created it in the init function.
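(A minimal sketch; the class name here is my invention:)

    import AudioToolbox

    class ToneGenerator {
        var processingGraph: AUGraph

        init() {
            self.processingGraph = AUGraph()
            NewAUGraph(&self.processingGraph)
        }
    }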

Most Core Audio functions return a status code, which is defined as OSStatus. You need to specify the type on the var.
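(Reconstructed; the point is the explicit annotation:)

    // Without the explicit OSStatus annotation, later comparisons
    // against OSStatus return values won't type-check.
    var status: OSStatus = noErr
    status = NewAUGraph(&self.processingGraph)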

Or, if you want, you can cast noErr like this.
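(Again a reconstruction:)

    if status != OSStatus(noErr) {
        println("error creating graph")
    }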

Here’s an adventure with Boolean.

The function AUGraphIsInitialized is defined like this:
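(As imported into Swift at the time, give or take parameter labels:)

    func AUGraphIsInitialized(inGraph: AUGraph,
        outIsInitialized: UnsafeMutablePointer<Boolean>) -> OSStatus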

So, you call it like this:
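(Reconstructed:)

    var isInitialized: Boolean = 0
    status = AUGraphIsInitialized(self.processingGraph, &isInitialized)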

That works. But how do you check it?

Boolean is defined as a CUnsignedChar (in MacTypes.h).
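(In effect, what Swift sees:)

    // MacTypes.h: typedef unsigned char Boolean;
    // so in Swift, Boolean is effectively:
    typealias Boolean = UInt8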

So, you cannot do this:
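(Reconstructed; the exact error wording varied by Swift version:)

    // error: 'Boolean' (a UInt8) cannot be used as a condition
    if isInitialized {
        // ...
    }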

And you cannot cast it (could not find an overload…)
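(Along these lines:)

    // error: could not find an overload for 'init' that accepts the supplied arguments
    if Bool(isInitialized) {
        // ...
    }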

or with Swift’s “as”
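(Likewise reconstructed:)

    // error: 'Boolean' is not convertible to 'Bool'
    if isInitialized as Bool {
        // ...
    }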

I’m clearly overthinking this, because this is all that is needed. D’oh!
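(Presumably just treating it as the integer it is:)

    if isInitialized != 0 {
        // the graph is initialized
    }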

Update: In Swift 2, you use DarwinBoolean instead.
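(A sketch of the Swift 2 version:)

    var isInitialized: DarwinBoolean = false
    AUGraphIsInitialized(self.processingGraph, &isInitialized)
    if isInitialized {
        // DarwinBoolean works directly as a condition in Swift 2
    }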

Another problem I had was using constants such as kAudioUnitSubType_Sampler while trying to create an AudioComponentDescription. The trick was simply to cast them to OSType.
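(For example, a sampler description; the fields other than the sub type are my guesses at the obvious values:)

    var cd = AudioComponentDescription(
        componentType: OSType(kAudioUnitType_MusicDevice),
        componentSubType: OSType(kAudioUnitSubType_Sampler),
        componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
        componentFlags: 0,
        componentFlagsMask: 0)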

Resources

Here is my GitHub repository for this project. It’s a simple three-button iPhone app that plays sine tones (based on each button’s tag value).

11 thoughts on “Swift and Core Audio”

    1. Bob, send me the code for what you’re trying to do. Are you generating samples in code?
      You’d use the floatChannelData variable.
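      (Not Gene’s actual snippet; a hedged sketch of writing a sine wave into an AVAudioPCMBuffer via floatChannelData, with assumed names and parameters:)

          import AVFoundation

          let sampleRate = 44100.0
          let frequency = 440.0
          // One whole cycle per buffer avoids a discontinuity at the loop
          // point (a likely cause of the modulation effect described below).
          let frameCount = AVAudioFrameCount(sampleRate / frequency)
          let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)
          let buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: frameCount)
          buffer.frameLength = frameCount
          let samples = buffer.floatChannelData[0]
          for frame in 0..<Int(frameCount) {
              // exactly one cycle across the buffer, so it loops cleanly;
              // the resulting pitch is sampleRate / frameCount, close to 440 Hz
              samples[frame] = Float(sin(2.0 * M_PI * Double(frame) / Double(frameCount)))
          }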

      1. Hi Gene,

        Yeah, I’m trying to generate samples in code. I can produce sound, but when I try to change the frequency variable I get a modulation effect. The other strange thing is that when I change the frame count, the pitch changes. I’m guessing that has something to do with the buffer being looped.

        Thanks for any help.

        Bob

  1. Hi Gene, many thanks for posting this.

    I am translating an Objective-C application which, like you, I am using to kick Swift’s tyres.

    I had reached the complicated bit of calling the Audio library, and was a bit daunted. But your article has pointed the way!

    Cheers,
    Philip

    1. Most people would write another Weather or To Do app to learn a new language.
      Not us.
      We do the hardest thing on the platform. 🙂

      Glad to be of help

  2. Gene, thanks for the article. I think you’re representing a lot of people, like myself and others above, who take Core Audio seriously. I especially take it seriously because I’m using iOS to improve my hearing and also to “try” to mask my tinnitus (I’ve tried so many hearing aids; the industry, both manufacturers and audiologists, is knee-deep in earning huge gross margins, not in quality).

    Swift and the Swift Playground/REPL are much more conducive to researching new ideas, much as object-oriented environments like Smalltalk and Actor once were. Please add me to your mailing list if you continue to get deeper into the subject. I’m looking into trying to bridge STK (the Synthesis ToolKit).

    If you find any other resources or books to be published, I’d love to know. Thx.

  3. Thanks for this. A lot. I even thought I’d use that gittip button but it didn’t work… 🙂

    1. Here is an example:

  4. Hello Gene De Lisa!

    Do you know if it’s possible to play an MPMediaItem (Apple Music content) through AVAudioPlayer with an equalizer?
