Audio Units (AUv3) MIDI extension – Part 0: Getting Started


Move over AudioBus.
Move over Inter-App Audio (IAA).
AUv3 is muscling into your territory.

Introduction


“Just paint this. But we won’t tell you how.” – Apple

 

The Rozeta Sequencer Suite by Bram Bos is a collection of AUv3 MIDI extensions that almost every iOS musician or beat dropper uses. If you’d like to write one too, you’d probably go to Apple to search for the documentation and example code. But if you’ve been writing iOS audio code for more than a zeptosecond, you know just how ridiculous an assumption that is. Apple provides you with a WWDC video where this is touched upon for maybe a minute. The development fora at Apple? Buena Suerte, gringo.

So, I’ve just whacked away and found a few vague tidbits on the net until it became a bit more clear. It’s as clear as mud now, but I have a working example. If you have any improvements, corrections, or suggestions, I’d like to hear them. Really; I barely move the needle on the ego-o-meter.

An AUv3 audio unit is an app extension. You will create a “regular” app and then add an extension to it that will be the audio unit. The end user will run the regular app, which will install the audio unit(s). They will then use an AUv3 host to run your audio unit. I’ve tried AUM, BeatMaker 3, and Cubasis 2 during testing. This is the same procedure for the usual “audio” audio units.

When you create the extension, it will ask you what language to use. If you specify Swift, the generated UI code will be in Swift. The Audio Unit code, however, will be Objective-C. If you do not want to code in Objective-C, you can't write an audio unit right now. You've probably noticed the problems before if you've ever written a render callback in Core Audio using Swift. Maybe it will be possible after Swift 5, when the ABI is nailed down. Maybe.

How to


Create a regular iOS app. This is what will be published to the App Store. This is not the audio unit; you need it in order to add the extension.

Add a new target to the project. Choose the Audio Unit Extension. (Surprise!)

Here are the options.

Make up manufacturer and subtype codes. Each is exactly 4 ASCII characters.

For the audio unit type, choose anything. (Where's the any key? – Homer). There is an option for a Music Effect (aumf), but the type we actually want (MIDI Processor: aumi) is not presented. We'll have to change it later.

If you look at the Xcode template, you can see the types it currently offers for this option – the MIDI Processor type isn't among them. (Maybe they will update this in a later release.)

When prompted to activate the scheme, say yes.

Here’s the generated code for your new target/extension.

Since you asked for Swift, the user interface is Swift. The audio unit is Objective-C.

The first thing to change is the audio unit type. You do that in the extension's Info.plist – not the app's Info.plist.

We want the type to be “aumi”, which is a MIDI Processor: an extension that deals only in MIDI, with no audio.

Set the type to aumi.
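For reference, the relevant fragment of the extension's Info.plist ends up looking roughly like this. Only the type key matters here; leave the other keys (name, manufacturer, subtype, version, and so on) exactly as the template generated them.

```xml
<key>NSExtensionAttributes</key>
<dict>
    <key>AudioComponents</key>
    <array>
        <dict>
            <!-- name, manufacturer, subtype, version, etc. stay as generated -->
            <key>type</key>
            <string>aumi</string>
        </dict>
    </array>
</dict>
```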

Catching the busses


Groovy. Now open up HeyYouAudioUnit.m (or whatever you named your audio unit; mine is HeyYou – hey you/AU, get it?).

You will see a few warnings. You need to return two AUAudioUnitBusArrays – one for input and one for output. You will not be writing audio to these busses, but you need to provide them anyway.

I added five instance variables: asbd to cache an AudioStreamBasicDescription, plus _inputBus, _inputBusArray, _outputBus, and _outputBusArray. In the initWithComponentDescription method, I allocate them. I'm not using the asbd later in this example, but if I were playing a sequence, I'd want it.
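Roughly, the declarations and the init code look like this. This is a sketch trimmed to the bus-related parts of the generated file; the 44.1 kHz stereo format is just my placeholder choice, and the GitHub project has the full initializer.

```objc
// Likely already imported via the generated header; shown for completeness.
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>

@implementation HeyYouAudioUnit {
    AudioStreamBasicDescription asbd;
    AUAudioUnitBus *_inputBus;
    AUAudioUnitBusArray *_inputBusArray;
    AUAudioUnitBus *_outputBus;
    AUAudioUnitBusArray *_outputBusArray;
}

- (instancetype)initWithComponentDescription:(AudioComponentDescription)componentDescription
                                     options:(AudioComponentInstantiationOptions)options
                                       error:(NSError **)outError {
    self = [super initWithComponentDescription:componentDescription options:options error:outError];
    if (self == nil) { return nil; }

    // (The template's parameter tree setup would also live here; omitted.)

    // A plain stereo format. We never write audio to these busses, but the host wants them.
    AVAudioFormat *format = [[AVAudioFormat alloc] initWithStandardFormatSampleRate:44100.0 channels:2];
    asbd = *format.streamDescription;

    _inputBus  = [[AUAudioUnitBus alloc] initWithFormat:format error:nil];
    _outputBus = [[AUAudioUnitBus alloc] initWithFormat:format error:nil];

    _inputBusArray  = [[AUAudioUnitBusArray alloc] initWithAudioUnit:self
                                                             busType:AUAudioUnitBusTypeInput
                                                              busses:@[_inputBus]];
    _outputBusArray = [[AUAudioUnitBusArray alloc] initWithAudioUnit:self
                                                             busType:AUAudioUnitBusTypeOutput
                                                              busses:@[_outputBus]];
    return self;
}
// ... the rest of the class (accessors, render block) follows below.
```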

So now in the accessor methods, I do this.
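In this sketch the overrides simply hand back the arrays built in the initializer:

```objc
// The host asks for these; return the arrays we allocated in init.
- (AUAudioUnitBusArray *)inputBusses {
    return _inputBusArray;
}

- (AUAudioUnitBusArray *)outputBusses {
    return _outputBusArray;
}
```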

Caching important blocks from the host


Now, add 3 more instance variables. You will not set these in the initWithComponentDescription method.
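Something like this – the _outputEventBlock name comes up again below; the other two names are just what I'd call them, so pick your own:

```objc
// Added to the ivar block shown earlier. These are handed to us by the host.
AUHostMusicalContextBlock _musicalContextBlock;   // tempo, beat position, etc.
AUHostTransportStateBlock _transportStateBlock;   // is the transport playing?
AUMIDIOutputEventBlock    _outputEventBlock;      // how we send MIDI back to the host
```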

Now, in the allocateRenderResourcesAndReturnError method, you will be handed these blocks by the AU host. Or not – if the host doesn't provide one, you'll simply get nil. And then go use a better host.
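A sketch, assuming the ivar names above and the standard AUAudioUnit properties (musicalContextBlock, transportStateBlock, MIDIOutputEventBlock):

```objc
- (BOOL)allocateRenderResourcesAndReturnError:(NSError **)outError {
    if (![super allocateRenderResourcesAndReturnError:outError]) {
        return NO;
    }

    // The host fills in these AUAudioUnit properties. Any it doesn't support come back as nil.
    _musicalContextBlock = self.musicalContextBlock;
    _transportStateBlock = self.transportStateBlock;
    _outputEventBlock    = self.MIDIOutputEventBlock;

    return YES;
}
```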

In deallocateRenderResources, set them to nil.
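And the teardown, mirroring the allocation above:

```objc
- (void)deallocateRenderResources {
    _musicalContextBlock = nil;
    _transportStateBlock = nil;
    _outputEventBlock    = nil;
    [super deallocateRenderResources];
}
```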

Now create an array of MIDI output names. I’m just writing to one port, so I’m returning one value in the array.
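That's the MIDIOutputNames property. A minimal version looks like this; the name is the one the host will show in its routing UI later.

```objc
// One virtual MIDI output port.
- (NSArray<NSString *> *)MIDIOutputNames {
    return @[@"HeyYouMIDIOut"];
}
```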

The Render Block


Now we get to the action – AUInternalRenderBlock! The method is stubbed out for you. Inside the returned block, we can play around with the MIDI events, which are passed in via the realtimeEventListHead parameter. The block is called for other purposes too, so you need to check each event's type to see whether it's MIDI data this time.

Now since it’s a “list” of events, iterate through them. The AUMIDIEvent is contained in the render event. It’s convenient to break out the individual data here.

OK, do stuff. Like what? Well, change the data and send it back or add data.
I’ll do something obvious so you can hear it. I’ll add a note event that is pitched a major third above the note handed to me by the host. So you’ll hear a dyad.

Wait, I said “send it back”. How?

Remember that _outputEventBlock you cached? Put the bum to work now. Hand it a time, a cable (a virtual port – just use 0), the number of MIDI bytes you're sending, and the bytes themselves.
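Putting the last two steps together, the MIDI branch of the loop above might look like this. cachedMIDIOutput is a local copy of _outputEventBlock, captured outside the block (more on that capture in a moment); passing the original note through as well is what makes it a dyad.

```objc
// Inside the AURenderEventMIDI branch from the loop above.
if (cachedMIDIOutput != NULL && (status == 0x90 || status == 0x80)) {
    // Pass the original note through...
    cachedMIDIOutput(midiEvent.eventSampleTime, 0 /* cable */, midiEvent.length, midiEvent.data);

    // ...and add one a major third (4 semitones) above it. That's the dyad.
    // (No clamping at 127 here – it's a sketch.)
    uint8_t third[3] = { (uint8_t)(status | channel), (uint8_t)(note + 4), velocity };
    cachedMIDIOutput(midiEvent.eventSampleTime, 0 /* cable */, 3, third);
}
```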

That’s almost it. In the real example on Github, I capture _outputEventBlock and use that in the block – just as you would if you were doing audio (Apple’s demo captures the C++ DSP kernel class for example.) We’re not doing audio, but the render block is called at audio rate, so just apply the same rules as if it were audio. The rules? Don’t use self, don’t call Objective-C, don’t call Swift, no IO. Calling C++ is OK. (Actually, C++ is everywhere in iOS audio programming, so you might want to brush up on it.) Essentially, nothing that may block.

You can run the app on your device, but there's nothing to see – it just registers your extension(s). It's easier to choose your plugin's scheme and run that on the device instead; that way you won't have to attach to the extension process by hand. You will be prompted for a host app. Here is what it looks like in AUM:

I have other plugins, but you can see HeyYou as a MIDI Processor.

After loading HeyYou, I loaded a synth – in this case Kauldron, but use whatever you have – and set its MIDI source to HeyYou. Here you can see the MIDI output name we set: “HeyYouMIDIOut”.

Then I showed the AUM keyboard, set its destination to HeyYou via the wrench icon, mashed the keys, and heard glorious dyads.

What about those other blocks we cached? Well, for what we’re doing here, we don’t need them. But if we were playing a series of MIDI messages, we’d want to know the tempo and when the transport changed (e.g. the user pressed the play triangle button in AUM).

That’s left as an exercise for the reader 🙂
(it will actually be the blog post after the next – in which I’ll cover parameters.)

Summary


So now you’re receiving and sending MIDI data from an AU host. Aren’t you a bad ass now. Cool.

We did nothing with the UI. What if you wanted to let the user set the interval? How do you set parameters? You probably saw the stubbed-out parameter tree code. That's how. I'll do that in the next blog post.

Resources


GitHub Project

Bram Bos on AUv3 MIDI extensions.

Chris Adamson’s AUv3 Brain Dump. This is for audio, not MIDI. Useful nevertheless.

Apple – App extensions in General

Apple – Audio Unit Extensions

Apple’s Nascita di Venere

5 thoughts on “Audio Units (AUv3) MIDI extension – Part 0: Getting Started”

  1. I’m curious whether you’ve looked into this again recently? We’re thinking about integrating AUv3-MIDI in our app, but it’s already a standalone Swift app so I’m not clear how to proceed. I followed the first steps of this tutorial, but noticed that it did, in fact, give me a Swift AudioUnit file (i.e., not obj-c). So things have obviously changed. Do you have any further info on the current state of affairs? As I mentioned, we want to add AUv3-MIDI support to an existing app. Any help on how to go about achieving that would be immensely helpful.

    1. Xcode 11 introduced new templates. The Xcode 10 (and earlier) templates used Objective-C++.
      There was really no reason that most of the Audio Unit couldn’t be in Swift, so they fixed that.
      The real-time frobs, i.e. the render block, are still C++. You can't call C++ from Swift, so that's why there's that “adapter” class.

      AUv3 Audio Units are app extensions that run in a host – AUM, GarbageBand, etc. If you want to run an AU in a standalone app, then you will need to act like a host and provide the necessary hooks – the musicalContextBlock, for example. This is quite a bit of work, but not impossible.
