Swift 2 AVAudioSequencer

There’s a brand new MIDI sequencer class in the Swift 2 beta! It’s the AVAudioSequencer.

Introduction

At WWDC15 there was a presentation titled “What’s New in Core Audio”. If you were able to get past the first 29 minutes of a poorly structured presentation delivered by a robotic, mumbling developer who just reads the lines on the slides (just like 90% of other WWDC presentations), you heard about this. But then, just like every other WWDC presentation, the code snippets were incomplete.

So can we get this to work?

Sequencer setup

You can play a MIDI file with the old AVMIDIPlayer class. I published a post on this back in the stone age of Swift.

Here is the old Swift 1.2 code to create one. (Swift 2 replaces the old NSError cha cha with throwing initializers.)
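
Updated for Swift 2, a minimal sketch looks like this. (The MIDI file and SoundFont names are placeholders; swap in your own resources.)

import AVFoundation

var midiPlayer: AVMIDIPlayer?

func createMIDIPlayer() {
    // "sibeliusGMajor" and "GeneralUser" are placeholder resource names
    guard let midiURL = NSBundle.mainBundle().URLForResource("sibeliusGMajor", withExtension: "mid"),
        bankURL = NSBundle.mainBundle().URLForResource("GeneralUser", withExtension: "sf2")
        else { return }
    do {
        // Swift 2: the NSError-pointer initializer is now a throwing initializer
        midiPlayer = try AVMIDIPlayer(contentsOfURL: midiURL, soundBankURL: bankURL)
        midiPlayer?.prepareToPlay()
    } catch {
        print("could not create MIDI player: \(error)")
    }
}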

Swift 2 now has a new AVAudioSequencer class.
Woo Hoo!

Ok, let’s make an AVAudioSequencer!
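
Something like this sketch. (The engine and sampler come from the next section; the file name is a placeholder.)

// create the sequencer attached to the engine
let sequencer = AVAudioSequencer(audioEngine: engine)

// load a standard MIDI file from the bundle, preserving its tracks
if let fileURL = NSBundle.mainBundle().URLForResource("sibeliusGMajor", withExtension: "mid") {
    do {
        try sequencer.loadFromURL(fileURL, options: .SMF_PreserveTracks)
    } catch {
        print("could not load the MIDI file: \(error)")
    }
}

// and start it
sequencer.prepareToPlay()
do {
    try sequencer.start()
} catch {
    print("could not start the sequencer: \(error)")
}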

I’ll talk about the AVAudioEngine setup next.

So, I load a standard MIDI file that I created in Sibelius, tell the sequencer to read it, then start the sequencer. The API doesn’t look too bad at this point.

AVAudioEngine setup

Let’s create the engine. According to the presentation, there doesn’t need to be much more than a sampler in the engine and the totally groovy new AVAudioSequencer will find it.
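
In code, that’s supposedly no more than this sketch:

// a sampler attached to the engine and connected to the main mixer;
// the mixer is implicitly connected to the output node
let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()

engine.attachNode(sampler)
engine.connect(sampler, to: engine.mainMixerNode, format: nil)

do {
    try engine.start()
} catch {
    print("could not start the engine: \(error)")
}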

That’s all – according to the AudioEngine’er’s presentation.
Good to go, right?
Wrong.
That should be it. But it’s not.
What do you get?

The ‘rioc’ is the outputNode. See it?
See that the mixer, ‘mcmx’, is an input to it?
See that the sampler, ‘samp’, is connected to the mixer?
See that the formats are all the same?
The processing graph looks ok. Right?

But then….

So, “required condition is false: outputNode”.
BTW, not to be a grammar nazi, but where is the predicate in that sentence? outputNode what? It’s nil? It’s not there? It’s drunk? outputNode what?

I see the node in the graph. So, what’s the problem?

I have no idea. There is no place to look either.

I’ve tried loading the soundbank – or not. (There’s a sketch of that variation after this list.)

I’ve set up the session for playback – or not.
I’ve tried with the engine running – or not.
I’ve tried with different MIDI files.
I’ve tried just connecting the sampler to the outputNode. No luck. Shouldn’t have to do that anyway.
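
For reference, the soundbank variation was along these lines. (A sketch; “GeneralUser” is a placeholder bank name.)

import AudioToolbox // for the kAUSampler_* constants

// load a SoundFont instrument into the sampler; program 0 is piano in GM banks
if let bankURL = NSBundle.mainBundle().URLForResource("GeneralUser", withExtension: "sf2") {
    do {
        try sampler.loadSoundBankInstrumentAtURL(bankURL,
            program: 0,
            bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
            bankLSB: UInt8(kAUSampler_DefaultBankLSB))
    } catch {
        print("could not load the sound bank: \(error)")
    }
}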

Bleah.

AVMusicTrack

Table of Contents

Ok, let’s try the spiffy new AVMusicTrack which is so full of grooviosity that we can retire the old worn out MusicTrack from the AudioToolbox.

Right. No such luck.

I see it defined right there in AVAudioSequencer.h.
Yes, the frameworks are in the project.

Show stopper.

I see that AVAudioSequencer’s definition is preceded by an availability macro, but AVMusicTrack doesn’t have one.
Is that the problem?
I’m guessing; I don’t have a way to try it.

So, once that’s fixed, is there an easy API to add/remove all kinds of MIDI events? You know, channel messages, sysex, controllers, metatext, etc.?

Nope.

Nothing like that.
Mute, solo, length, set a destination, looping.
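
That’s about the entire surface. A sketch of those per-track knobs, assuming the sequencer has loaded a file and the sampler from earlier:

for track in sequencer.tracks {
    track.muted = false
    track.soloed = false
    track.destinationAudioUnit = sampler // set a destination
    track.loopRange = AVBeatRange(start: 0, length: 4)
    track.loopingEnabled = true
    print("track length \(track.lengthInBeats) beats")
}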

Sigh.

Update

In beta 3 it finally stopped crashing. The GitHub project runs.
The API for AVMusicTrack is still a waste of time though.

Summary

So now we can do away with the AudioToolbox MusicSequence?
Nope.

So now we can do away with the AudioToolbox MusicTrack?
Nope.

So now we can connect to an AVAudioEngine without messing around with AudioUnits?
Nope.

So can we code a simple hello world with this API?
Nope.

Now that’s what I call progress.

20 thoughts on “Swift 2 AVAudioSequencer”

  1. Hi Gene, a few months ago I wrapped in Swift the four AudioToolbox classes necessary for MIDI sequencing. Here’s a link to it on GitHub.

    https://github.com/thomjordan/MidiToolbox

    It doesn’t yet include a way to connect to MIDI devices and endpoints. For this I currently use a version of the VVMIDI framework to which I made some minor changes. It may still work directly with the current version of VVMIDI from vvopensource:

    https://github.com/mrRay/vvopensource

    If you or anyone else tries it and it doesn’t work, let me know by replying here and I can publish a fork of VVMIDI with my changes. I basically just added a timestamp field to the VVMIDIMessage class, and modified a few calls that use the class.

    I’ve considered additionally wrapping a few classes from CoreMIDI that should provide the MIDI connection functionality, and adding that to MidiToolbox. It’s been low on my to-do list, since I’m currently using an approach that works. I might end up adding it if it looks like Apple is not going to do something like it first. I thought this was the case when I found this blog post, but it doesn’t really surprise me that it doesn’t yet work right.

    1. BTW I just checked the CoreMIDI docs again now, and there’s some minor Swift info there, but seemingly not as complete as what’s usually included for the majority of the Cocoa API. There may not be a need to truly “wrap” any CoreMIDI functionality in Swift, although a more straightforward approach using supplemental code could be useful, especially as part of “MidiToolbox”. It remains to be seen… hopefully soon none of this will be needed.

  2. Hi,

    I’m in the process of writing a MIDI-based music game, and your blog entries helped me quite a few times when scouring the net for tips on dealing with the CoreAudio/CoreMIDI and AudioToolbox frameworks and their related errors (ugh).
    So, as you did, when I discovered the new AVAudioSequencer/Engine and AVMusicTrack promises, I decided to try them out… And it didn’t work.
    I first got the same error as you got: AVAudioEngineGraph.mm:3649: GetDefaultMusicDevice: required condition is false: outputNode.
    Then while rearranging my class I got a second one when trying to attach an AVAudioUnitSampler to the engine, which confused me even more: AVAudioEngine.mm:275: AttachNode: required condition is false: !nodeimpl->HasEngineImpl().

    I finally managed to make the whole thing work and even play midi files with a SoundBank. But only by declaring the engine, sequencer and any related functions at global level.

    If you have any insights on why this works, I’d be delighted to know!

    Thanks!

    1. The problem I had was fixed in a later beta. Does updating Xcode fix your problem?
      If not, let me see your code. Maybe I can spot the problem.

  3. Howdy Gene
    Really appreciate your sample code and head-first dive into this new technology that doesn’t seem so well documented yet.

    I found (and fixed) a crash you were having.
    When you do…
    track.destinationAudioUnit = self.sampler // this crashes
    …that’s only because you started the sequence already. If you do the sequence start AFTER the track setup/diagnosis loop, it works fine.
    Also, I found this is a helpful thing to set on the tracks:
    track.loopRange = AVBeatRange(start: 0, length: 4)
    …or whatever length you want. Right now you’ve got some interesting polyrhythms going on ;]

    Thanks again.

    1. Also, this crash:

      // this crashes
      //print("track timeResolution \(track.timeResolution)")

      …seems to be because you can only do this on the tempo track:

      let tempo = sequencer.tempoTrack
      print("tempo lengthInBeats \(tempo.lengthInBeats)")
      print("tempo timeResolution \(tempo.timeResolution)")

    2. Sorry for the delay in approving your comment. I was cleaning up damage (and hardening the site) due to a crack attack. I don’t know why people think this is funny – it just wastes the time of small-time guys like me.

      Thanks for finding the destinationAudioUnit problem. Most of the time it’s an Apple (lack of) documentation problem, and sometimes it’s a PBCAC 🙂

  4. Have you seen any more improvements in AVAudioSequencer? Is there any way to control playback tempo?

    Thanks.

    1. You can set the rate property.
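
      For example, something like:

      sequencer.rate = 0.5 // half speed; 1.0 plays the file as written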

  5. Is there a way to clear the sequence and load a different MIDI file with AVAudioSequencer? For instance, having a user play different MIDI files with AVAudioSequencer?

    Thanks.

  6. Hi Gene,

    Thanks so much for your blog and for sharing – it really has been so useful when trying to work with Core Audio. I have bought you some well-earned pet food!

    One question for you if you have time – I think I know the answer. Is there any way yet to manipulate a MIDI sequence running in an AVAudioSequencer once it is playing? Say I have a four-bar looped MIDI track running, which I can create, load, and loop fine. Once I start the sequencer, is that it? Or is there any method by which, while it is playing, I can delete or add a note?

    Thanks!

    1. AVAudioSequencer really doesn’t have this. I’ve filed a Radar and have spoken to the guy responsible (we’ve been friends for 30 years). If someone else files a Radar, it will be moved “up the queue”.

      I’ve pretty much just done this via the old AudioToolbox MusicSequence. Clunky, but it works.
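
      A bare-bones sketch of that fallback (AudioToolbox C API, Swift 2 calling conventions):

      import AudioToolbox

      // create a sequence with one track
      var sequence: MusicSequence = nil
      NewMusicSequence(&sequence)
      var track: MusicTrack = nil
      MusicSequenceNewTrack(sequence, &track)

      // add a middle C at beat 0, lasting one beat
      var note = MIDINoteMessage(channel: 0, note: 60, velocity: 64,
          releaseVelocity: 0, duration: 1.0)
      MusicTrackNewMIDINoteEvent(track, 0.0, &note)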

  7. Hmm – well, looking at the source, I can see that the AudioKit guys seem to have worked out a way, so feel free not to release my comments!

    1. AudioKit has a lot of my MIDI code – but not with AVAudioSequencer. Unless someone added that? What code are you using?

      1. Yes, you are right – it’s actually MusicSequence at work; I was mistaken. I’ve got what I needed going with that now, thank you, and will file a Radar. It’s quite a mess, isn’t it?!

  8. Hi Gene. Thanks for these posts – they are still a huge help, even several years later.

    I’m also experiencing the “required condition is false: outputNode” error when trying to produce something similar; however, mine is happening when performing a segue back to any other view controller, apparently during the deinit process. Is this something you’ve encountered before?

    Thank you.
