Swift: AUGraph and MusicSequence


The AudioToolbox MusicSequence remains the only way to create a MIDI Sequence programmatically. The AVFoundation class AVMIDIPlayer will play a MIDI file, but not a MusicSequence.

AVAudioEngine has a musicSequence property. It doesn’t seem to do anything yet (except crash when you set it). So the way to get a MusicSequence to play with instrument sounds is to create a low-level Core Audio AUGraph and play the sequence with a MusicPlayer.

Introduction

Apple is moving toward a higher-level audio API with AVFoundation. AVAudioEngine looks promising, but it is incomplete. Right now there isn’t a way to associate an AudioToolbox MusicSequence with it, so here I’ll use a low-level Core Audio AUGraph for the sounds.


Create a MusicSequence

Let’s start by creating a MusicSequence with a MusicTrack that contains several MIDINoteMessages.
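Roughly like this (a sketch in current Swift; the function name is mine). It creates the sequence, adds one track, and fills it with quarter notes:

```swift
import AudioToolbox

// Sketch: a MusicSequence with one MusicTrack holding a short run of notes.
func createMusicSequence() -> MusicSequence {
    var sequence: MusicSequence?
    var status = NewMusicSequence(&sequence)
    guard status == noErr, let musicSequence = sequence else {
        fatalError("could not create MusicSequence: \(status)")
    }

    var track: MusicTrack?
    status = MusicSequenceNewTrack(musicSequence, &track)
    guard status == noErr, let musicTrack = track else {
        fatalError("could not add MusicTrack: \(status)")
    }

    // Quarter notes on channel 0, velocity 64, one beat apart.
    var beat: MusicTimeStamp = 0
    for note: UInt8 in 60...67 {
        var message = MIDINoteMessage(channel: 0,
                                      note: note,
                                      velocity: 64,
                                      releaseVelocity: 0,
                                      duration: 1.0)
        status = MusicTrackNewMIDINoteEvent(musicTrack, beat, &message)
        if status != noErr {
            print("error adding note \(note): \(status)")
        }
        beat += 1
    }
    return musicSequence
}
```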


MusicPlayer create

Now you need a MusicPlayer to hear it. Let’s make one and give it our MusicSequence.
Here, I “pre-roll” the player for fast startup when you hit a play button. You don’t have to do this,
but it makes playback start promptly.
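A sketch of those steps (function name is mine), assuming the sequence from the previous section:

```swift
import AudioToolbox

// Sketch: create a MusicPlayer, hand it the sequence, and pre-roll it
// so playback starts immediately when the user taps play.
func createPlayer(with sequence: MusicSequence) -> MusicPlayer {
    var player: MusicPlayer?
    var status = NewMusicPlayer(&player)
    guard status == noErr, let musicPlayer = player else {
        fatalError("could not create MusicPlayer: \(status)")
    }
    status = MusicPlayerSetSequence(musicPlayer, sequence)
    if status != noErr {
        print("error setting sequence: \(status)")
    }
    // Optional, but it avoids a lag on the first play.
    status = MusicPlayerPreroll(musicPlayer)
    if status != noErr {
        print("error pre-rolling player: \(status)")
    }
    return musicPlayer
}
```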


Playing a MusicSequence

Finally, you tell the player to play like this – probably from an IBAction.
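Something like this. In the sketch below, `play(_:)` is a free function taking the player; in a view controller it would presumably be the IBAction body, with the player held as an instance property:

```swift
import AudioToolbox

// Sketch: stop if already playing, rewind, then start.
func play(_ player: MusicPlayer) {
    var isPlaying = DarwinBoolean(false)
    MusicPlayerIsPlaying(player, &isPlaying)
    if isPlaying.boolValue {
        MusicPlayerStop(player)
    }
    // Rewind so repeated taps start from the beginning.
    MusicPlayerSetTime(player, 0)
    let status = MusicPlayerStart(player)
    if status != noErr {
        print("error starting player: \(status)")
    }
}
```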

Wonderful sine waves! What if you want to hear something that approximates actual instruments?

Well, you can load SoundFont or DLS banks – or even individual sound files. Here, I’ll load a SoundFont.
Load it into what? Well, here I’ll load it into a Core Audio sampler – an AudioUnit. That means I’ll need to create a Core Audio AUGraph.

The end of the story is this: you associate an AUGraph with the MusicSequence.
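The association is a single call, MusicSequenceSetAUGraph; wrapped in a helper here (the helper name is mine) so the snippet stands alone:

```swift
import AudioToolbox

// Sketch: route a MusicSequence through an AUGraph so its tracks are
// rendered by the graph's sampler instead of the default synth.
func attach(_ graph: AUGraph, to sequence: MusicSequence) {
    let status = MusicSequenceSetAUGraph(sequence, graph)
    if status != noErr {
        print("error setting AUGraph: \(status)")
    }
}
```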


Create an AUGraph

Great. So how do you make an AUGraph? If you want a bit more detail, look at my blog post on it using Objective-C. Here, I’ll just outline the steps.

Create the AUGraph with NewAUGraph. It is useful to define it as an instance variable.
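A sketch, assuming current Swift. In a real app `processingGraph` would be that instance variable; it is a local here so the snippet is self-contained:

```swift
import AudioToolbox

// Sketch: create the (empty) processing graph.
var processingGraph: AUGraph?
let status = NewAUGraph(&processingGraph)
guard status == noErr, let graph = processingGraph else {
    fatalError("could not create AUGraph: \(status)")
}
```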


Create sampler

To create the sampler and add it to the graph, you need to create an AudioComponentDescription.
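For the Apple AUSampler, the description looks like this (a sketch; the function name is mine):

```swift
import AudioToolbox

// Sketch: describe Apple's AUSampler and add a node for it to the graph.
func addSamplerNode(to graph: AUGraph) -> AUNode {
    var description = AudioComponentDescription(
        componentType: kAudioUnitType_MusicDevice,
        componentSubType: kAudioUnitSubType_Sampler,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)
    var samplerNode = AUNode()
    let status = AUGraphAddNode(graph, &description, &samplerNode)
    if status != noErr {
        print("error adding sampler node: \(status)")
    }
    return samplerNode
}
```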


Create IO node

Create an output node in the same manner.
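A sketch (function name mine). RemoteIO is the iOS output unit; the macOS equivalent is the default output unit, handled here with a platform check:

```swift
import AudioToolbox

// Sketch: add the hardware output node to the graph.
func addOutputNode(to graph: AUGraph) -> AUNode {
    #if os(iOS)
    let outputSubType = kAudioUnitSubType_RemoteIO
    #else
    let outputSubType = kAudioUnitSubType_DefaultOutput
    #endif
    var description = AudioComponentDescription(
        componentType: kAudioUnitType_Output,
        componentSubType: outputSubType,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)
    var ioNode = AUNode()
    let status = AUGraphAddNode(graph, &description, &ioNode)
    if status != noErr {
        print("error adding output node: \(status)")
    }
    return ioNode
}
```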


Obtain Audio Units

Now to wire the nodes together and init the AudioUnits. The graph needs to be open, so we do that first.
Then I obtain references to the audio units with the function AUGraphNodeInfo.
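A sketch of those two steps (the function name is mine):

```swift
import AudioToolbox

// Sketch: open the graph, then pull the AudioUnit out of each node.
func obtainUnits(graph: AUGraph, samplerNode: AUNode, ioNode: AUNode)
    -> (sampler: AudioUnit, io: AudioUnit) {
    var status = AUGraphOpen(graph)
    if status != noErr {
        print("error opening graph: \(status)")
    }
    var samplerUnit: AudioUnit?
    status = AUGraphNodeInfo(graph, samplerNode, nil, &samplerUnit)
    if status != noErr {
        print("error getting sampler unit: \(status)")
    }
    var ioUnit: AudioUnit?
    status = AUGraphNodeInfo(graph, ioNode, nil, &ioUnit)
    if status != noErr {
        print("error getting io unit: \(status)")
    }
    guard let sampler = samplerUnit, let io = ioUnit else {
        fatalError("could not obtain audio units")
    }
    return (sampler, io)
}
```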


Wiring

Now wire them using AUGraphConnectNodeInput.
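A sketch of the connection (function name mine); the graph should already be open:

```swift
import AudioToolbox

// Sketch: connect the sampler's output bus 0 to the output node's input bus 0.
func wire(samplerNode: AUNode, to ioNode: AUNode, in graph: AUGraph) {
    let status = AUGraphConnectNodeInput(graph, samplerNode, 0, ioNode, 0)
    if status != noErr {
        print("error connecting nodes: \(status)")
    }
}
```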


Starting the AUGraph

Now you can initialize and start the graph.
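A sketch (function name mine); the guards make it safe to call more than once:

```swift
import AudioToolbox

// Sketch: initialize the graph if needed, then start it if it isn't running.
func startGraph(_ graph: AUGraph) {
    var isInitialized = DarwinBoolean(false)
    AUGraphIsInitialized(graph, &isInitialized)
    if !isInitialized.boolValue {
        let status = AUGraphInitialize(graph)
        if status != noErr {
            print("error initializing graph: \(status)")
        }
    }
    var isRunning = DarwinBoolean(false)
    AUGraphIsRunning(graph, &isRunning)
    if !isRunning.boolValue {
        let status = AUGraphStart(graph)
        if status != noErr {
            print("error starting graph: \(status)")
        }
    }
}
```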


SoundFont

Go ahead and play your MusicSequence now. Crap. Sine waves again. Well yeah, we didn’t load any sounds!

Let’s create a function to load a SoundFont, then use a “preset” from that font on the sampler unit. You need to fill out an AUSamplerInstrumentData struct. One thing that may trip you up is fileURL, which is an Unmanaged CFURL. Well, NSURL is automatically toll-free bridged to CFURL. Cool. But it is not Unmanaged, which is what is required. So, here I’m using Unmanaged.passUnretained. If you know a better way, please let me know.

Then we need to set the kAUSamplerProperty_LoadInstrument on our samplerUnit. You do that with AudioUnitSetProperty. The preset numbers are General MIDI patch numbers. In the Github repo, I created a Dictionary of patches for ease of use and an example Picker.
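Putting both paragraphs together, a sketch (the function name is mine, and the .sf2 file name is whatever font you ship in your bundle):

```swift
import AudioToolbox
import Foundation

// Sketch: load a preset from a bundled SoundFont into the sampler unit.
// The Unmanaged.passUnretained dance is needed because fileURL is an
// Unmanaged<CFURL>.
@discardableResult
func loadSoundFont(named name: String, preset: UInt8, into samplerUnit: AudioUnit) -> Bool {
    guard let bankURL = Bundle.main.url(forResource: name, withExtension: "sf2") else {
        print("could not find \(name).sf2 in the bundle")
        return false
    }
    var instrumentData = AUSamplerInstrumentData(
        fileURL: Unmanaged.passUnretained(bankURL as CFURL),
        instrumentType: UInt8(kInstrumentType_SF2Preset),
        bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
        bankLSB: UInt8(kAUSampler_DefaultBankLSB),
        presetID: preset)  // preset is a General MIDI patch number
    let status = AudioUnitSetProperty(
        samplerUnit,
        kAUSamplerProperty_LoadInstrument,
        kAudioUnitScope_Global,
        0,
        &instrumentData,
        UInt32(MemoryLayout<AUSamplerInstrumentData>.size))
    if status != noErr {
        print("error loading instrument: \(status)")
    }
    return status == noErr
}
```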


Summary

You can create a Core Audio AUGraph, attach it to a MusicSequence, and play it.


29 thoughts on “Swift: AUGraph and MusicSequence”

  1. Hi,
    Great post; the only problem I have is that the code plays a sine wave.
    The project works fine as a standalone, but when I copy it into my project, I only hear a sine wave.
    It appears the AUGraph is not being used.
    Could there be something I haven’t copied over from the original files?
    Thank you
    Z

  2. Me Again.
    Hi Gene De Lisa,
    I have a MIDI music sequence created programmatically; how would I go about saving or exporting
    the MIDI file to an iDevice? For example as a ringtone, preferably with the particular instrument I have selected.
    Any hints or advice is of course very much appreciated.
    Z

  3. I’m finding the following is still the case:

    AVAudioEngine has a musicSequence property. It doesn’t seem to do anything yet (except crash when you set it)

    have you found otherwise yet?

  4. Thank you for the excellent and very helpful guide. I really appreciate it.

    I have a question.

    I’m using an audio file for the instrument and wanted to know if there is a way to change the start time of that instrument. I can currently change the end time by modifying the duration of the midi note message, but I can’t find a way to start playing the audio file, say, 5 seconds from beginning of the file.

    I understand also that every event with this instrument will have this same start time (unless it’s possible to do it on an event basis).

    I found another of your guides about trimming a sound file, which could also be an option, but that seems like a roundabout way.

    Appreciate any advice
    Thanks again!

    1. Might be more trouble than it’s worth (but that’s up to you to judge) but play around with aupresets via AU Lab.
      Here is some info.
      Let me know if that works for you.

      1. Thank you so much for the quick reply.

        I checked out aupresets and it appears it’s really only useful for assigning audio files to ranges of pitches for one instrument. So nothing there to change the start time. I consider that the answer is to use the trim guide. Which I may have questions for you on lol.

        Thanks again!

        1. Oh gee, sorry. Worth a shot. Thanks for checking it out.
          I guess you’ll just have to fire up Audacity and trim them.

          1. Just wanted to come back and add my current solution for this.
            I really appreciate your guides because the documentation on this stuff is not very helpful at times.

            So I ended up using a mix of AUGraph/MusicPlayer and AVAudioEngine

            In AVAudioEngine you can hook up an AVAudioPlayerNode which has a method:


            scheduleSegment:startingFrame:frameCount:atTime:completionHandler:

            here you can provide a starting frame.

            But as your guide points out, I can’t play a sequence with AVAudioEngine, so I still use AudioToolbox to play the sequence. When the MIDI notes want to play, though, if I’m using a single audio file, I use AVAudioEngine instead to play said file.

            This way the user can adjust on the fly without having to export an entire new file as well as use the single file multiple times with different settings (start times for example).

            Hope that makes sense and/or helps someone. =)

  5. I’m not sure if I am the only one experiencing this, but MusicSequences that loop fine in iOS 8 seems to stop looping on iOS 9 (I have several MusicSequence projects, but I’ve also tried out this sample code). Have you noticed this also?

    1. The Github project for this post has a slider that controls the loop – even though I didn’t talk about looping here. I just tried it and it seems to work.

      If you have your code in a public repo someplace send me the link and I’ll take a look.

      1. Hey Gene,

        I have a similar issue as Rosano. As of iOS 9 my MIDI app stopped working. In iOS 7 and 8 it worked great. I spent all day today trying to figure out what’s going on. At this point, I’m convinced that it’s an iOS 9 bug. And since you said you don’t experience it, I’m wondering if you could run this code and see if you are experiencing it too:

        https://gist.github.com/Nikolozi/d001c11533e689587809

        Basically, if I run this code as is, iOS will start playing it using its internal synth (since the MIDI endpoint is not set); I hear a few notes playing with crackling sounds and then it stops. It doesn’t play all the notes or loop (which it should). If I add the endpoint code back in and run it while a synth app is in background audio mode, again it will play a few notes and then the synth gets stuck holding down multiple notes at once (quite different from the programmed sequence).

        I remember very similar bug in 64-bit iPhones back in 2013, but it was resolved soon after: http://lists.apple.com/archives/coreaudio-api/2013/Oct/msg00012.html

        I’m worried that this bug is back. The issue is happening in iOS 9.1 beta 2 also.

        If you have a chance to try it out, that’d be great.

        Cheers
        Niko

          1. oh man. That’s exactly the bug. I’m on the CA mailing list as well; I wonder how I missed that.

            Sadly, the bug doesn’t seem to be fixed on iOS 9.0.1 or 9.2b2.

            To answer your question. I ran your code, in 3 different modes. 1) As is, 2) with no endpoint (i.e. sine wave) 3) Core MIDI sending messages to a background synth app. In all 3 cases they don’t loop and usually get stuck on one of the notes.

            I’ve tested it on my iPhone 6, iPad Air 2 and 5th gen iPod Touch. All have the same issue.

            It’s interesting that you are not seeing the issue on 4s. I wonder what the difference is between that and iPod Touch 5th gen. They use the same CPU.

            I’ll try to post a reply to that CA mailing list post to see if Apple will respond with an update on the issue. As much as I’m up for a challenge of implementing the MIDI sequence player myself I don’t have much time atm.

            And thanks for your prompt response.

          2. Just updated to 9.0.1 and tried it again. I get only 2 notes played. Then a freeze.
            The good news is that’s the same thing you’re getting. The bad news is that it confirms the bug.

            fwiw., The 8.4 simulator works.
            The 9.0 simulator works a bit better than the devices, but freezes when you set the loop point.

            I’ll ask Doug what the deal is.

          3. Okay, I filed a bug report and sent them a video and source code (before I saw your archive link..) – I put it on Dropbox if you still want to check it out. My issue happened on iPhone 5s with 9.0.1, as well as all simulators running iOS 9.

            Hopefully this will be gone in 9.1!

  6. Hi Gene,

    Thank you so much for the comprehensive tutorial! I’ve successfully created a sequence using my own array of notes/velocities, etc. via MIDINoteMessage. I was wondering how I can embed some controller messages (like CC1 Modulation, CC7 Volume) into the sequence? Is there any structure in AudioToolbox that can let me do it easily?

    Thanks again.
    Rex

    1. I think I’ve sort of found the answer. What I’m doing now is using MIDIChannelMessage to write the controller data into the MusicTrack; here’s the code I’ve written to write in a CC1 controller message.
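A sketch of what the commenter describes (the function name is mine): a MIDIChannelMessage with status 0xB0 (Control Change, channel 0) and controller number 1 (modulation):

```swift
import AudioToolbox

// Sketch: write a CC1 (modulation) event into a MusicTrack.
func addModulation(to track: MusicTrack, beat: MusicTimeStamp, value: UInt8) {
    var message = MIDIChannelMessage(status: 0xB0, data1: 1, data2: value, reserved: 0)
    let status = MusicTrackNewMIDIChannelEvent(track, beat, &message)
    if status != noErr {
        print("error adding CC1 event: \(status)")
    }
}
```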

      My problem seems solved, but is it the best way to do it?
      Thanks!

  7. How do you disable the default Synth? I am using the MidiOutput callback to send the note data somewhere else, but I can’t seem to disable the default synth sound that is being generated from the musicplayer(?)

    1. The line that connects the MusicSequence to the AUGraph is the MusicSequenceSetAUGraph call.

  8. When I do a NoteOn to channel 9 (MIDI Channel 10), I’m expecting a drum sound. However, a melodic sound plays. What can I do to make drum notes play?

    1. Did you send MIDIChannelMessages (bank msb and lsb) to select the bank, then a program change (0xC0 + channel) to select the patch?
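A sketch of what that reply suggests (the function name is mine): bank select MSB and LSB, then a program change, all on channel 9, written as MIDIChannelMessages:

```swift
import AudioToolbox

// Sketch: select the percussion bank on MIDI channel 10 (index 9),
// then send a program change to pick the kit.
func selectDrumKit(on track: MusicTrack, channel: UInt8 = 9) {
    // Bank Select MSB (CC 0) = percussion bank.
    var msb = MIDIChannelMessage(status: 0xB0 | channel, data1: 0,
                                 data2: UInt8(kAUSampler_DefaultPercussionBankMSB), reserved: 0)
    MusicTrackNewMIDIChannelEvent(track, 0, &msb)
    // Bank Select LSB (CC 32).
    var lsb = MIDIChannelMessage(status: 0xB0 | channel, data1: 32,
                                 data2: UInt8(kAUSampler_DefaultBankLSB), reserved: 0)
    MusicTrackNewMIDIChannelEvent(track, 0, &lsb)
    // Program change (0xC0 | channel), patch 0.
    var pc = MIDIChannelMessage(status: 0xC0 | channel, data1: 0, data2: 0, reserved: 0)
    MusicTrackNewMIDIChannelEvent(track, 0, &pc)
}
```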

  9. Hi, thanks for the tutorial; it has been very useful given the incomplete and messy Apple documentation.

    I successfully create sequences and use the sampler to put an instrument over the MIDI, but all channels sound with the same instrument. I don’t know if it is possible to give each channel in the MIDI a different sound.

    Thanks in advance
