Swift and AVMIDIPlayer

How to play MIDI data via the AVFoundation AVMIDIPlayer.

Introduction

Previously, I wrote about attaching a low-level Core Audio AUGraph to a MusicSequence in order to hear something besides sine waves when it's played via a MusicPlayer. Here, I'll show you how to use the newer, higher-level AVMIDIPlayer. You can even play a MusicSequence with it, although the way you do it amounts to sticking your elbow in your ear.

Playing a MIDI File

Preparing an AVMIDIPlayer to play a standard MIDI file with a SoundFont or DLS file is fairly straightforward. Get both NSURLs from your bundle, then pass them into the init function.
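
Here's a minimal sketch in the Swift syntax of the time; the file names "night.mid" and "GeneralUser.sf2" are placeholders for whatever is actually in your bundle:

    import AVFoundation

    // both URLs come from the app bundle
    let contents = NSBundle.mainBundle().URLForResource("night", withExtension: "mid")!
    let soundbank = NSBundle.mainBundle().URLForResource("GeneralUser", withExtension: "sf2")!

    var error: NSError?
    let player = AVMIDIPlayer(contentsOfURL: contents, soundBankURL: soundbank, error: &error)
    if player == nil {
        println("could not create MIDI player: \(error)")
    }
    player.prepareToPlay()
    player.play(nil) // nil completion; see the note below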

Note that I'm passing nil to the play function. It expects a completion handler, but it crashes if you pass in either a function or a closure, so my workaround is to pass nil.

Play your MIDI file in the simulator, and you'll hear sine waves. Huh? A valid SoundFont was handed to the init function, and you still hear sine waves? Yeah. After you spend a day verifying that your code is correct, install iOS 8 on your actual device and try it there. Yup, it works. Nice.

P.S. That slider thing is just some eye candy in the final project: a UISlider moves while the file plays.

Playing NSData from a file

AVMIDIPlayer also has an init function that takes an NSData instance instead of an NSURL. So, as a simple first step, let's try creating an NSData object from the same URL.
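
A sketch, reusing the two bundle URLs from the previous example:

    // read the standard MIDI file into NSData, then hand the bytes to the player
    let data = NSData(contentsOfURL: contents)!

    var error: NSError?
    let player = AVMIDIPlayer(data: data, soundBankURL: soundbank, error: &error)
    if player == nil {
        println("could not create MIDI player: \(error)")
    }
    player.prepareToPlay()
    player.play(nil)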

Not surprisingly, that works. But why would you want to do this?

Playing a MusicSequence

The hoary, grizzled MusicSequence from the AudioToolbox is still the only way to create a MIDI sequence on the fly. If you have an app where the user taps in notes, for example, you can store them in a MusicSequence. But AVMIDIPlayer has no init function that takes a MusicSequence; our choices are an NSURL or NSData.
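
For instance, here's a sketch that builds a one-note sequence on the fly (error checking on the OSStatus results is omitted):

    import AudioToolbox

    // create an empty sequence with one track
    var musicSequence = MusicSequence()
    NewMusicSequence(&musicSequence)

    var track = MusicTrack()
    MusicSequenceNewTrack(musicSequence, &track)

    // add a middle C lasting one beat at beat 0
    var note = MIDINoteMessage(channel: 0, note: 60, velocity: 64, releaseVelocity: 0, duration: 1.0)
    MusicTrackNewMIDINoteEvent(track, 0.0, &note)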

An NSURL doesn't make sense here, but what about NSData? Can you turn a MusicSequence into NSData? Well, there's MusicSequenceFileCreateData(). You pass this function a data variable, and it is initialized with the bytes that would be written to a standard MIDI file. You can then use that NSData in the player code from the previous example.
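
Sketched out in the Swift of the time (480 is the resolution in ticks per quarter note; the Unmanaged dance is discussed below):

    func sequenceToData(musicSequence: MusicSequence) -> NSData? {
        var data: Unmanaged<CFData>?
        let status = MusicSequenceFileCreateData(musicSequence,
            MusicSequenceFileTypeID(kMusicSequenceFile_MIDIType),
            MusicSequenceFileFlags(kMusicSequenceFileFlags_EraseFile),
            480, &data)
        if status != OSStatus(noErr) {
            println("error turning MusicSequence into NSData")
            return nil
        }
        return data!.takeUnretainedValue() as NSData
    }

You can then pass the result to the AVMIDIPlayer(data:soundBankURL:error:) init exactly as above.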

I haven’t checked to see if there is a memory leak with the takeUnretainedValue call. I’ll check that out next.

update: I checked, and there is indeed a small memory leak.
The docs for MusicSequenceFileCreateData say that the caller is responsible for releasing the CFData, and takeUnretainedValue doesn't consume that unbalanced retain, so the release is still on me. I tried saving the data variable as an ivar, checking for nil when playing again, then calling release(). Crash. What about DisposeMusicSequence? OK, I tried saving the sequence as an ivar and calling that. No crash, but the memory still leaks. And CFRelease is simply unavailable from Swift.

What do you think? Advice?

Swift 2 update

Here is the updated syntax for creating an AVMIDIPlayer from a MusicSequence.
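
Roughly like this (a sketch: the imported enum cases now use dot syntax, and the failable-with-NSError init has become a throwing one):

    func createAVMIDIPlayer(musicSequence: MusicSequence, soundbank: NSURL) -> AVMIDIPlayer? {
        var data: Unmanaged<CFData>?
        let status = MusicSequenceFileCreateData(musicSequence,
            MusicSequenceFileTypeID.MIDIType,
            MusicSequenceFileFlags.EraseFile,
            480, &data)
        if status != noErr {
            print("error turning MusicSequence into NSData")
            return nil
        }
        let midiData = data!.takeUnretainedValue() as NSData
        do {
            let player = try AVMIDIPlayer(data: midiData, soundBankURL: soundbank)
            player.prepareToPlay()
            return player
        } catch let error as NSError {
            print("could not create AVMIDIPlayer: \(error.localizedDescription)")
            return nil
        }
    }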

And the new syntax for creating one from a MIDI file.
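
Something like this, with the same placeholder bundle URLs as before:

    do {
        let player = try AVMIDIPlayer(contentsOfURL: contents, soundBankURL: soundbank)
        player.prepareToPlay()
    } catch let error as NSError {
        print("could not create MIDI player: \(error.localizedDescription)")
    }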

And the completion handler now works.
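
So you can finally do something when the player finishes, for example:

    player.play {
        print("done playing")
    }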

Summary

So you can play a MusicSequence with sounds via an AVMIDIPlayer. You just need to know the secret handshake.

15 thoughts on “Swift and AVMIDIPlayer”

  1. Hi Gene,

    Thanks for this post. Were you able to load the sound bank file? I’ve tried both sf2 and dls with no effect.

    Phil

  2. Re-read the post and tried it on the device :-/ I had audio on the sim, but on the device the sound font is used. Thanks!

  3. Hi,

    func takeUnretainedValue() -> T
    Get the value of this unmanaged reference as a managed reference without consuming an unbalanced retain of it.

    This is useful when a function returns an unmanaged reference and you know that you’re not responsible for releasing the result.

    func takeRetainedValue() -> T
    Get the value of this unmanaged reference as a managed reference and consume an unbalanced retain of it.

    This is useful when a function returns an unmanaged reference and you know that you’re responsible for releasing the result.

    [source: http://swiftdoc.org/type/Unmanaged/]

    Good Luck

  4. Hi Gene,

    Great post! Thanks for sharing the knowledge you've acquired. As I understand it, AVMIDIPlayer works with a single SoundFont file. What would you use if you wanted each MIDI file to have its own SoundFont sound? Would you employ multiple AVMIDIPlayers and set them to play at the same time?

  5. Hi Gene,

    I wrote an app/game to teach my high school music production students how to play chords, bass lines, etc., so I'm familiar with how to use CoreMIDI. Now I'm trying to write an app to teach them how to match rhythmic patterns in drums (and other instruments), somewhat like Rock Band, if you're familiar with the franchise.

    For the new app, I've seen some code that demonstrates how to parse a MIDI file. Let's say I have a single track in my MIDI sequence and I want the user to play it correctly in time. Would you know what the general idea behind this is? Would I need some sort of timer to figure out when notes are to be played, based on the parsed data? I know the MIDI file can be played; I just need to make sure the user is playing notes at the correct time.

  6. Hi Gene, thanks so much for this blog.

    I've been trying to rewrite seqToData in Swift 3 to use Data instead of NSData, with no luck.
    Did you succeed?

    1. This works in Swift 3. Not much of a difference.

      func sequenceData(_ musicSequence: MusicSequence) -> Data? {
          log.debug("creating data for sequence \(musicSequence)")
          CAShow(UnsafeMutableRawPointer(musicSequence))

          var data: Unmanaged<CFData>?
          let status = MusicSequenceFileCreateData(musicSequence,
                                                   MusicSequenceFileTypeID.midiType,
                                                   MusicSequenceFileFlags.eraseFile,
                                                   480, &data)
          if status != noErr {
              log.error("error turning MusicSequence into Data")
              AudioUtils.checkError(status)
              return nil
          }

          // take the value, then balance the +1 retain the docs say the caller owns
          let ns: Data = data!.takeUnretainedValue() as Data
          data?.release()
          return ns
      }

  7. Hello! I followed your instructions with an AVMIDIPlayer created from a sequence, but there was no way to set or change the instrument. I only hear sine sounds.

    I did send
    MIDIChannelMessage(status: 0xB0, data1: 0, data2: 0, reserved: 0)
    MIDIChannelMessage(status: 0xB0, data1: UInt8(32), data2: 0, reserved: 0)
    chanmess = MIDIChannelMessage(status: 0xC0, data1: UInt8(46), data2: 0, reserved: 0)
    but without any success.

  8. Hi, it's me again!

    My problem is setting a second (and third) instrument on the next track in a sequence.
    The second instrument does not sound like it should.

    Is data1 the right place to set the (0xB0) MSB/LSB and the (0xC0) program of a sound bank instrument?
    Why does the first instrument sound good (as does the second if I swap it to the first track), but not the second?

    I can't find a solution in the GitHub code examples.

    here is my code:
    // set MSB to 0x00
    var status = UInt8(0xB0 + trackNum)
    var inMessage = MIDIChannelMessage(status: status, data1: UInt8(instrument.MSB), data2: 0x00, reserved: 0)
    MusicTrackNewMIDIChannelEvent(musicTrack!, 0, &inMessage)

    // set LSB to 0x20
    status = UInt8(0xB0 + trackNum)
    inMessage = MIDIChannelMessage(status: status, data1: UInt8(instrument.LSB), data2: 0x20, reserved: 0)
    MusicTrackNewMIDIChannelEvent(musicTrack!, 0, &inMessage)

    // change program
    status = UInt8(0xC0 + trackNum)
    inMessage = MIDIChannelMessage(status: status, data1: UInt8(instrument.program), data2: 0, reserved: 0)
    MusicTrackNewMIDIChannelEvent(musicTrack!, 0, &inMessage)
