Swift 2 and CoreMIDI

Swift 1.2 is notorious for its lack of support for C function pointers, which you need to deal with C APIs such as Core MIDI. I’ve complained^H^H^H^H^H written about this several times.

Introduction

Well, praise Zeus, Thor, and the FSM, in Swift 2 beta, Core MIDI is easier to use without function pointers or trampolines.

MIDI Client

The first step in using CoreMIDI is to create a MIDIClient reference. To do that, you use MIDIClientCreate(), which takes, among other parameters, a function pointer. That is the problem: Swift 1.2 barfs on C function pointers.

The solution in Swift 2 (beta) is the introduction of MIDIClientCreateWithBlock(), which takes a closure of type MIDINotifyBlock. This is quite similar to the Objective-C trampoline (but without the Objective-C). Take a look at the naming in my old trampoline code and tell me I’m not a mind reader 🙂
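A sketch of the call (the client name is a placeholder):

```swift
import CoreMIDI

// A sketch; "MyMIDIClient" is a placeholder name.
// MIDINotifyBlock replaces the old MIDINotifyProc function pointer.
let notifyBlock: MIDINotifyBlock = { notificationPtr in
    // notificationPtr is an UnsafePointer<MIDINotification>
    print("MIDI notification, messageID raw value: \(notificationPtr.memory.messageID.rawValue)")
}

var midiClient = MIDIClientRef()
let status = MIDIClientCreateWithBlock("MyMIDIClient", &midiClient, notifyBlock)
```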

Groovy.
Well, sort of.
What are you going to do with that MIDINotification?

The different types of notifications are distinct, unrelated structures.

You cannot downcast these via as?

Previously, I simply used unsafeBitCast. That no longer works directly. You need to go through the underlying pointer, with something like withUnsafePointer, like this:
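A sketch of the idea (the function name is mine). Note that copying the MIDINotification struct by value copies only its small header, leaving the remaining fields invalid (a commenter below ran into exactly this in Swift 2.1.1), so this sketch converts the pointer type and reads through it instead:

```swift
import CoreMIDI

// A sketch (the function name is mine): extract the full
// MIDIObjectAddRemoveNotification from a generic notification pointer.
func addRemoveDetails(message: UnsafePointer<MIDINotification>) -> MIDIObjectAddRemoveNotification? {
    switch message.memory.messageID {
    case .MsgObjectAdded, .MsgObjectRemoved:
        // Reinterpret the pointer; a by-value copy of MIDINotification
        // would truncate the parent/child fields.
        return UnsafePointer<MIDIObjectAddRemoveNotification>(message).memory
    default:
        return nil
    }
}
```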

Yeah, I know.
Gross.
But it works.

By the way, MIDIClientCreate() has been updated too, as have the definitions of the callbacks.

Input Port

Similarly, MIDIInputPortCreate(), which also takes a function pointer (to a read proc), has been upgraded to MIDIInputPortCreateWithBlock(), which takes a MIDIReadBlock.

And the read block:
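A sketch putting the two together (the client and port names are placeholders; the client creation is just scaffolding so the example is self-contained):

```swift
import CoreMIDI

// Scaffolding: a client to hang the port off of ("MyMIDIClient" is a placeholder).
var midiClient = MIDIClientRef()
MIDIClientCreateWithBlock("MyMIDIClient", &midiClient, nil)

// The MIDIReadBlock replaces the old MIDIReadProc function pointer.
let readBlock: MIDIReadBlock = { (packetList, srcConnRefCon) in
    print("received \(packetList.memory.numPackets) packet(s)")
}

var inputPort = MIDIPortRef()
let status = MIDIInputPortCreateWithBlock(midiClient, "MyClient Input Port", &inputPort, readBlock)
```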

As you can see, iterating through the packets hasn’t changed much. Swift does have MIDIPacketNext() now, though. It was MIA previously.

Personally, I think that if MIDIPacketList were a SequenceType it would be much easier. You could do this:
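Something along these lines (hypothetical; this does not compile, since MIDIPacketList is not a SequenceType):

```swift
// Wishful thinking: if MIDIPacketList conformed to SequenceType,
// you could simply iterate over it.
for packet in packetList.memory {
    print("timestamp \(packet.timeStamp), length \(packet.length)")
}
```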

Dream on.

Instead, you have to do some pointer nonsense. And the packets are a tuple, not an array. This example is one way to navigate the list. If you have something more elegant, let me know.
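A sketch of one way to do it (the function name is mine; it assumes the 4-byte-packed layout of MIDIPacketList, where the first packet sits right after the UInt32 numPackets field):

```swift
import CoreMIDI

// A sketch (the function name is mine). MIDIPacketNext() needs the address
// of the packet *inside* the list, so take a pointer to the first packet
// (at offset 4, right after numPackets) rather than copying packets out.
func packetBytes(packetList: UnsafePointer<MIDIPacketList>) -> [[UInt8]] {
    var result = [[UInt8]]()
    var packet = UnsafeMutablePointer<MIDIPacket>(UnsafePointer<UInt8>(packetList) + 4)
    for _ in 0 ..< Int(packetList.memory.numPackets) {
        let length = Int(packet.memory.length)
        // packet.memory.data is a 256-element UInt8 tuple, not an array
        let bytes = withUnsafePointer(&packet.memory.data) {
            Array(UnsafeBufferPointer(start: UnsafePointer<UInt8>($0), count: length))
        }
        result.append(bytes)
        packet = MIDIPacketNext(packet)
    }
    return result
}
```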

MIDIInputPortCreate() and the read function have also been updated in Swift 2 beta.

Virtual Destination

To create a Virtual Destination, there is now MIDIDestinationCreateWithBlock() which also takes a MIDIReadBlock.
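A sketch (client and destination names are placeholders; the client creation is just scaffolding):

```swift
import CoreMIDI

// Scaffolding: a client to own the endpoint ("MyMIDIClient" is a placeholder).
var midiClient = MIDIClientRef()
MIDIClientCreateWithBlock("MyMIDIClient", &midiClient, nil)

// The virtual destination's read block receives whatever other apps send to it.
var virtualDest = MIDIEndpointRef()
let status = MIDIDestinationCreateWithBlock(midiClient, "MyVirtualDest", &virtualDest) { (packetList, srcConnRefCon) in
    print("virtual destination received \(packetList.memory.numPackets) packet(s)")
}
```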

Remember that to create virtual destinations you have to edit your Info.plist and add the key “Required Background Modes” with the value set to audio (it will say “App plays audio or streams audio/video using AirPlay”).

Summary

Several new CoreMIDI functions and types have been added in the Swift 2 beta. Here they are:

  • MIDIClientCreateWithBlock()
  • MIDIInputPortCreateWithBlock()
  • MIDIDestinationCreateWithBlock()
  • MIDIReadBlock
  • MIDINotifyBlock
  • MIDIPacketNext()

As always, there is my complete working example project on GitHub.

Have fun.

Resources

21 thoughts on “Swift 2 and CoreMIDI”

  1. Hi Gene,

    thank you for the informative articles regarding MIDI and Swift. I am very new to OS X programming and Swift, and was going through your website trying to put together the pieces of the puzzle. I wanted to see if I could write a little playground test to read input from my MIDI keyboard. The code I have written below, however, seems to have trouble connecting the source to the input port (MIDIPortConnectSource returns status -50).

    I also noticed that the “*WithBlock” functions are only available in OS X 10.11, I was not able to use them until I installed the beta.

    Below is my short playground, if you or anyone else has any suggestions they would be appreciated.

  2. …there was only one device plugged in; I should have known this…

    changing the hardcoded source from 1 to 0 solved the problem:
    var midiEndpoint = MIDIGetSource(0)

    I am now able to receive packets.

    thanks again for all your great articles

    1. In real life you’d use something like NSPopUpButtons to present the user with a menu of sources/destinations to connect.

      Glad you got your example working!

  3. Gene, many thanks for your notes and project. Very helpful. One thing I found: if you have a complex MIDI message coming back in the read block (say, sysex) and don’t want the tuples, you can do:
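[A guess at what this comment describes (the code itself was lost; the function name is mine): copy a packet’s 256-byte data tuple into a plain [UInt8] so a long message such as sysex is easy to slice and inspect.]

```swift
import CoreMIDI

// A sketch (the function name is mine): copy a packet's data tuple
// into a plain [UInt8], up to the packet's stated length.
func bytesFromPacket(inout packet: MIDIPacket) -> [UInt8] {
    let count = Int(packet.length)
    return withUnsafePointer(&packet.data) {
        Array(UnsafeBufferPointer(start: UnsafePointer<UInt8>($0), count: count))
    }
}
```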

  4. Hi Gene,

    I was wondering if you have been able to parse out the values of the data in MIDIMetaEvent? I am only able to get the numerator of a time signature, but I desperately need the rest of the data for a project I am working on. It says that data is of the type (UInt8). As best I can tell, that means it’s supposed to be a tuple, but nothing I do seems to let me access the other values. The dataLength is correct and returns ’04’, suggesting the four bytes of a time signature event. Additionally, CAShow displays the proper time signature.

    1. Wouldn’t that be nice! MIDIMetaEvent is a mess with that tuple nonsense.
      You can use Mirror on the data, iterate over it, and copy it into an array.
      But what a PITA. Apple needs to update this.

      1. By the way, if you happen to know where I can find an example of how this is done, it would be very helpful. I have noticed that it is also an issue with getting the instrument and track names.

  5. Hi Gene,
    This is a pretty fantastic post for me, and now my app can receive notes from my keyboard (via a USB camera adapter) and everything seems fine.
    However, now I want to send notes to my keyboard from my iPad. Is that possible? My keyboard has its own speaker, so it can play the sound by itself.
    Any information will be appreciated.
    Thanks!

    1. Thanks, I read the official documentation a few times and found the solution.
      Finally I created a new MIDI client and used
      func MIDISend(port: MIDIPortRef, _ dest: MIDIEndpointRef, _ pktlist: UnsafePointer<MIDIPacketList>) -> OSStatus
      to send data to my destination port.
      Everything looks good now~
      Best wishes
      By Cater
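[A sketch of what this reply describes (all names are placeholders; it assumes at least one MIDI destination is connected): create an output port and send a note-on to the first destination.]

```swift
import CoreMIDI

// Scaffolding: client and output port (names are placeholders).
var midiClient = MIDIClientRef()
MIDIClientCreateWithBlock("MySendClient", &midiClient, nil)

var outputPort = MIDIPortRef()
MIDIOutputPortCreate(midiClient, "MyOutput Port", &outputPort)

// Build a single three-byte note-on packet.
var packet = MIDIPacket()
packet.timeStamp = 0
packet.length = 3
packet.data.0 = 0x90  // note on, channel 1
packet.data.1 = 60    // middle C
packet.data.2 = 100   // velocity
var packetList = MIDIPacketList(numPackets: 1, packet: packet)

let dest = MIDIGetDestination(0)  // assumes at least one destination exists
if dest != 0 {
    MIDISend(outputPort, dest, &packetList)
}
```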

  6. Hey Gene, thanks for the great posts.
    I am running Swift 2.1.1 and I tried out the UnsafePointer code you have above for casting MIDINotification to MIDIObjectAddRemoveNotification. While there were no errors, the fields beyond the messageID were invalid.

    The code below works in Swift 2.1.1 …

    MIDIClientCreate("MidiTestClient", TestMIDINotifyProc, nil, &midiClient)


    func TestMIDINotifyProc(message: UnsafePointer<MIDINotification>, refCon: UnsafeMutablePointer<Void>) -> Void
    {
        let notification: MIDINotification = message.memory

        if (notification.messageID == .MsgObjectAdded || notification.messageID == .MsgObjectRemoved)
        {
            let msgPtr: UnsafePointer<MIDIObjectAddRemoveNotification> = UnsafePointer(message)
            let changeMsg: MIDIObjectAddRemoveNotification = msgPtr.memory
        }
    }

    1. Sorry I did put code tags around the code but the template characters were stripped anyway.
      All of the UnsafePointer code in the example should have had the types in it, i.e. MIDINotification and MIDIObjectAddRemoveNotification.

      Regards,
      Matt

      1. Gotta love WordPress. Maybe the pre tag is the way to go.

        Let me try it:

          1. OK here it is with pre tags

            Thanks again.

  7. I wrote a simple MIDI app that is also supposed to capture MIDI while running in the background.
    However, just adding the “audio” key to background modes in Info.plist doesn’t seem to do it in iOS 11.
    When I put the app into background by pressing the Home button, the midiReadProc callback is not invoked anymore when I send MIDI.
    I also tried to start an AVAudioSession and start an AVAudioEngine with a dummy AVAudioUnitSampler but no luck.
    Any helpful hints on this?
