Swift 1.2 beta2 and CoreMIDI


Someone is actually working on CoreMIDI with Swift.

The Problem

Prior to Swift 1.2, various CoreMIDI types defined in MIDIServices.h were architecture dependent. Like this:

#if __LP64__
        typedef UInt32 MIDIObjectRef;
        typedef MIDIObjectRef MIDIClientRef;
        typedef MIDIObjectRef MIDIPortRef;
        typedef MIDIObjectRef MIDIDeviceRef;
        typedef MIDIObjectRef MIDIEntityRef;
        typedef MIDIObjectRef MIDIEndpointRef;
#else
        typedef void *                                  MIDIObjectRef;
        typedef struct OpaqueMIDIClient *               MIDIClientRef;
        typedef struct OpaqueMIDIPort *                 MIDIPortRef;
        typedef struct OpaqueMIDIDevice *               MIDIDeviceRef;
        typedef struct OpaqueMIDIEntity *               MIDIEntityRef;
        typedef struct OpaqueMIDIEndpoint *             MIDIEndpointRef;
#endif

This meant that you had to deal with nonsense like this:

#if __LP64__
var client = MIDIClientRef()
status = MIDIClientCreate(s,
      MIDINotifyProc( COpaquePointer( [ MyMIDINotifyProc ] ) ),
      nil,
      &client)
 
#else
      //FIXME: abandon all hope, ye who enter here
#endif

I blogged about the problem with creating a MIDI client ref in a 32-bit architecture. (Hint: don’t try).

Now in Swift 1.2 beta 2 we get this:

typealias MIDIObjectRef = UInt32
typealias MIDIClientRef = MIDIObjectRef
typealias MIDIPortRef = MIDIObjectRef
typealias MIDIDeviceRef = MIDIObjectRef
typealias MIDIEntityRef = MIDIObjectRef
typealias MIDIEndpointRef = MIDIObjectRef
typealias MIDITimeStamp = UInt64

Those Opaque structs are now history. Yay.
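
For example, the client-creation code from my earlier post now compiles for both architectures without the #if dance. A minimal sketch (reusing the notify-proc trick from before; MyMIDINotifyProc is the callback from my MIDIClientCreate post):

var s:CFString = "MyClient"
var client = MIDIClientRef() // just a UInt32 now, on every architecture
var status = MIDIClientCreate(s,
    MIDINotifyProc( COpaquePointer( [ MyMIDINotifyProc ] ) ),
    nil,
    &client)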

But…

Let’s look at the MIDI Services reference.

Here’s a gem. Note the return type.

func MIDIGetNumberOfSources() -> ItemCount

You would use this like so:

var sourceCount = MIDIGetNumberOfSources()
for var i:ItemCount = 0; i < sourceCount; i++ {
    var endpoint = MIDIGetSource(i)
    // etc.
}

ItemCount is defined in MacTypes. Or it was until now. Right now there is a comment for it but no definition.
D’oh!

Workaround: go ahead and define it yourself as it was pre-Swift 1.2.

typealias ItemCount = UInt

Or just don’t use it.

for srcIndex in 0 ..< sourceCount {
    let mep = MIDIGetSource(srcIndex)
    // etc.
}


Summary

There is progress with using CoreMIDI from Swift.
There are still potholes though.


Swift Framework creation


This is my preferred way to set up a Swift framework project.

Introduction

Sometimes, you will find yourself using the same custom code in multiple projects. Simply copying the files into a new project works, but then you have different versions. So, a better way to reuse that code is to create a Framework.

This is my preferred way. There are other ways that The Google will show you; I had luck with none of them – especially at link time. One example has you dragging the framework from the simulator’s temp directory via the Finder. Yikes.

Without a workspace, iterative development on the framework and a potential client of it will be awkward. So, I suggest creating a workspace and adding the framework project to it along with at least one app project that uses the framework.


Creating the projects

You will create a framework, a Single View Application that uses it, and a workspace to make your life easier.

Create a Single View Application.

Close it.

Create a Framework.

[Screenshot: createFramework]
Close it.

Create a Workspace.


Choose Add Files (⌥ ⌘-A or right click or choose from the File Menu).
Add the .xcodeproj files for your framework and your app to the workspace.

If you didn’t close your framework project or your app, it will not be added correctly. You will see just your .xcodeproj file and not the project.

You should see something like this in your project navigator.

[Screenshot: frameworkProjectNavigator]


Building

Click on your Framework project.
For the scheme, make certain that your framework is selected and then choose iOS Device.
Like this:
[Screenshot: frameworkScheme]

Build the framework. (⌘-B)
Open up Products and MyFramework.framework should not be red anymore but black.

Now select your app and choose the General tab for the app target.
Drag MyFramework.framework to “Embedded Binaries”.

[Screenshot: embeddedLibraries]

Notice that Linked Frameworks and Libraries also picks this up.

You’re ready!


A Framework Class

Create a class in your framework. Be certain that the class and the function you will be calling are both public. If you define init functions, then they also need to be public.

public class MyClass {
    public init(){}
 
    public func yo() {
        println("Yo")
    }
}

Build (⌘-B)


Try it out

For a simple test, open your view controller. You need to import your framework. If you named it MyFramework, simply specify MyFramework.

import MyFramework

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        // here is your framework class.
        var m = MyClass()
    }
    // etc.
}

Now use your class from the framework. When you build, check the scheme to make sure you’re building the app and not the framework!
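
For example, calling the method we defined in the framework class above (inside viewDidLoad, right after creating m):

m.yo() // prints "Yo"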

It’s fairly simple to go back to the framework, add classes and modify them, build, then go back to the app and try them out.


Summary

Create a framework, an app that uses it, then add both projects to a workspace.
Build the framework, then drag the framework product to “Embedded Binaries” on the general tab of your app’s target.


Swift fail: MIDIClientCreate


There is a problem with calling Core MIDI’s MIDIClientCreate function from Swift.

Introduction

Let’s start with a simple call to Core MIDI’s client create function. You need the MIDI client to create MIDI input and output ports.

func midi() {
    var status = OSStatus(noErr)
    var s:CFString = "MyClient"
 
    var client = MIDIClientRef()
    status = MIDIClientCreate(s,
        MIDINotifyProc( COpaquePointer( [ MyMIDINotifyProc ] ) ),
        nil,
        &client)
    if status == OSStatus(noErr) {
        println("created client")
    } else {
        println("error creating client : \(status)")
    }
// etc
}
 
func MyMIDINotifyProc (np:UnsafePointer<MIDINotification>, refCon:UnsafeMutablePointer<Void>) {
        var notification = np.memory
        println("MIDI Notify, messageId= \(notification.messageID)")
//etc
}

Works great!


Problem

So, what’s the problem?

The above code compiled just fine when the scheme was an iPhone 6. I then plugged in my iPhone 4S and the problem raised its ugly head. If you don’t have an older iOS device, just select a 32-bit device scheme in Xcode.

To verify that this was the problem I tried checking the arch and then calling separate init methods. The initial code for both was what you see in the first example here.

 
// The iPhone 4S has a 32 bit 1 GHz dual-core Apple A5 processor and 512 MB of RAM
// The iPhone 5S has a 64 bit 1.3 GHz dual-core Apple A7 processor and 1 GB of RAM
#if arch(arm64) || arch(x86_64) // 64 bit: iPhone 5S and later
    init64()
#else // 32 bit: before the iPhone 5S
    init32()
#endif

Xcode will give you this love letter for 32-bit devices. It refers to the line where you create the client variable (var client = MIDIClientRef()).

'MIDIClientRef' cannot be constructed because it has no accessible initializers

Ok, just do this then.

var client:MIDIClientRef

Nope.

'MIDIClientRef' is not identical to 'Unmanaged?'

Ok, then

var client : Unmanaged<MIDIClientRef>? = nil

Works!

Go back to 64 bits.
Problem.

Type 'MIDIClientRef' does not conform to protocol 'AnyObject'

[expletive deleted]

Here are the definitions as Swift imports them from CoreMIDI/MIDIServices.h (32-bit):

typealias MIDIObjectRef = UnsafeMutablePointer<Void>
typealias MIDIClientRef = MIDIObjectRef

Well, actually in Objective-C:

#if __LP64__
 
typedef UInt32 MIDIObjectRef;
typedef MIDIObjectRef MIDIClientRef;
typedef MIDIObjectRef MIDIPortRef;
typedef MIDIObjectRef MIDIDeviceRef;
typedef MIDIObjectRef MIDIEntityRef;
typedef MIDIObjectRef MIDIEndpointRef;
 
#else
 
typedef void * MIDIObjectRef;
typedef struct OpaqueMIDIClient *		MIDIClientRef;
typedef struct OpaqueMIDIPort *			MIDIPortRef;
typedef struct OpaqueMIDIDevice *		MIDIDeviceRef;
typedef struct OpaqueMIDIEntity *		MIDIEntityRef;
typedef struct OpaqueMIDIEndpoint *		MIDIEndpointRef;
#endif

Suggestions?


Summary

You can’t create a MIDI client on older iOS devices using Swift.
If you have a solution, I’d love to hear it!

In the meantime, I’ll create the Core MIDI code (i.e. creating the client and ports) in Objective-C and call that from my Swift code.
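
The Swift side then just calls into that wrapper. A minimal sketch, assuming a hypothetical Objective-C class named MIDIWrapper (my name, not an SDK class) exposed through the bridging header:

// MIDIWrapper is a hypothetical Objective-C class that wraps
// MIDIClientCreate and the port creation calls.
let wrapper = MIDIWrapper()
wrapper.createClientAndPorts()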


Book Review: iOS 8 for Programmers: An App-Driven Approach with Swift

There is now a tidal wave of books being released on Apple’s new Swift programming language. Here, I’m going to review iOS 8 for Programmers: An App-Driven Approach with Swift (3rd Edition) (Deitel Developer Series) which was just released. For once they did not hire Yoda to write their book title as they did with Java How To Program. But they did work a colon into the title.

I have a hardcopy of the book, so I cannot speak about the quality of the ebook versions. The same content of course, but I know from producing my own epub books that the formatting can be tedious and error prone.

Readers can download a zip of the code examples from the Deitel website. Unfortunately, you have to “register” on their site to get the download link as if we are still living in 1995.

First off, the audience for the book. It is aimed at experienced programmers, especially those with experience in an object oriented language. If you are just starting out, this is probably not the book for you. If that is the case, I’d suggest Swift for Absolute Beginners which is another brand new book.

As the title suggests, this is not a Swift tutorial. Instead, you are introduced to Swift’s features by writing several toy apps. That’s what “app-driven approach” means. I really hate books and course materials that are simple laundry lists of features. In fact, over 90% of the live courses I’ve taught over the past 25 years ignored the printed course materials (unless it was one I authored :)). Laundry lists are easy on the author but hard on the learner. This app-driven approach gets closer to enabling real learning. If the learner has a question in their head while working through the material, and then sees the answer a few pages later, that is excellent. Motivational seeding is what I call that.

So, you will get a decent foundation in Swift, but you will not see any advanced topics. The things that I’ve banged my head against the wall with, such as interfacing with legacy APIs like Core Audio or Core MIDI, are not touched upon. I don’t mean those APIs in particular, but interfacing with any of the legacy APIs. As is common with most iOS development books, unit testing is not covered.

The Apps

These are the Apps that the learner will build:

  • Welcome App
  • Tip Calculator App
  • Twitter Searches App
  • Flag Quiz App
  • Cannon Game App
  • Doodlz App
  • Address Book App

Each App introduces a new iOS and/or Swift feature. For example, the Cannon Game touches on Sprite Kit and the Address Book uses Core Data.

I like the format of each chapter. Each begins with a list of objectives followed by an outline. The page header on the right-hand page will be an outline title. I wonder if the ebook formats the outline items as links. This seems to be a small thing, but after you’ve gone through a book, you might need to find something. This helps a lot. It also sets your expectations for what is going to be accomplished in the chapter. Not surprisingly, the end of each chapter has a “wrap up” telling you what they just told you. Also useful for answering “In what chapter was that thing on X covered?”

Sometimes, the author is a bit lazy. For example, section 4.3.13 talks about external parameter names. The paradigm is given but no code example. Thanks for the Amo, Amas, Amat, but where is the example sentence? Amo libri huius? Also, the Alert controller code on page 148 has a memory leak when you access the text fields in that manner. The Twitter app sidesteps Twitter’s RESTful API and uses a WebView instead. I guess NSURLSession would be too complicated or having to authenticate would be too much trouble.

There are a decent number of technologies touched upon. iCloud, Sprite Kit, Social Framework, Core Data, etc.

The book ends with a chapter on the business end and the App Store. Most developers will tell you that the coding is easier than getting it onto the App Store. Useful information is provided here.

Summary

If you are an experienced programmer, this is a good book for getting a decent foundation in iOS development and the Swift language.
The softcover book is around 40 bucks.

You can get more information on the InformIT site.


iOS 8 Bluetooth MIDI LE build tip


Introduction

iOS 8 and OS X Yosemite now support sending and receiving MIDI data over Bluetooth Low Energy connections on any iOS device or Mac that has native Bluetooth Low Energy support.

I’m reminding myself here of a simple problem I had that wasted my time.


The Bluetooth classes

So, I’m playing around with the new Bluetooth LE MIDI capabilities.
In my build settings I include the CoreAudioKit framework in order to get
the new Core Audio Bluetooth MIDI (CABTMIDI) controllers CABTMIDILocalPeripheralViewController and CABTMIDICentralViewController.

You also get the Inter-App audio classes CAInterAppAudioSwitcherView and CAInterAppAudioTransportView with CoreAudioKit, but I’m not using them here.

Here is a very simple view controller example.

import UIKit
import CoreAudioKit 
import CoreMIDI
 
class ViewController: UIViewController {
 
    var localPeripheralViewController:CABTMIDILocalPeripheralViewController?
    var centralViewController:CABTMIDICentralViewController?
 
    override func viewDidLoad() {
        super.viewDidLoad()
        localPeripheralViewController = CABTMIDILocalPeripheralViewController()
        centralViewController = CABTMIDICentralViewController()
    }
 
    @IBAction func someAction(sender: AnyObject) {
        self.navigationController?.pushViewController(localPeripheralViewController!, animated: true)
    }
 
    @IBAction func midiCentral(sender: AnyObject) {
         self.navigationController?.pushViewController(centralViewController!, animated: true)
    }
}

I played around with it, then had to do other work. I came back to it a week later, and it wouldn’t even compile. I didn’t change anything (e.g. no Xcode updates). Yes, CoreAudioKit is indeed included, but the error was on the “import CoreAudioKit”. The compiler didn’t know what that was even though the framework is there and I can even see the headers in the Xcode UI tree under CoreAudioKit.framework.

It turns out that the build scheme needs to have a device selected, and not any of the simulator choices. Even if you are building and not running. The device does not need to be attached. You can just choose the first item: iOS Device. Then it will build.

D’Oh!

Apple even says so in a tech note (that I did not know existed).


Summary

The Bluetooth LE MIDI classes in CoreAudioKit will build only if a device scheme – not a simulator – is selected.


BASH script to create a git bare repo

I can’t count how many times I’ve created a project in my IDE then dropped to the terminal to create a bare git repository, then add that as a remote to my project. And also add/commit/push. So, I decided to make my life a bit easier by writing a small shell script to do all this nonsense. You might find this useful as is or for parts you can copy and paste.

BTW, if I’m the only one working on a project right now, I find that creating the bare repo on Dropbox is handy. This won’t work of course if multiple developers are pushing at the same time.

https://gist.github.com/genedelisa/d94f68d3a78f0055806f


Unit testing async network calls in Swift


You have probably written code with an NSURLSessionDataTask that notifies a delegate when the data is received. How do you write a unit test for that?

Introduction

Let’s stub out some typical code. Here is an API function that takes perhaps a REST endpoint and a delegate that receives a Thing instance. I use an NSURLSessionDataTask because I’m expecting, well, data (as JSON). I’m not showing the gory details of parsing the JSON since that’s not my point here. BTW, it’s not very difficult to parse. The idea is that a Thing is instantiated and the delegate is notified.

func getThing(url:String, delegate:ThingDelegate) {
    // set up the NSURLSession and request
    let session = NSURLSession.sharedSession()
    let request = NSURLRequest(URL: NSURL(string: url)!)
    let task : NSURLSessionDataTask = session.dataTaskWithRequest(request, completionHandler: {(data, response, error) in
        if let e = error {
            println("Error: \(e.localizedDescription)")
        }

        var jsonError:NSError?
        if let json = NSJSONSerialization.JSONObjectWithData(data, options: nil, error: &jsonError) as? NSDictionary {
            if let e = jsonError {
                println("Error parsing json: \(e.localizedDescription)")
            } else {
                // parse the JSON to instantiate a thing...
                delegate.didReceiveWhatever(thing)
            }
        }
    })
    task.resume()
}
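
For reference, the delegate protocol behind this stub is just a single method. A minimal version (ThingDelegate and Thing are the placeholder names from the stub above):

protocol ThingDelegate {
    func didReceiveWhatever(thing:Thing)
}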


Testing

So, how do you write a unit test for this kind of code? The API call does not return anything to pass into XCTAssertTrue or its siblings. Wouldn’t it be nice if you could make the network API call and wait – with a timeout of course – for a response?

Previously, you’d have to use semaphores, a spin loop, or something similar. Since this is such a common scenario, Apple gave us XCTestExpectation in Xcode 6. (Actually, it’s a category in XCTestCase+AsynchronousTesting.)

Here is a simple usage example. I have an instance variable of type XCTestExpectation because I need it in the delegate callback in addition to the test function. I simply instantiate it, make the network call, then call one of the new wait functions. In this case waitForExpectationsWithTimeout. When the delegate is notified, I fulfill the expectation. If you don’t, the test will fail after the timeout.

var expectation:XCTestExpectation?

func testExample() {
    expectation = self.expectationWithDescription("asynchronous request")

    Networkclass.getThing("http://api.things.com/someid", delegate: self)

    self.waitForExpectationsWithTimeout(10.0, handler:nil)
}

func didReceiveWhatever(thing:Thing) {
    expectation?.fulfill()
}


Summary

Simple huh? Take a look at the documentation for a few variations.
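
For example, the handler variant hands you the timeout error (a quick sketch; the handler is also called on success, with a nil error):

self.waitForExpectationsWithTimeout(10.0) { error in
    if let e = error {
        println("timed out: \(e.localizedDescription)")
    }
}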


Swift and AVMIDIPlayer


How to play MIDI data via the AVFoundation AVMIDIPlayer.

Introduction

Previously, I wrote about attaching a low level core audio AUGraph to a MusicSequence to hear something besides sine waves when played via a MusicPlayer. Here, I’ll show you how to use the new higher level AVMIDIPlayer. You can even play a MusicSequence by sticking your elbow in your ear.

Playing a MIDI File

Preparing an AVMIDIPlayer to play a standard MIDI file with a SoundFont or DLS file is fairly straightforward. Get both NSURLs from your bundle, then pass them into the init function.

if let contents = NSBundle.mainBundle().URLForResource(gMajor, withExtension: "mid") {
    self.soundbank = NSBundle.mainBundle().URLForResource(soundFontMuseCoreName, withExtension: "sf2")
    if self.soundbank != nil {
        var error:NSError?
        self.mp = AVMIDIPlayer(contentsOfURL: contents, soundBankURL: soundbank!, error: &error)
        if(self.mp != nil) {
            mp!.prepareToPlay()
            setupSlider()
            // crashes if you set a completion handler
            mp!.play(nil)
        } else {
            if let e = error {
                println("Error \(e.localizedDescription)")
            }
        }
   }
}

Note that I’m passing nil to the play function. It expects a completion function. It will crash if you pass in either a function or a closure. My workaround is to pass nil.

// Both of these crash:
var completion:AVMIDIPlayerCompletionHandler = {
    println("done")
}
mp!.play(completion)

// or even a function
func comp() -> Void {
}
mp!.play(comp)

Play your MIDI file in the simulator, and you’ll hear sine waves. Huh? A valid SoundFont was sent to the init function, and you hear sine waves? Yeah. After you spend a day verifying that your code is correct, install iOS 8 on your actual device and try it there. Yup, it works. Nice.

P.S. That slider thing is just some eye candy in the final project. A UISlider moves while playing.


Playing NSData from a file

AVMIDIPlayer has an init function that takes an NSData instance instead of a URL. So, let’s try creating an NSData object from the URL as a simple first step.

if let contents = NSBundle.mainBundle().URLForResource(nightBaldMountain, withExtension: "mid") {
    self.soundbank = NSBundle.mainBundle().URLForResource(soundFontMuseCoreName, withExtension: "sf2")
    if self.soundbank != nil {
        var data = NSData(contentsOfURL: contents)
        var error:NSError?
        self.mp = AVMIDIPlayer(data:data, soundBankURL: soundbank!, error: &error)
        if(self.mp != nil) {
            mp!.prepareToPlay()
            setupSlider()
            mp!.play(nil)
        } else {
            if let e = error {
                println("Error \(e.localizedDescription)")
            }
        }
    }
}

Not surprisingly, that works. But why would you want to do this?


Playing a MusicSequence

The hoary grizzled MusicSequence from the AudioToolbox is still the only way to create a MIDI Sequence on the fly. If you have an app where the user taps in notes, you can store them in a MusicSequence for example. But AVMIDIPlayer has no init function that takes a MusicSequence. Our choices are an NSURL or NSData.

An NSURL doesn’t make sense, but what about NSData? Can you turn a MusicSequence into NSData? Well, there’s MusicSequenceFileCreateData(). With this function, you can pass in a data variable that will be initialized to the data that would be written to a standard MIDI file. You can then use that NSData in the player code from our previous example.

func seqToData(musicSequence:MusicSequence) -> NSData {
    var status = OSStatus(noErr)
    var data:Unmanaged<CFData>?
    status = MusicSequenceFileCreateData(musicSequence,
        MusicSequenceFileTypeID(kMusicSequenceFile_MIDIType),
        MusicSequenceFileFlags(kMusicSequenceFileFlags_EraseFile),
        480, // resolution
        &data)
    return data!.takeUnretainedValue()
}

I haven’t checked to see if there is a memory leak with the takeUnretainedValue call. I’ll check that out next.

update: I checked and there is indeed a small memory leak.
The docs for MusicSequenceFileCreateData say that the caller is responsible for releasing the CFData. So OK, takeUnretainedValue is the right one. So I tried saving the data variable as an ivar, checking for nil when playing again, then calling release(). Crash. What about DisposeMusicSequence? OK, I tried saving the sequence as an ivar and calling that. No crash, but memory still leaks. CFRelease is simply unavailable.

What do you think? Advice?
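
In the meantime, putting the pieces together looks like this – a sketch reusing the seqToData function above, plus the soundbank ivar and mp player from the earlier examples:

let data = seqToData(musicSequence)
var error:NSError?
self.mp = AVMIDIPlayer(data: data, soundBankURL: soundbank!, error: &error)
if self.mp != nil {
    mp!.prepareToPlay()
    mp!.play(nil)
}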


Summary

So you can play a MusicSequence with sounds via an AVMIDIPlayer. You just need to know the secret handshake.


Swift: AUGraph and MusicSequence


The AudioToolbox MusicSequence remains the only way to create a MIDI Sequence programmatically. The AVFoundation class AVMIDIPlayer will play a MIDI file, but not a MusicSequence.

AVAudioEngine has a musicSequence property. It doesn’t seem to do anything yet (except crash when you set it). So the way to get a MusicSequence to play with instrument sounds is to create a low level core audio AUGraph and play the sequence with a MusicPlayer.

Introduction

Apple is moving towards a higher level Audio API with AVFoundation. The AVAudioEngine looks promising, but it is incomplete. Right now there isn’t a way to associate an AudioToolbox MusicSequence with it. So, here I’ll use a low level Core Audio AUGraph for the sounds.


Create a MusicSequence

Let’s start by creating a MusicSequence with a MusicTrack that contains several MIDINoteMessages.

var musicSequence:MusicSequence = MusicSequence()
var status = NewMusicSequence(&musicSequence)
if status != OSStatus(noErr) {
    println("\(__LINE__) bad status \(status) creating sequence")
}
 
// add a track
var track:MusicTrack = MusicTrack()
status = MusicSequenceNewTrack(musicSequence, &track)
if status != OSStatus(noErr) {
    println("error creating track \(status)")
}
 
// now make some notes and put them on the track
var beat:MusicTimeStamp = 1.0
for i:UInt8 in 60...72 {
    var mess = MIDINoteMessage(channel: 0,
        note: i,
        velocity: 64,
        releaseVelocity: 0,
        duration: 1.0 )
    status = MusicTrackNewMIDINoteEvent(track, beat, &mess)
    if status != OSStatus(noErr) {
        println("error creating midi note event \(status)")
    }
     beat++
}


MusicPlayer create

Now you need a MusicPlayer to hear it. Let’s make one and give it our MusicSequence.
Here, I “pre-roll” the player for fast startup when you hit a play button. You don’t have to do this,
but this is how to do it.

var musicPlayer:MusicPlayer = MusicPlayer()
var status = NewMusicPlayer(&musicPlayer)
if status != OSStatus(noErr) {
    println("bad status \(status) creating player")
}
status = MusicPlayerSetSequence(musicPlayer, musicSequence)
if status != OSStatus(noErr) {
    println("setting sequence \(status)")
}
status = MusicPlayerPreroll(musicPlayer)
if status != OSStatus(noErr) {
    println("prerolling player \(status)")
}


Playing a MusicSequence

Finally, you tell the player to play like this – probably from an IBAction.

status = MusicPlayerStart(musicPlayer)
if status != OSStatus(noErr) {
    println("Error starting \(status)")
    return
}

Wonderful sine waves! What if you want to hear something that approximates actual instruments?

Well, you can load SoundFont or DLS banks – or even individual sound files. Here, I’ll load a SoundFont.
Load it into what? Well, here I’ll load it into a core audio sampler – an AudioUnit. That means I’ll need to create a core audio AUGraph.

The end of the story is this: you associate an AUGraph with the MusicSequence like this.

MusicSequenceSetAUGraph(musicSequence, self.processingGraph)
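
As with the other Audio Toolbox calls, it returns an OSStatus that you can check in the same style as above:

status = MusicSequenceSetAUGraph(musicSequence, self.processingGraph)
if status != OSStatus(noErr) {
    println("error setting AUGraph on the sequence \(status)")
}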


Create an AUGraph

Great. So how do you make an AUGraph? If you want a bit more detail, look at my blog post on it using Objective-C. Here, I’ll just outline the steps.

Create the AUGraph with NewAUGraph. It is useful to define it as an instance variable.

var processingGraph:AUGraph
 
var status = NewAUGraph(&self.processingGraph)


Create sampler

To create the sampler and add it to the graph, you need to create an AudioComponentDescription.

var samplerNode:AUNode
 
var cd:AudioComponentDescription = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_MusicDevice),
    componentSubType: OSType(kAudioUnitSubType_Sampler),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)
status = AUGraphAddNode(self.processingGraph, &cd, &samplerNode)


Create IO node

Create an output node in the same manner.

var ioNode:AUNode

var ioUnitDescription:AudioComponentDescription = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_Output),
    componentSubType: OSType(kAudioUnitSubType_RemoteIO),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)
status = AUGraphAddNode(self.processingGraph, &ioUnitDescription, &ioNode)


Obtain Audio Units

Now to wire the nodes together and init the AudioUnits. The graph needs to be open, so we do that first.
Then I obtain references to the audio units with the function AUGraphNodeInfo.

var samplerUnit:AudioUnit
var ioUnit:AudioUnit
 
status = AUGraphOpen(self.processingGraph)
 
status = AUGraphNodeInfo(self.processingGraph, self.samplerNode, nil, &samplerUnit)
 
status = AUGraphNodeInfo(self.processingGraph, self.ioNode, nil, &ioUnit)


Wiring

Now wire them using AUGraphConnectNodeInput.

var ioUnitOutputElement:AudioUnitElement = 0
var samplerOutputElement:AudioUnitElement = 0
status = AUGraphConnectNodeInput(self.processingGraph,
    self.samplerNode, samplerOutputElement, // srcnode, inSourceOutputNumber
    self.ioNode, ioUnitOutputElement) // destnode, inDestInputNumber


Starting the AUGraph

Now you can initialize and start the graph.

var status : OSStatus = OSStatus(noErr)
var outIsInitialized:Boolean = 0
status = AUGraphIsInitialized(self.processingGraph, &outIsInitialized)
if outIsInitialized == 0 {
    status = AUGraphInitialize(self.processingGraph)
}
 
var isRunning:Boolean = 0
AUGraphIsRunning(self.processingGraph, &isRunning)
if isRunning == 0 {
    status = AUGraphStart(self.processingGraph)
}


Soundfont

Go ahead and play your MusicSequence now. Crap. Sine waves again. Well yeah, we didn’t load any sounds!

Let’s create a function to load a SoundFont, then use a “preset” from that font on the sampler unit. You need to fill out an AUSamplerInstrumentData struct. One thing that may trip you up is the fileURL, which is an Unmanaged CFURL. Well, NSURL is automatically toll-free bridged to CFURL. Cool. But it is not Unmanaged, which is what is required. So, here I’m using Unmanaged.passUnretained. If you know a better way, please let me know.

Then we need to set the kAUSamplerProperty_LoadInstrument property on our samplerUnit. You do that with AudioUnitSetProperty. The preset numbers are General MIDI patch numbers. In the Github repo, I created a Dictionary of patches for ease of use and an example Picker.

func loadSF2Preset(preset:UInt8)  {
    if let bankURL = NSBundle.mainBundle().URLForResource("GeneralUser GS MuseScore v1.442", withExtension: "sf2") {
        var instdata = AUSamplerInstrumentData(fileURL: Unmanaged.passUnretained(bankURL),
            instrumentType: UInt8(kInstrumentType_DLSPreset),
            bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
            bankLSB: UInt8(kAUSampler_DefaultBankLSB),
            presetID: preset)

        var status = AudioUnitSetProperty(
            self.samplerUnit,
            UInt32(kAUSamplerProperty_LoadInstrument),
            UInt32(kAudioUnitScope_Global),
            0,
            &instdata,
            UInt32(sizeof(AUSamplerInstrumentData)))
        CheckError(status)
    }
}
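
Then, to hear a General MIDI piano, for example (program 0 is the Acoustic Grand Piano):

loadSF2Preset(0)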


Summary

You can create a Core Audio AUGraph, attach it to a MusicSequence, and play it.


Swift: remove array item


The surprising contortions that you need to go through in order to remove an item from an array in Swift if you do not have its index in the array.

Introduction

I’m writing an app that uses standard music notation for input. Imagine a view with a staff and a tap inputs a note. Each “note view” represents a note model object. Then you decide that you do not want that note, so you need to delete it. You can get the note by pressing on it. Then that note needs to be deleted from a “notes array”.

So, you have the note, but not its index. If you had the index, Swift gives you zero trouble removing it from the array.

notes.removeAtIndex(2)

But you don’t have the index. You have the item in the array. Well, just use “indexOf”, right? Sure. Where is that? I couldn’t find anything like that. Let me know if you know of one.

What I ended up doing is removing the note by filtering the array. Here is a simple filter that removes the item.

var notes:[MIDINote] = []
 
func removeNote(note:MIDINote) {
   self.notes = self.notes.filter( {$0 != note} )
}

One problem. I’m using a comparison operator. My class didn’t have one.


Comparing objects

For that != operator to work, you need to implement the Equatable protocol. There is one requirement for this protocol: you provide an overload for the == operator at global scope. “Global scope” means outside of the class. When you overload the == operator, != will work too.

Like this:

func == (lhs: MIDINote, rhs: MIDINote) -> Bool {
    if lhs.pitch.midiNumber == rhs.pitch.midiNumber &&
        lhs.duration == rhs.duration &&
        lhs.channel == rhs.channel &&
        lhs.startBeat == rhs.startBeat {
            return true
    }
    return false
}
 
class MIDINote : Equatable {
    var duration = 0.0
    var channel = 0
    var startBeat = 1.0
    // etc.
}


Summary

You can remove an item from an array by writing a filter closure. But, your item must implement the Equatable protocol.
If there is a simpler way to remove an item from an array without having its index, please let me know.

Update

Many people here and in the twitterverse have kindly pointed out that there is indeed an indexOf function. But it is not named anything close to that – it is the find(array, item) function.

<soapbox>
There is a lesson in this for API writers on naming. IMHO, it is poorly named. (Is there any ambiguity in the name “indexOf”? What are the chances that a polyglot programmer would seek a method/function named indexOf vs find?). I wonder how many people are going to have indexOf in an Array extension?
</soapbox>

My other problem was finding find. In neither the Array documentation nor the Collection documentation do I see this function. Is it unreasonable for me to be looking there?
Note that filter is defined as a global function and as an Array function.
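
For example, both forms work on an array (a quick sketch):

let numbers = [1, 2, 3, 4]
// Array method form
let evens = numbers.filter { $0 % 2 == 0 }
// global function form
let odds = filter(numbers) { $0 % 2 == 1 }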

Anyway, the actual definition is this:

func find<C : CollectionType where C.Generator.Element : Equatable>(domain: C, value: C.Generator.Element) -> C.Index?

Note that it is not array specific. You can do this with other Sequences.

So, the non filter version is this:

if let index = find(self.notes, note) {
   self.notes.removeAtIndex(index)
}

I haven’t yet looked to see which is more performant. My guess is the filter version. (but not if it were a linked list).

Again, thanks for the tip.

P.S. To see the undocumented functions in Swift, put this in your code:

import Swift

Then Command-click on Swift.
