BASH script to create a git bare repo

I can't count how many times I've created a project in my IDE, then dropped to the terminal to create a bare git repository, added it as a remote to my project, and done the initial add/commit/push. So, I decided to make my life a bit easier by writing a small shell script to do all this nonsense. You might find it useful as-is, or just for parts you can copy and paste.

BTW, if I'm the only one working on a project right now, I find that creating the bare repo in Dropbox is handy. This won't work, of course, if multiple developers are pushing at the same time.


Unit testing async network calls in Swift


Asynchronous unit testing in Swift

You have probably written code with an NSURLSessionDataTask that notifies a delegate when the data is received. How do you write a unit test for that?

Introduction

Let's stub out some typical code. Here is an API function that takes, say, a REST endpoint and a delegate that receives a Thing instance. I use an NSURLSessionDataTask because I'm expecting, well, data (as JSON). I'm not showing the gory details of parsing the JSON since that's not my point here. BTW, it's not very difficult to parse. The idea is that a Thing is instantiated and the delegate is notified.

func getThing(url:String, delegate:ThingDelegate) {
    // set up NSURLSession and request...
    let task: NSURLSessionDataTask = session.dataTaskWithRequest(request, completionHandler: {(data, response, error) in
        if let e = error {
            println("Error: \(e.localizedDescription)")
        }

        var jsonError: NSError?
        if let json = NSJSONSerialization.JSONObjectWithData(data, options: nil, error: &jsonError) as? NSDictionary {
            if let e = jsonError {
                println("Error parsing json: \(e.localizedDescription)")
            } else {
                // parse the JSON to instantiate a thing, then notify the delegate
                delegate.didReceiveWhatever(thing)
            }
        }
    })
    task.resume()
}

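The snippet assumes a Thing model and a ThingDelegate protocol that I haven't shown; a minimal sketch of what they might look like (names and properties are illustrative):

// Hypothetical model and delegate protocol implied by getThing(_:delegate:) above.
struct Thing {
    let name: String   // illustrative property
}

protocol ThingDelegate {
    func didReceiveWhatever(thing: Thing)
}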

Testing

So, how do you write a unit test for this kind of code? The API call does not return anything to pass into XCTAssertTrue or its siblings. Wouldn't it be nice if you could make the network API call and wait – with a timeout, of course – for a response?

Previously, you'd have to use semaphores, a spin loop, or something similar. Since this is such a common scenario, Apple gave us XCTestExpectation in Xcode 6. (Actually, it comes from a category, XCTestCase+AsynchronousTesting.)

Here is a simple usage example. I have an instance variable of type XCTestExpectation because I need it in the delegate callback in addition to the test function. I simply instantiate it, make the network call, then call one of the new wait functions, in this case waitForExpectationsWithTimeout. When the delegate is notified, I fulfill the expectation. If you don't fulfill it, the test will fail after the timeout.

var expectation: XCTestExpectation?

func testExample() {
    expectation = self.expectationWithDescription("asynchronous request")

    Networkclass.getThing("http://api.things.com/someid", delegate: self)

    self.waitForExpectationsWithTimeout(10.0, handler: nil)
}

func didReceiveWhatever(thing: Thing) {
    expectation?.fulfill()
}


Summary

Simple, huh? Take a look at the documentation for a few variations.
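One such variation is the notification-based expectation from the same category (assuming it is available in your Xcode version); it is fulfilled automatically when the named notification is posted. The notification name below is hypothetical:

func testNotification() {
    // fulfills itself when "ThingDidArriveNotification" is posted (hypothetical name)
    expectationForNotification("ThingDidArriveNotification", object: nil, handler: nil)
    Networkclass.getThing("http://api.things.com/someid", delegate: self)
    waitForExpectationsWithTimeout(10.0, handler: nil)
}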

Resources


Swift and AVMIDIPlayer


How to play MIDI data via the AVFoundation AVMIDIPlayer.

Introduction

Previously, I wrote about attaching a low-level Core Audio AUGraph to a MusicSequence to hear something besides sine waves when played via a MusicPlayer. Here, I'll show you how to use the new higher-level AVMIDIPlayer. You can even play a MusicSequence by sticking your elbow in your ear.

Playing a MIDI File

Preparing an AVMIDIPlayer to play a standard MIDI file with a SoundFont or DLS file is fairly straightforward. Get both NSURLs from your bundle, then pass them into the init function.

if let contents = NSBundle.mainBundle().URLForResource(gMajor, withExtension: "mid") {
    self.soundbank = NSBundle.mainBundle().URLForResource(soundFontMuseCoreName, withExtension: "sf2")
    if self.soundbank != nil {
        var error:NSError?
        self.mp = AVMIDIPlayer(contentsOfURL: contents, soundBankURL: soundbank!, error: &error)
        if(self.mp != nil) {
            mp!.prepareToPlay()
            setupSlider()
            // crashes if you set a completion handler
            mp!.play(nil)
        } else {
            if let e = error {
                println("Error \(e.localizedDescription)")
            }
        }
   }
}

Note that I'm passing nil to the play function, which expects a completion function. It will crash if you pass in either a function or a closure, so my workaround is to pass nil. Either of these will crash:

var completion:AVMIDIPlayerCompletionHandler = {
    println("done")
}
mp!.play(completion)
 
// or even a function
func comp() -> Void {
}
mp!.play(comp)

Play your MIDI file in the simulator, and you'll hear sine waves. Huh? A valid SoundFont was sent to the init function, and you hear sine waves? Yeah. After you spend a day verifying that your code is correct, install iOS 8 on your actual device and try it there. Yup, it works. Nice.

P.S. That slider call is just some eye candy in the final project: a UISlider moves while playing.
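Here is a rough sketch of what that might look like; the slider outlet and the update timer are my assumptions, and the project code may differ:

// Assumed UISlider outlet; mp is the AVMIDIPlayer ivar from above.
func setupSlider() {
    slider.minimumValue = 0.0
    slider.maximumValue = Float(mp!.duration)      // duration in seconds
}

// Assumed to be called from an NSTimer while the player is playing.
func updateSlider(timer: NSTimer) {
    slider.value = Float(mp!.currentPosition)
}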


Playing NSData from a file

AVMIDIPlayer has an init function that takes an NSData instance instead of a URL. So, let's try creating an NSData object from the URL as a simple first step.

if let contents = NSBundle.mainBundle().URLForResource(nightBaldMountain, withExtension: "mid") {
    self.soundbank = NSBundle.mainBundle().URLForResource(soundFontMuseCoreName, withExtension: "sf2")
    if self.soundbank != nil {
        var data = NSData(contentsOfURL: contents)
        var error:NSError?
        self.mp = AVMIDIPlayer(data:data, soundBankURL: soundbank!, error: &error)
        if(self.mp != nil) {
            mp!.prepareToPlay()
            setupSlider()
            mp!.play(nil)
        } else {
            if let e = error {
                println("Error \(e.localizedDescription)")
            }
        }
    }
}

Not surprisingly, that works. But why would you want to do this?


Playing a MusicSequence

The hoary, grizzled MusicSequence from the AudioToolbox is still the only way to create a MIDI sequence on the fly. If you have an app where the user taps in notes, you can store them in a MusicSequence, for example. But AVMIDIPlayer has no init function that takes a MusicSequence. Our choices are an NSURL or NSData.

An NSURL doesn't make sense, but what about NSData? Can you turn a MusicSequence into NSData? Well, there's MusicSequenceFileCreateData(). With this function, you can pass in a data variable that will be initialized to the data that would be written to a standard MIDI file. You can then use that NSData in the player code from our previous example.

func seqToData(musicSequence: MusicSequence) -> NSData {
    var status = OSStatus(noErr)
    var data: Unmanaged<CFData>?
    status = MusicSequenceFileCreateData(musicSequence,
        MusicSequenceFileTypeID(kMusicSequenceFile_MIDIType),
        MusicSequenceFileFlags(kMusicSequenceFileFlags_EraseFile),
        480, // resolution
        &data)
    return data!.takeUnretainedValue()
}

I haven’t checked to see if there is a memory leak with the takeUnretainedValue call. I’ll check that out next.

Update: I checked, and there is indeed a small memory leak.
The docs for MusicSequenceFileCreateData say that the caller is responsible for releasing the CFData, which suggests takeRetainedValue() (handing ownership to ARC) rather than takeUnretainedValue(). I also tried saving the data variable as an ivar, checking for nil when playing again, then calling release(). Crash. What about DisposeMusicSequence? I tried saving the sequence as an ivar and calling that. No crash, but memory still leaks. CFRelease is simply unavailable.
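For comparison, here is a sketch of the same conversion using takeRetainedValue(), on the assumption that the Create Rule applies (the function has "Create" in its name, so the caller owns the returned CFData):

// Assumes the Create Rule: MusicSequenceFileCreateData returns a +1 CFData,
// so takeRetainedValue() hands ownership to ARC and no manual release is needed.
func seqToDataRetained(musicSequence: MusicSequence) -> NSData {
    var data: Unmanaged<CFData>?
    let status = MusicSequenceFileCreateData(musicSequence,
        MusicSequenceFileTypeID(kMusicSequenceFile_MIDIType),
        MusicSequenceFileFlags(kMusicSequenceFileFlags_EraseFile),
        480, // resolution
        &data)
    return data!.takeRetainedValue()
}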

What do you think? Advice?


Summary

So you can play a MusicSequence with sounds via an AVMIDIPlayer. You just need to know the secret handshake.

Resources


Swift: AUGraph and MusicSequence


Swift AUGraph and MusicSequence

The AudioToolbox MusicSequence remains the only way to create a MIDI Sequence programmatically. The AVFoundation class AVMIDIPlayer will play a MIDI file, but not a MusicSequence.

AVAudioEngine has a musicSequence property. It doesn’t seem to do anything yet (except crash when you set it). So the way to get a MusicSequence to play with instrument sounds is to create a low level core audio AUGraph and play the sequence with a MusicPlayer.

Introduction

Apple is moving towards a higher level Audio API with AVFoundation. The AVAudioEngine looks promising, but it is incomplete. Right now there isn’t a way to associate an AudioToolbox MusicSequence with it. So, here I’ll use a low level Core Audio AUGraph for the sounds.


Create a MusicSequence

Let’s start by creating a MusicSequence with a MusicTrack that contains several MIDINoteMessages.

var musicSequence:MusicSequence = MusicSequence()
var status = NewMusicSequence(&musicSequence)
if status != OSStatus(noErr) {
    println("\(__LINE__) bad status \(status) creating sequence")
}
 
// add a track
var track:MusicTrack = MusicTrack()
status = MusicSequenceNewTrack(musicSequence, &track)
if status != OSStatus(noErr) {
    println("error creating track \(status)")
}
 
// now make some notes and put them on the track
var beat:MusicTimeStamp = 1.0
for i:UInt8 in 60...72 {
    var mess = MIDINoteMessage(channel: 0,
        note: i,
        velocity: 64,
        releaseVelocity: 0,
        duration: 1.0 )
    status = MusicTrackNewMIDINoteEvent(track, beat, &mess)
    if status != OSStatus(noErr) {
        println("error creating midi note event \(status)")
    }
     beat++
}


MusicPlayer create

Now you need a MusicPlayer to hear it. Let's make one and give it our MusicSequence. Here, I "pre-roll" the player for fast startup when you hit a play button. You don't have to do this, but here is how to do it.

var musicPlayer:MusicPlayer = MusicPlayer()
var status = NewMusicPlayer(&musicPlayer)
if status != OSStatus(noErr) {
    println("bad status \(status) creating player")
}
status = MusicPlayerSetSequence(musicPlayer, musicSequence)
if status != OSStatus(noErr) {
    println("setting sequence \(status)")
}
status = MusicPlayerPreroll(musicPlayer)
if status != OSStatus(noErr) {
    println("prerolling player \(status)")
}


Playing a MusicSequence

Finally, you tell the player to play like this – probably from an IBAction.

status = MusicPlayerStart(musicPlayer)
if status != OSStatus(noErr) {
    println("Error starting \(status)")
    return
}

Wonderful sine waves! What if you want to hear something that approximates actual instruments?

Well, you can load SoundFont or DLS banks – or even individual sound files. Here, I'll load a SoundFont. Load it into what? Well, here I'll load it into a Core Audio sampler – an AudioUnit. That means I'll need to create a Core Audio AUGraph.

The end of the story is this: you associate an AUGraph with the MusicSequence like so.

MusicSequenceSetAUGraph(musicSequence, self.processingGraph)


Create an AUGraph

Great. So how do you make an AUGraph? If you want a bit more detail, look at my blog post on it using Objective-C. Here, I’ll just outline the steps.

Create the AUGraph with NewAUGraph. It is useful to define it as an instance variable.

var processingGraph:AUGraph
 
var status = NewAUGraph(&self.processingGraph)


Create sampler

To create the sampler and add it to the graph, you need to create an AudioComponentDescription.

var samplerNode:AUNode
 
var cd:AudioComponentDescription = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_MusicDevice),
    componentSubType: OSType(kAudioUnitSubType_Sampler),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)
status = AUGraphAddNode(self.processingGraph, &cd, &samplerNode)


Create IO node

Create an output node in the same manner.

var ioNode:AUNode
 
var ioUnitDescription:AudioComponentDescription = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_Output),
    componentSubType: OSType(kAudioUnitSubType_RemoteIO),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)
status = AUGraphAddNode(self.processingGraph, &ioUnitDescription, &ioNode)


Obtain Audio Units

Now to wire the nodes together and init the AudioUnits. The graph needs to be open, so we do that first. Then I obtain references to the audio units with the function AUGraphNodeInfo.

var samplerUnit:AudioUnit
var ioUnit:AudioUnit
 
status = AUGraphOpen(self.processingGraph)
 
status = AUGraphNodeInfo(self.processingGraph, self.samplerNode, nil, &samplerUnit)
 
status = AUGraphNodeInfo(self.processingGraph, self.ioNode, nil, &ioUnit)


Wiring

Now wire them using AUGraphConnectNodeInput.

var ioUnitOutputElement:AudioUnitElement = 0
var samplerOutputElement:AudioUnitElement = 0
status = AUGraphConnectNodeInput(self.processingGraph,
    self.samplerNode, samplerOutputElement, // srcnode, inSourceOutputNumber
    self.ioNode, ioUnitOutputElement) // destnode, inDestInputNumber


Starting the AUGraph

Now you can initialize and start the graph.

var status : OSStatus = OSStatus(noErr)
var outIsInitialized:Boolean = 0
status = AUGraphIsInitialized(self.processingGraph, &outIsInitialized)
if outIsInitialized == 0 {
    status = AUGraphInitialize(self.processingGraph)
}
 
var isRunning:Boolean = 0
AUGraphIsRunning(self.processingGraph, &isRunning)
if isRunning == 0 {
    status = AUGraphStart(self.processingGraph)
}


Soundfont

Go ahead and play your MusicSequence now. Crap. Sine waves again. Well yeah, we didn’t load any sounds!

Let's create a function to load a SoundFont, then use a "preset" from that font on the sampler unit. You need to fill out an AUSamplerInstrumentData struct. One thing that may trip you up is the fileURL field, which is an Unmanaged CFURL. Well, NSURL is automatically toll-free bridged to CFURL. Cool. But it is not Unmanaged, which is what is required. So, here I'm using Unmanaged.passUnretained. If you know a better way, please let me know.

Then we need to set the kAUSamplerProperty_LoadInstrument property on our samplerUnit. You do that with AudioUnitSetProperty. The preset numbers are General MIDI patch numbers. In the GitHub repo, I created a Dictionary of patches for ease of use, and an example Picker.

func loadSF2Preset(preset: UInt8) {
    if let bankURL = NSBundle.mainBundle().URLForResource("GeneralUser GS MuseScore v1.442", withExtension: "sf2") {
        var instdata = AUSamplerInstrumentData(fileURL: Unmanaged.passUnretained(bankURL),
            instrumentType: UInt8(kInstrumentType_DLSPreset),
            bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
            bankLSB: UInt8(kAUSampler_DefaultBankLSB),
            presetID: preset)
 
        var status = AudioUnitSetProperty(
            self.samplerUnit,
            UInt32(kAUSamplerProperty_LoadInstrument),
            UInt32(kAudioUnitScope_Global),
            0,
            &instdata,
            UInt32(sizeof(AUSamplerInstrumentData)))
        CheckError(status)
    }
}
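The Dictionary of patches mentioned above might look something like this (an illustrative subset of zero-based General MIDI program numbers, not the actual table from the repo):

// A few General MIDI program numbers (zero-based), keyed by instrument name.
let gmPatches: [String: UInt8] = [
    "Acoustic Grand Piano": 0,
    "Harpsichord": 6,
    "Marimba": 12,
    "Violin": 40,
    "Trumpet": 56
]

// e.g. load the marimba preset into the sampler
loadSF2Preset(gmPatches["Marimba"]!)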


Summary

You can create a Core Audio AUGraph, attach it to a MusicSequence, and play it.

Resources


Swift: remove array item


Swift Array item removal without an index

The surprising contortions that you need to go through in order to remove an item from an array in Swift if you do not have its index in the array.

Introduction

I'm writing an app that uses standard music notation for input. Imagine a view with a staff, where a tap inputs a note. Each "note view" represents a note model object. Then you decide that you do not want that note, so you need to delete it. You can get the note by pressing on it. Then that note needs to be deleted from a "notes array".

So, you have the note, but not its index. If you had the index, Swift gives you no trouble removing it from the array.

notes.removeAtIndex(2)

But you don't have the index. You have the item in the array. Well, just use "indexOf", right? Sure. Where is that? I couldn't find anything like that. Let me know if you know of one.

What I ended up doing is removing the note by filtering the array. Here is a simple filter that removes the item.

var notes:[MIDINote] = []
 
func removeNote(note:MIDINote) {
   self.notes = self.notes.filter( {$0 != note} )
}

One problem: I'm using an equality operator, and my class didn't have one.


Comparing objects

For that != operator to work, you need to implement the Equatable protocol. There is one requirement for this protocol: you provide an overload for the == operator at global scope. “Global scope” means outside of the class. When you overload the == operator, != will work too.

Like this:

func == (lhs: MIDINote, rhs: MIDINote) -> Bool {
    if lhs.pitch.midiNumber == rhs.pitch.midiNumber &&
        lhs.duration == rhs.duration &&
        lhs.channel == rhs.channel &&
        lhs.startBeat == rhs.startBeat {
            return true
    }
    return false
}
 
class MIDINote : Equatable {
   var duration = 0.0
   var channel = 0
   var startBeat = 1.0
etc.


Summary

You can remove an item from an array by writing a filter closure, but your item must implement the Equatable protocol. If there is a simpler way to remove an item from an array without having its index, please let me know.

Update

Many people here and in the twitterverse have kindly pointed out that there is indeed an indexOf function. But it is not named anything close to that – it is the find(array, item) function.

<soapbox>
There is a lesson in this for API writers on naming. IMHO, it is poorly named. (Is there any ambiguity in the name “indexOf”? What are the chances that a polyglot programmer would seek a method/function named indexOf vs find?). I wonder how many people are going to have indexOf in an Array extension?
</soapbox>

My other problem was finding find. In neither the Array documentation nor the Collection documentation do I see this function. Is it unreasonable for me to be looking there? Note that filter is defined as a global function and as an Array function.

Anyway, the actual definition is this:

func find<C : CollectionType where C.Generator.Element : Equatable>(domain: C, value: C.Generator.Element) -> C.Index?

Note that it is not array specific. You can do this with other Sequences.

So, the non filter version is this:

if let index = find(self.notes, note) {
   self.notes.removeAtIndex(index)
}

I haven’t yet looked to see which is more performant. My guess is the filter version. (but not if it were a linked list).

Again, thanks for the tip.

P.S. To see the undocumented functions in Swift, put this in your code:

import Swift

Then Command-click on the word Swift.

Resources


Swift dragging a UIView with snap


Swift dragging a UIView

Here is one simple way to drag a UIView.

Introduction

There are many ways to drag a UIView around. In my example on GitHub, I drag a custom UIView subclass that does nothing special besides drawing itself. In real life you'd probably have additional code in it. (One hint of that: in my example view, I fill the rectangle with a color instead of simply setting its backgroundColor property.)

I could have put the event handling in the custom UIView. Something like this:

override func touchesBegan(touches: NSSet!, withEvent event: UIEvent!) {
etc

Nah. Here I’m going to use a UIPanGestureRecognizer. In the ViewController, I’ll install the recognizer on the “parent” view.

var pan = UIPanGestureRecognizer(target:self, action:"pan:")
pan.maximumNumberOfTouches = 1
pan.minimumNumberOfTouches = 1
self.view.addGestureRecognizer(pan)


Beginning the drag

In the recognizer action, I first grab the location of the event. Then, depending on the state of the recognizer, I implement the different parts of the drag functionality.

First, you need to get the subview that was touched. I do this in the .Began state.

func pan(rec:UIPanGestureRecognizer) {
 
        var p:CGPoint = rec.locationInView(self.view)
        var center:CGPoint = CGPointZero
 
        switch rec.state {
        case .Began:
            println("began")
            self.selectedView = view.hitTest(p, withEvent: nil)
            if self.selectedView != nil {
                self.view.bringSubviewToFront(self.selectedView!)
            }
etc.


Dragging

The actual dragging takes place in the .Changed state. If there is a view selected, I store its center property. Then, I calculate how far the touch moved. You can use this as a threshold. Since I want to be able to configure whether the view can be dragged in the x or y direction (or both), I use two instance variables, shouldDragX and shouldDragY. If these are true, I set the center property of the selected view to the new location. This location has been "snapped" by a snap value. For example, if snapX is 25.0, the view will be dragged only in increments of 25.0.

case .Changed:
            if let subview = selectedView {
                center = subview.center
                var distance = sqrt(pow((center.x - p.x), 2.0) + pow((center.y - p.y), 2.0))
                println("distance \(distance)")
 
                if subview is MyView {
                    if distance > threshold {
                        if shouldDragX {
                            subview.center.x = p.x - (p.x % snapX)
                        }
                        if shouldDragY {
                            subview.center.y = p.y - (p.y % snapY)
                        }
                    }
                }
            }
etc.


Ending the drag

Then, in the .Ended state, I set the selectedView to nil to start over. You can also do whatever processing you need here.

case .Ended:
   if let subview = selectedView {
      if subview is MyView {
          // do whatever
      }
   }
   // must do this of course
   selectedView = nil


Summary

Install a UIPanGestureRecognizer on a parent view to drag a subview. It goes without saying that the parent view should do no layout on its subviews, nor should they have any constraints.

This is not “drag and drop”, because there is no data transfer. You are simply rearranging the location of a UIView.

Resources


AVFoundation audio recording with Swift

Swift Language

Swift AVFoundation Recorder

Use AVFoundation to create an audio recording.

Introduction

AVFoundation makes audio recording a lot simpler than recording using Core Audio. Essentially, you simply configure and create an AVAudioRecorder instance, and tell it to record/stop in actions.

Creating a Recorder

The first thing you need to do when creating a recorder is to specify the audio format that the recorder will use. This is a Dictionary of settings. For the AVFormatIDKey there are several predefined Core Audio data format identifiers, such as kAudioFormatLinearPCM and kAudioFormatAC3. Here are a few settings to record in Apple Lossless format.

var recordSettings = [
   AVFormatIDKey: kAudioFormatAppleLossless,
   AVEncoderAudioQualityKey : AVAudioQuality.Max.toRaw(),
   AVEncoderBitRateKey : 320000,
   AVNumberOfChannelsKey: 2,
   AVSampleRateKey : 44100.0
]

Then you create the recorder with those settings and the URL of the output sound file. If the recorder is created successfully, you can then call prepareToRecord() which will create or overwrite the sound file at the specified URL. If you’re going to write a VU meter style graph, you can tell the recorder to meter the recording. You’ll have to install a timer to periodically ask the recorder for the values. (See the github project).

var error: NSError?
self.recorder = AVAudioRecorder(URL: soundFileURL, settings: recordSettings, error: &error)
if let e = error {
   println(e.localizedDescription)
} else {
   recorder.delegate = self
   recorder.meteringEnabled = true
   recorder.prepareToRecord() // creates/overwrites the file at soundFileURL
}


Recorder Delegate

I set the recorder’s delegate in order to be notified that the recorder has stopped recording. At this point you can update the UI (e.g. enable a disabled play button) and/or prompt the user to keep or discard the recording. In this example I use the new iOS 8 UIAlertController class. If the user says “delete the recording”, simply call deleteRecording() on the recorder instance.

extension RecorderViewController : AVAudioRecorderDelegate {
 
    func audioRecorderDidFinishRecording(recorder: AVAudioRecorder!,
        successfully flag: Bool) {
            println("finished recording \(flag)")
            stopButton.enabled = false
            playButton.enabled = true
            recordButton.setTitle("Record", forState:.Normal)
 
            // ios8 and later
            var alert = UIAlertController(title: "Recorder",
                message: "Finished Recording",
                preferredStyle: .Alert)
            alert.addAction(UIAlertAction(title: "Keep", style: .Default, handler: {action in
                println("keep was tapped")
            }))
            alert.addAction(UIAlertAction(title: "Delete", style: .Default, handler: {action in
                self.recorder.deleteRecording()
            }))
            self.presentViewController(alert, animated:true, completion:nil)
    }
 
    func audioRecorderEncodeErrorDidOccur(recorder: AVAudioRecorder!,
        error: NSError!) {
            println("\(error.localizedDescription)")
    }
}


Recording

In order to record, you need to ask the user for permission to record first. The AVAudioSession class has a requestRecordPermission() function to which you provide a closure. If granted, you set the session’s category to AVAudioSessionCategoryPlayAndRecord, set up the recorder as described above, and install a timer if you want to check the metering levels.

AVAudioSession.sharedInstance().requestRecordPermission({(granted: Bool)-> Void in
   if granted {
      self.setSessionPlayAndRecord()
      self.setupRecorder()
      self.recorder.record()
      self.meterTimer = NSTimer.scheduledTimerWithTimeInterval(0.1,
         target:self,
         selector:"updateAudioMeter:",
         userInfo:nil,
         repeats:true)
    } else {
      println("Permission to record not granted")
    }
})
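The setSessionPlayAndRecord() call above is just a small helper; a minimal sketch of it might look like this (an assumption on my part, using the Swift 1.x error-pointer API):

func setSessionPlayAndRecord() {
    let session = AVAudioSession.sharedInstance()
    var error: NSError?
    // request the play-and-record category, then activate the session
    if !session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: &error) {
        println("could not set session category")
    }
    if !session.setActive(true, error: &error) {
        println("could not make session active")
    }
    if let e = error {
        println(e.localizedDescription)
    }
}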

Here is a very simple function to display the metering level to stdout, as well as displaying the current recording time. Yes, string formatting is awkward in Swift. Have a better way? Let me know.

func updateAudioMeter(timer:NSTimer) {
   if recorder.recording {
      let dFormat = "%02d"
      let min:Int = Int(recorder.currentTime / 60)
      let sec:Int = Int(recorder.currentTime % 60)
      let s = "\(String(format: dFormat, min)):\(String(format: dFormat, sec))"
      statusLabel.text = s
      recorder.updateMeters()
      // print out the metering levels
      var apc0 = recorder.averagePowerForChannel(0)
      var peak0 = recorder.peakPowerForChannel(0)
      println("average: \(apc0) peak: \(peak0)")
   }
}
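And a sketch of a stop action to pair with the record code above (the button outlet and the exact ivar declarations are my assumptions):

@IBAction func stopRecording(sender: UIButton) {
    recorder.stop()          // triggers audioRecorderDidFinishRecording(_:successfully:) above
    meterTimer.invalidate()  // stop polling the meters; assumes a non-optional NSTimer ivar
}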


Summary

That’s it. You now have an audio recording that you can play back using an AVAudioPlayer instance.

Resources


Swift AVFoundation to play audio or MIDI


Swift AVFoundation

There are many ways to play sound in iOS. Core Audio has been around for a while and it is very powerful. It is a C API, so using it from Objective-C and Swift is possible, but awkward. Apple has been moving towards a higher level API with AVFoundation. Here I will summarize how to use AVFoundation for several common audio tasks.

N.B. Some of these examples use new capabilities of iOS 8.

Playing an Audio file

Let’s start by loading an audio file with an AVAudioPlayer instance. There are several audio formats that the player will grok. I had trouble with a few MP3 files that played in iTunes or VLC, but caused a cryptic exception in the player. So, check your source audio files first.

If you want other formats, your Mac has a converter named afconvert. See the man page.

afconvert -f caff -d LEI16 foo.mp3 foo.caf

Let’s go step by step.

Get the file URL.

let fileURL:NSURL = NSBundle.mainBundle().URLForResource("modem-dialing-02", withExtension: "mp3")

Create the player. You will need to make the player an instance variable; if you just use a local variable, it will be deallocated before you hear anything.

var error: NSError?
self.avPlayer = AVAudioPlayer(contentsOfURL: fileURL, error: &error)
if avPlayer == nil {
   if let e = error {
      println(e.localizedDescription)
   }
}

You can provide the player a hint for how to parse the audio data. There are several constants you can use.

self.avPlayer = AVAudioPlayer(contentsOfURL: fileURL, fileTypeHint: AVFileTypeMPEGLayer3, error: &error)

Now configure the player. prepareToPlay() "pre-rolls" the audio file to reduce start up delays when you finally call play(). You can set the player's delegate to track status.

avPlayer.delegate = self
avPlayer.prepareToPlay()
avPlayer.volume = 1.0

To set the delegate you have to make a class implement the player delegate protocol. My class has the clever name “Sound”.

// MARK: AVAudioPlayerDelegate
extension Sound : AVAudioPlayerDelegate {
    func audioPlayerDidFinishPlaying(player: AVAudioPlayer!, successfully flag: Bool) {
        println("finished playing \(flag)")
    }
     func audioPlayerDecodeErrorDidOccur(player: AVAudioPlayer!, error: NSError!) {
        println("\(error.localizedDescription)")
    }
}

Finally, the transport controls that can be called from an action.

func stopAVPLayer() {
   if avPlayer.playing {
      avPlayer.stop()
   }
}
 
func toggleAVPlayer() {
   if avPlayer.playing {
      avPlayer.pause() 
   } else {
      avPlayer.play()
   }
}

The complete gist for the AVAudioPlayer:

Audio Session

The Audio Session singleton is an intermediary between your app and the media daemon. Your app and all other apps (should) make requests to the shared session. Since we are playing an audio file, we should tell the session our intention by requesting that its category be AVAudioSessionCategoryPlayback, and then make the session active. You should do this in the code above, right before you call play() on the player.

Setting a session for playback.
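A minimal sketch, assuming the Swift 1.x error-pointer API:

let session = AVAudioSession.sharedInstance()
var error: NSError?
// request playback as the category, then activate the session
if !session.setCategory(AVAudioSessionCategoryPlayback, error: &error) {
    println("could not set session category")
}
if !session.setActive(true, error: &error) {
    println("could not make session active")
}
if let e = error {
    println(e.localizedDescription)
}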


Playing a MIDI file

You use AVMIDIPlayer to play standard MIDI files. Loading the player is similar to loading the AVAudioPlayer. You need to load a soundbank from a Soundfont or DLS file. The player also has a pre-roll prepareToPlay() function.

I'm not interested in copyright infringement, so I have not included either a DLS or SF2 file. So do a web search for a GM SoundFont2 file. They are loaded in the same manner. I've tried the MuseScore SoundFont and it sounds OK. There is probably a General MIDI DLS on your OS X system already: /System/Library/Components/CoreAudio.component/Contents/Resources/gs_instruments.dls. Copy this to the project bundle if you want to try it.

self.soundbank = NSBundle.mainBundle().URLForResource("GeneralUser GS MuseScore v1.442", withExtension: "sf2")
// a standard MIDI file.
var contents:NSURL = NSBundle.mainBundle().URLForResource("ntbldmtn", withExtension: "mid")
var error:NSError?
self.mp = AVMIDIPlayer(contentsOfURL: contents, soundBankURL: soundbank, error: &error)
if self.mp == nil {
   println("nil midi player")
}
if let e = error {
   println("Error \(e.localizedDescription)")
}
self.mp.prepareToPlay()

You can also load the MIDI player with an NSData instance like this:

var data = NSData(contentsOfURL: contents)
var error:NSError?
self.mp = AVMIDIPlayer(data: data, soundBankURL: soundbank, error: &error)

Cool, so besides getting the data from a file, how about creating a sequence on the fly? There are the Core Audio MusicSequence and MusicTrack classes to do that. But damned if I can find a way to turn the sequence into NSData. Do you? FWIW, the AVAudioEngine q.v. has a barely documented musicSequence variable. Maybe we can use that in the future.

In your action, call the play() function on the player. There is only one play function, and that requires a completion handler.

self.mp.play({
   println("midi done")
})

Complete AVMIDIPlayer example gist.


Audio Engine

iOS 8 introduces a new audio engine which seems to be the successor to Core Audio’s AUGraph and friends. See my article on using these classes in Swift.

The new AVAudioEngine class is the analog to AUGraph. You create AVAudioNode instances and attach them to the engine. Then you start the engine to initiate data flow.

Here is an engine that has a player node attached to it. The player node is attached to the engine’s mixer. These are instance variables.

engine = AVAudioEngine()
playerNode = AVAudioPlayerNode()
engine.attachNode(playerNode)
mixer = engine.mainMixerNode
engine.connect(playerNode, to: mixer, format: mixer.outputFormatForBus(0))

Then you need to start the engine.

var error:NSError?
if !engine.startAndReturnError(&error) {
   println("error couldn't start engine")
   if let e = error {
      println("error \(e.localizedDescription)")
   }
}

Cool. Silence.

Let's give it something to play. It can be an audio file, or as we'll see, a MIDI file or a computed buffer. In this example we create an AVAudioFile instance from an MP3 file, and tell the playerNode to play it.

First, load an audio file. If you know the format of the file you can provide hints.

let fileURL = NSBundle.mainBundle().URLForResource("modem-dialing-02", withExtension: "mp3")
var error: NSError?
let audioFile = AVAudioFile(forReading: fileURL, error: &error)
// OR
//let audioFile = AVAudioFile(forReading: fileURL, commonFormat: .PCMFormatFloat32, interleaved: false, error: &error)
if let e = error {
   println(e.localizedDescription)
}

Now hand the audio file to the player node by “scheduling” it, then playing it.

engine.connect(playerNode, to: engine.mainMixerNode, format: audioFile.processingFormat)
playerNode.scheduleFile(audioFile, atTime:nil, completionHandler:nil)
if engine.running {
   playerNode.play()
} else {
   if !engine.startAndReturnError(&error) {
      println("error couldn't start engine")
      if let e = error {
         println("error \(e.localizedDescription)")
      }
   } else {
      playerNode.play()
   }
}


Playing MIDI Notes

How about triggering MIDI notes/events based on UI events? You need an instance of AVAudioUnitMIDIInstrument among your nodes. There is one concrete subclass named AVAudioUnitSampler. Create a sampler and attach it to the engine.

sampler = AVAudioUnitSampler()
engine.attachNode(sampler)
engine.connect(sampler, to: engine.outputNode, format: nil)

At init time, create a URL to your SoundFont or DLS file as we did previously.

soundbank = NSBundle.mainBundle().URLForResource("GeneralUser GS MuseScore v1.442", withExtension: "sf2")

Then in your UI’s action function, load the appropriate instrument into the sampler. The program parameter is a General MIDI instrument number. You might want to set up constants. Soundbanks have banks of sound. You need to specify which bank to use with the bankMSB and bankLSB. I use a Core Audio constant here to choose the “melodic” bank and not the “percussion” bank.

// probably instance variables
let melodicBank:UInt8 = UInt8(kAUSampler_DefaultMelodicBankMSB)
let gmMarimba:UInt8 = 12
let gmHarpsichord:UInt8 = 6
 
// then in the action
var error:NSError?
if !sampler.loadSoundBankInstrumentAtURL(soundbank, program: gmHarpsichord,
            bankMSB: melodicBank, bankLSB: 0, error: &error) {
   println("could not load soundbank")
}
if let e = error {
   println("error \(e.localizedDescription)")
}

Then send a MIDI program change to the sampler. After that, you can send startNote and stopNote messages to the sampler. You need to match the parameters for each start and stop message.

self.sampler.sendProgramChange(gmHarpsichord, bankMSB: melodicBank, bankLSB: 0, onChannel: 0)
// play middle C, mezzo forte on MIDI channel 0
self.sampler.startNote(60, withVelocity: 64, onChannel: 0)
...
// in another action
self.sampler.stopNote(60, onChannel: 0)


Summary

This is a good start I hope. There are other things I’ll cover soon, such as generating and processing the audio buffer data.

Resources



Swift Dropbox quick tip


Quick Tip

If you’re going to use the Dropbox API in your Swift app, you will need a bridging header. If you don’t have one, just create a dummy Objective-C class and you will be prompted to have one created for you. Then delete the dummy class.

Then add DropboxSDK.h to the bridging header. Blammo: syntax errors in the framework. It doesn't know about things like NSCoding. The current headers have this all over the place: #ifdef __OBJC__. Well, we're not in Objective-C anymore, Toto.

So, add a few more imports like this to the bridging header:

#import <UIKit/UIKit.h>
#import <Foundation/Foundation.h>
#import <DropboxSDK/DropboxSDK.h>

Swift documentation


In the release notes for Xcode 6 beta 5, they mention that they are using reStructuredText (quick reference) for Javadoc-style documentation.

It has a long way to go, but it’s a start.

Like Java, you can create documentation blocks like this:

/** 
whatever
*/

N.B. For old-timey Objective-C guys, HeaderDoc uses /*! instead of /**

Or you can use three virgules (///) at the beginning of a line for single-liners. Open the Quick Help inspector to see the comments, or Option-click on a variable/func of the type you're documenting.
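For example, a single-liner (illustrative):

/// The number of litter boxes that still need cleaning.
var dirtyLitterBoxes = 0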

The notes state

Currently only block-level markup is supported (nested bullet and enumerated lists, field lists) [are supported].

Let’s see some field lists

/** 
This is a utility class to help clean the litterbox.
:Author: Gene De Lisa
:Version: 1.0 of 2014/08/05
:Dedication: To my cat, Giacomo.
*/
class LitterBox {
...

How about bullet lists? Yes they work.

/**
The metadata retrieved from the ipod library.
 
- albumTitle
- songs
- artwork
*/
struct AlbumInfo {
...

Enumerated lists? Not so much, even though they say they do. And other formatting like *bold* and **bolder** doesn’t work (they didn’t say it would yet).

You can use field lists for param and returns like this:

 /**
    Queries the library for an artist name containing the parameter
 
    :param: artist The artist name
 
    :returns: Nothing. The delegate is notified.
    */
    func albums(artist:String) {
...

Formatting code in your comments? I don’t see anything yet. Also, in build settings, you can ask to be warned about Documentation Comments that are invalid. That doesn’t work for these comments yet.
