iOS: trimming audio files

I’ve written about how to record audio on iOS using Swift.
But how do you trim the recording?


Introduction


One way to trim an audio file is to use AVFoundation’s AVAssetExportSession. You create an export session instance, set its parameters, and then tell it to export the asset (your audio file) according to those parameters.

In the recorder project, we saved the URL of the recording in the soundFileURL instance variable. To use AVAssetExportSession we need to create an AVAsset from it. Here is a simple action that creates the asset and then calls the export function we will discuss next.
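Something like this, assuming soundFileURL was set by the recorder and an export function named exportAsset (my names, not gospel):

```swift
import AVFoundation

@IBAction func trimAudio(sender: AnyObject) {
    if let url = soundFileURL {
        let asset = AVURLAsset(URL: url, options: nil)
        exportAsset(asset, fileName: "trimmed.m4a")
    }
}
```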

Now to define the export func.

You create the exporter from your asset and desired file format. Here I’m using the Apple lossless format.
Then I set the exporter’s outputURL property to a file URL in the documents directory. This will be the location of the trimmed audio file.

I create a core media time range using CMTimeRangeFromTimeToTime that specifies the time offsets for the beginning and ending for the trimmed file. Here I just hard code the values, but of course you’d use a slider or a waveform view to choose the time boundaries.

While you’re there, you can also specify an AVMutableAudioMix for the volume. You can even specify a volume ramp.

Once the exporter’s properties are set, you call exportAsynchronouslyWithCompletionHandler to do the actual work. You can check the status of the export in the completion handler.
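Putting all of that together, here is a sketch of the whole export func. I'm assuming the stock Apple M4A preset and hard-coded five-to-ten second boundaries; adjust to taste.

```swift
import AVFoundation
import CoreMedia

func exportAsset(asset: AVAsset, fileName: String) {
    let documentsDirectory = NSFileManager.defaultManager()
        .URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0]
    let trimmedSoundFileURL = documentsDirectory.URLByAppendingPathComponent(fileName)

    guard let exporter = AVAssetExportSession(asset: asset,
        presetName: AVAssetExportPresetAppleM4A) else {
            print("cannot create export session")
            return
    }
    exporter.outputFileType = AVFileTypeAppleM4A
    exporter.outputURL = trimmedSoundFileURL

    // The time range to keep: from 5 seconds to 10 seconds.
    let startTime = CMTimeMake(5, 1)
    let stopTime = CMTimeMake(10, 1)
    exporter.timeRange = CMTimeRangeFromTimeToTime(startTime, stopTime)

    // Optional: an audio mix with a volume ramp (here, a fade-out over the segment).
    if let track = asset.tracksWithMediaType(AVMediaTypeAudio).first {
        let mixParameters = AVMutableAudioMixInputParameters(track: track)
        mixParameters.setVolumeRampFromStartVolume(1.0, toEndVolume: 0.0,
            timeRange: exporter.timeRange)
        let mix = AVMutableAudioMix()
        mix.inputParameters = [mixParameters]
        exporter.audioMix = mix
    }

    exporter.exportAsynchronouslyWithCompletionHandler {
        switch exporter.status {
        case .Failed:
            print("export failed: \(exporter.error)")
        case .Cancelled:
            print("export cancelled")
        default:
            print("export complete: \(trimmedSoundFileURL)")
        }
    }
}
```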

Groovy, huh?


Summary


To trim an audio (or video) file, use AVFoundation’s AVAssetExportSession.



Swift 2 OptionSetType

Swift 2/iOS 9 broke my calendar/date code.
What’s going on?


Introduction


If you have an app that does anything with dates, you most likely have code someplace like this.
You get the date’s components, do something with them, and construct a new date from those components.
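Something like this Swift 1.x sketch that adds a week:

```swift
// Pre-Swift 2: calendar units were bitwise OR-ed together.
let cal = NSCalendar.currentCalendar()
let flags: NSCalendarUnit = .CalendarUnitYear | .CalendarUnitMonth | .CalendarUnitDay
let components = cal.components(flags, fromDate: NSDate())
components.day = components.day + 7
if let nextWeek = cal.dateFromComponents(components) {
    println("a week from now: \(nextWeek)")
}
```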

Broken in iOS 9.
So, the first thought I had was “Did they change the damn names again?” (MonthCalendarUnit for example was previously renamed CalendarUnitMonth).
No, that’s not it. Sort of.

What they did away with is the bitwise OR-ing of components.

You do this now to specify the components.
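The same date arithmetic, sketched in Swift 2:

```swift
// Swift 2: pass an option set literal instead of OR-ed flags.
let cal = NSCalendar.currentCalendar()
let components = cal.components([.Year, .Month, .Day], fromDate: NSDate())
components.day += 7
if let nextWeek = cal.dateFromComponents(components) {
    print("a week from now: \(nextWeek)")
}
```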

So, MonthCalendarUnit -> CalendarUnitMonth -> Month?
There’s a “bit” more to it.


RawOptionSetType


The “old” NSCalendarUnit is a RawOptionSetType.
Here is an example definition of options for a hoagie (not a sub, in spite of what The Google says; subs are inferior. Hey, the name even implies “sub”standard!).
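A Swift 1.x sketch; the options themselves are hypothetical, the boilerplate is the point:

```swift
struct HoagieOptions : RawOptionSetType {
    private var value: UInt = 0
    init(_ value: UInt) { self.value = value }
    init(rawValue value: UInt) { self.value = value }
    init(nilLiteral: ()) { self.value = 0 }
    var rawValue: UInt { return self.value }
    static var allZeros: HoagieOptions { return self(0) }

    // Each option is a bitmask created by shifting 1.
    static var None: HoagieOptions    { return self(0) }
    static var Onions: HoagieOptions  { return self(1 << 0) }
    static var Peppers: HoagieOptions { return self(1 << 1) }
    static var Pickles: HoagieOptions { return self(1 << 2) }
    static var Oil: HoagieOptions     { return self(1 << 3) }
}
```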

There are some boilerplate setup funcs because of the inheritance thing, and then the individual options are defined by bit-shifting 1 to create a bitmask.

This is how you would use these options. To set an option, you bitwise-or them into a variable.
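```swift
// Setting options the bitwise way.
var options = HoagieOptions.Onions | HoagieOptions.Oil
options |= HoagieOptions.Peppers
```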

You can see if an option is set by doing a bitwise-and like this.
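```swift
// Testing an option the bitwise way.
if options & HoagieOptions.Onions != nil {
    println("extra onions")
}
```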

Just like everywhere else you deal with bits: OR to set, AND to test, shift to create masks.


OptionSetType


The new! and improved! way in Swift 2 is to use OptionSetType instead of RawOptionSetType.
No bits, just values. You can even define “aggregate” options like “TheWorks” in this example.
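A sketch of the same hoagie options as an OptionSetType. You still pick distinct bits for the raw values, but client code never touches them:

```swift
struct HoagieOptions : OptionSetType {
    let rawValue: Int

    static let Onions  = HoagieOptions(rawValue: 1 << 0)
    static let Peppers = HoagieOptions(rawValue: 1 << 1)
    static let Pickles = HoagieOptions(rawValue: 1 << 2)
    static let Oil     = HoagieOptions(rawValue: 1 << 3)

    // An "aggregate" option: just a set of the others.
    static let TheWorks: HoagieOptions = [Onions, Peppers, Pickles, Oil]
}
```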

And then to use them in our domain struct:
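```swift
// Set-style operations instead of bit twiddling.
var options: HoagieOptions = [.Onions, .Oil]
options.insert(.Peppers)
options.remove(.Oil)
if options.contains(.Onions) {
    print("extra onions")
}
let everything = HoagieOptions.TheWorks
print(everything.contains(.Pickles))  // true
```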

As you see, OptionSetType provides several funcs for operating on the options.
Is this easier than dealing with bits?
Well, I still use The Emacs, so I’m the wrong guy to ask.


Some other examples


Here are a few more places you might find these new option sets.

UIViewAutoresizing is an OptionSetType.
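```swift
// An option set literal where you used to OR flags together.
view.autoresizingMask = [.FlexibleWidth, .FlexibleHeight]
```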

UIUserNotificationType is an OptionSetType.
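```swift
let types: UIUserNotificationType = [.Alert, .Badge, .Sound]
let settings = UIUserNotificationSettings(forTypes: types, categories: nil)
UIApplication.sharedApplication().registerUserNotificationSettings(settings)
```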

You will find many more examples when you try to run your old pre-2.0 code.


Summary


OptionSetType has replaced RawOptionSetType.
You don’t have to deal with bit level operations now.
At least, not for options.


Resources


  • Github repository
  • OptionSetType documentation
  • SetAlgebraType protocol

Do you want a really good hoagie?
Amato Brothers Deli
They aren’t paying me for this plug. I just eat there all the time.


Swift 2 AVAudioSequencer

There’s a brand new MIDI sequencer class in Swift 2 beta! It’s the AVAudioSequencer.


Introduction


At WWDC15 there was a presentation entitled “What’s New in Core Audio”. If you were able to get past the first 29 minutes of a poorly structured presentation delivered by a robotic mumbling developer who just reads the lines in the slides (just like 90% of other WWDC presentations), you heard about this. But then, just like every other WWDC presentation, there were incomplete code snippets.

So can we get this to work?


Sequencer setup


You can play a MIDI file with the old AVMIDIPlayer class. I published a post on this back in the stone age of Swift.

Here is the old Swift 1.2 code to create one. (Swift 2 has updated the old NSError cha cha.)
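A sketch, assuming a MIDI file and a SoundFont in the bundle (both names are mine):

```swift
// Swift 1.2 style: NSError in/out instead of try/catch.
var player: AVMIDIPlayer!
if let midiURL = NSBundle.mainBundle().URLForResource("sibelius", withExtension: "mid"),
       bankURL = NSBundle.mainBundle().URLForResource("GeneralUser", withExtension: "sf2") {
    var error: NSError?
    player = AVMIDIPlayer(contentsOfURL: midiURL, soundBankURL: bankURL, error: &error)
    if player == nil {
        println("could not create MIDI player: \(error?.localizedDescription)")
    } else {
        player.prepareToPlay()
        player.play(nil)
    }
}
```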

Swift 2 now has a new AVAudioSequencer class.
Woo Hoo!

Ok, let’s make an AVAudioSequencer!
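A sketch; the file name is mine, and the engine comes from the next section:

```swift
let sequencer = AVAudioSequencer(audioEngine: engine)

if let fileURL = NSBundle.mainBundle().URLForResource("sibelius", withExtension: "mid") {
    do {
        try sequencer.loadFromURL(fileURL, options: .SMF_PreserveTracks)
        print("loaded \(fileURL)")
    } catch {
        print("could not load the file: \(error)")
    }
}

sequencer.prepareToPlay()
do {
    try sequencer.start()
} catch {
    print("could not start the sequencer: \(error)")
}
```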

I’ll talk about the AVAudioEngine set up next.

So, I load a standard MIDI file that I created in Sibelius, tell the sequencer to read it, then start the sequencer. The API doesn’t look too bad at this point.


AVAudioEngine setup


Let’s create the engine. According to the presentation, there doesn’t need to be much more than a sampler in the engine and the totally groovy new AVAudioSequencer will find it.
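```swift
// Minimal graph, per the session: a sampler into the engine's main mixer.
let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()
engine.attachNode(sampler)
engine.connect(sampler, to: engine.mainMixerNode, format: nil)
do {
    try engine.start()
} catch {
    print("error starting the engine: \(error)")
}
```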

That’s all – according to the ‘AudioEngine’er’s presentation.
Good to go, right?
Wrong.
That should be it. But it’s not.
What do you get?

The ‘rioc’ is the outputNode. See it?
See that the mixer, ‘mcmx’, is an input to it?
See that the sampler, ‘samp’, is connected to the mixer?
See that the formats are all the same?
The processing graph looks ok. Right?

But then….

So, “required condition is false: outputNode”.
BTW, not to be a grammar nazi, but where is the predicate in that sentence? outputNode what? It’s nil? It’s not there? It’s drunk? outputNode what?

I see the node in the graph. So, what’s the problem?

I have no idea. There is no place to look either.

I’ve tried loading the soundbank – or not.

I’ve set up the session for playback – or not.
I’ve tried with the engine running – or not.
I’ve tried with different MIDI files.
I’ve tried just connecting the sampler to the outputNode. No luck. Shouldn’t have to do that anyway.

Bleah.


AVMusicTrack


Ok, let’s try the spiffy new AVMusicTrack which is so full of grooviosity that we can retire the old worn out MusicTrack from the AudioToolbox.
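Roughly what I tried (the property names are straight out of AVAudioSequencer.h):

```swift
// The attempt: grab the first AVMusicTrack and poke at it.
if let track = sequencer.tracks.first {
    track.muted = false
    track.loopingEnabled = true
    track.destinationAudioUnit = sampler
}
```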

Right. No such luck.

I see it defined right there in AVAudioSequencer.h.
Yes, the frameworks are in the project.

Show stopper.

I see that AVAudioSequencer’s definition is preceded by the availability macro, but AVMusicTrack doesn’t have one.
Is that the problem?
I’m guessing and don’t have access to try it.

So, once that’s fixed, is there an easy API to add/remove all kinds of MIDI events? You know, channel messages, sysex, controllers, metatext etc.?

Nope.

Nothing like that.
Mute, solo, length, set a destination, looping.

Sigh.


Summary


So now we can do away with the AudioToolbox MusicSequence?
Nope.

So now we can do away with the AudioToolbox MusicTrack?
Nope.

So now we can connect to an AVAudioEngine without messing around with AudioUnits?
Nope.

So can we code a simple hello world with this API?
Nope.

Now that’s what I call progress.



Swift 2 and CoreMIDI

Swift 1.2 is notorious for its lack of support for C function pointers, which you need for C APIs such as Core MIDI. I’ve complained^H^H^H^H^H written about this several times.

Introduction

Well, praise Zeus, Thor, and the FSM: in the Swift 2 beta, Core MIDI can be used without function pointers or trampolines.


MIDI Client


The first step in using CoreMIDI is to create a MIDIClient reference. To do that, you use MIDIClientCreate(), which takes a function pointer among other parameters. That is the problem: Swift 1.2 barfs on C function pointers.

The solution in the Swift 2 beta is the introduction of MIDIClientCreateWithBlock(), which takes a closure of type MIDINotifyBlock. This is quite similar to the Objective-C trampoline (but without the Objective-C). Take a look at the naming in my old trampoline code and tell me I’m not a mind reader :)
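A sketch; the client name is mine, and myNotifyBlock is defined a bit further down:

```swift
var midiClient = MIDIClientRef()
let status = MIDIClientCreateWithBlock("MyMIDIClient", &midiClient, myNotifyBlock)
if status != 0 {
    print("error creating MIDI client: \(status)")
}
```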

Groovy.
Well, sort of.
What are you going to do with that MIDINotification?

The different types of notifications are different unrelated structures.

You cannot downcast these via as?

Previously, I simply used unsafeBitCast. That no longer works directly. You need to drop down to pointer reinterpretation, something like this:
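```swift
// Sketch: check the messageID, then reinterpret the pointer as the concrete
// notification struct (the message types are unrelated structs, hence the dance).
let myNotifyBlock: MIDINotifyBlock = { midiNotification in
    let notification = midiNotification.memory
    print("got a MIDI notification, messageID \(notification.messageID)")

    switch notification.messageID {
    case .MsgObjectAdded, .MsgObjectRemoved:
        let m = UnsafePointer<MIDIObjectAddRemoveNotification>(midiNotification).memory
        print("object added/removed, child type \(m.childType)")
    case .MsgSetupChanged:
        print("setup changed")
    default:
        break
    }
}
```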

Yeah, I know.
Gross.
But it works.

By the way, MIDIClientCreate has been updated too, as have the definitions of the callbacks.


Input Port


Similarly, MIDIInputPortCreate(), which also takes a function pointer (to a read proc), has been upgraded to MIDIInputPortCreateWithBlock(), which takes a MIDIReadBlock.
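```swift
// Sketch: create the input port with a read block (myReadBlock is defined next).
var inputPort = MIDIPortRef()
let result = MIDIInputPortCreateWithBlock(midiClient, "MyClient In", &inputPort, myReadBlock)
```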

And the read block:
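```swift
let myReadBlock: MIDIReadBlock = { packetList, srcConnRefCon in
    let packets = packetList.memory
    var packet = packets.packet
    for _ in 0 ..< packets.numPackets {
        handle(packet)  // handle() is sketched below
        // Note: this copies each packet out of the list; for multi-packet
        // lists a pointer-based walk is safer, but this is the common idiom.
        packet = MIDIPacketNext(&packet).memory
    }
}
```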

As you can see, iterating trough the packets has not been changed much. Swift does have MIDIPacketNext now though. It was MIA previously.

Personally, I think that if it were a SequenceType it would be much easier. You could do this:
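```swift
// The dream (NOT real API; this does not compile):
for packet in packetList {
    handle(packet)
}
```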

Dream on.

Instead, you have to do some pointer nonsense. And a packet’s data is a tuple, not an array. This example is one way to navigate the list. If you have something more elegant, let me know.
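```swift
// One way in: the packet's data field is a 256-element tuple of UInt8,
// so take a pointer to it and index that.
func handle(packet: MIDIPacket) {
    var p = packet
    withUnsafePointer(&p.data) { ptr -> Void in
        let bytes = UnsafePointer<UInt8>(ptr)
        for i in 0 ..< Int(p.length) {
            print("byte \(bytes[i])")
        }
    }
}
```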

MIDIInputPortCreate() and the read function have also been updated in Swift 2 beta.


Virtual Destination


To create a Virtual Destination, there is now MIDIDestinationCreateWithBlock() which also takes a MIDIReadBlock.
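```swift
// Sketch: a virtual destination other apps can send to, reusing the read block.
var virtualDest = MIDIEndpointRef()
let destStatus = MIDIDestinationCreateWithBlock(midiClient, "MyApp Dest",
    &virtualDest, myReadBlock)
```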

Remember that you have to edit your Info.plist and add the key “Required Background Modes” with the value set to audio (It will say “App plays audio or streams audio/video using AirPlay”) to create virtual destinations.


Summary


Several new CoreMIDI functions and types are available in the Swift 2 beta. Here they are:

  • MIDIClientCreateWithBlock()
  • MIDIInputPortCreateWithBlock()
  • MIDIDestinationCreateWithBlock()
  • MIDIReadBlock
  • MIDINotifyBlock
  • MIDIPacketNext()

As always, there is my complete working example project on Github.

Have fun.



Swift MIDI Trampoline

Swift does not support C function pointers. I’ve written about that a few times.

So, what do you do if you need to use a C API that relies on C callbacks? Core MIDI is just one example of C APIs that rely on function pointers.

Introduction

Core MIDI has three C callbacks: one to read incoming MIDI data, one to be notified when an asynchronous Sysex send completes, and one to be notified when MIDI inputs or outputs have changed. You can write a pure Swift app using MIDI if you only want to send data, but not if you want to read it.

To read incoming MIDI data (or use the other callbacks), you need to write the callbacks in Objective-C. Then if you want to handle the data in Swift, you give the Objective-C callback a Swift closure. So, the Objective-C code is like a trampoline – it hits the Objective-C code then bounces the data to Swift.


Notify Callback

To use Core MIDI, the first thing you need to do is create the client reference by calling MIDIClientCreate. This function takes the notification callback as a parameter.

Core MIDI defines the notification callback like this:
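```objc
// From CoreMIDI/MIDIServices.h:
typedef void (*MIDINotifyProc)(const MIDINotification *message, void *refCon);
```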

We will write this callback in Objective-C.

MIDIClientCreate also takes a void pointer (notifyRefCon) that will be passed into your callback as the second parameter. We will use this parameter to pass in a Swift closure. The callback will then invoke the Swift closure.

So, let’s wrap MIDIClientCreate with an Objective-C utility that will register the callback. This utility will take a block (your Swift closure) as a parameter. The signature of the Swift closure is specified here: it takes a MIDINotification as a parameter. This utility will copy the block and store it in the refCon passed to MIDIClientCreate. The trampoline will convert it back to a block and invoke it.
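A sketch with hypothetical names; the block type mirrors the Swift closure’s shape:

```objc
typedef void (^MyMIDINotifyBlock)(const MIDINotification *notification);

static void MyMIDINotifyProc(const MIDINotification *message, void *refCon);

OSStatus MyMIDIClientCreate(MIDIClientRef *outClient, MyMIDINotifyBlock block) {
    // Copy the block and stash it in the refCon; the trampoline pulls it back out.
    // (__bridge_retained keeps it alive for the life of the client.)
    return MIDIClientCreate(CFSTR("MyMIDIClient"),
                            MyMIDINotifyProc,
                            (__bridge_retained void *)[block copy],
                            outClient);
}
```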

This is the trampoline. The void pointer refCon is converted back into a block and then invoked with the notification.
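```objc
// The trampoline: the refCon is really the copied block.
static void MyMIDINotifyProc(const MIDINotification *message, void *refCon) {
    MyMIDINotifyBlock block = (__bridge MyMIDINotifyBlock)refCon;
    block(message);
}
```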

Here is the Swift code to call the Objective-C utility. myNotifyCallback is the Swift function that will be the callback.
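```swift
// Assuming the utility is exposed via the bridging header.
var midiClient = MIDIClientRef()
MyMIDIClientCreate(&midiClient, myNotifyCallback)
```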

The Swift callback is defined with the single MIDINotification parameter like this.
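```swift
func myNotifyCallback(notification: UnsafePointer<MIDINotification>) {
    println("got a MIDI notification, messageID \(notification.memory.messageID)")
}
```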

Cool, huh?
More like a pain in the neck, IMHO. But this is what you have to do until Swift supports function pointers.


Read callback

You specify the read callback when you create the MIDI Input Port via MIDIInputPortCreate.

This is how the read callback is defined:
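```objc
// From CoreMIDI/MIDIServices.h:
typedef void (*MIDIReadProc)(const MIDIPacketList *pktlist,
                             void *readProcRefCon,
                             void *srcConnRefCon);
```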

Notice that it also has a void pointer parameter that we will use to point to a Swift closure.

So, here is the Objective-C utility to create the input port. It uses the same pattern as the notify function. The last parameter readRefCon will be your Swift closure.
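Again, hypothetical names; the block receives one MIDIPacket at a time:

```objc
typedef void (^MyMIDIReadBlock)(const MIDIPacket *packet);

static void MyMIDIReadProc(const MIDIPacketList *pktlist,
                           void *readProcRefCon,
                           void *srcConnRefCon);

OSStatus MyMIDIInputPortCreate(MIDIClientRef client,
                               MIDIPortRef *outPort,
                               MyMIDIReadBlock block) {
    return MIDIInputPortCreate(client,
                               CFSTR("MyMIDIInputPort"),
                               MyMIDIReadProc,
                               (__bridge_retained void *)[block copy],
                               outPort);
}
```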

The read trampoline converts the void pointer back into a block that is then invoked.
I’m not passing the MIDIPacketList back to Swift, because Swift is not able to handle it. We have to iterate through it in Objective-C. (If you know a way to iterate through it in Swift, let me know!). So, the callback simply takes each MIDIPacket‘s data.
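```objc
// The read trampoline: iterate the packet list here, in Objective-C,
// and hand each packet to the block.
static void MyMIDIReadProc(const MIDIPacketList *pktlist,
                           void *readProcRefCon,
                           void *srcConnRefCon) {
    MyMIDIReadBlock block = (__bridge MyMIDIReadBlock)readProcRefCon;
    const MIDIPacket *packet = &pktlist->packet[0];
    for (UInt32 i = 0; i < pktlist->numPackets; i++) {
        block(packet);
        packet = MIDIPacketNext(packet);
    }
}
```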

Here is how you call it from Swift.
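```swift
var inputPort = MIDIPortRef()
MyMIDIInputPortCreate(midiClient, &inputPort, myReadCallback)
```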

And your Swift callback will be something like this.
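```swift
func myReadCallback(packet: UnsafePointer<MIDIPacket>) {
    let p = packet.memory
    println("packet at \(p.timeStamp), \(p.length) bytes")
}
```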


Summary

Yeah, I know. You have to jump through hoops. That’s just the status of Swift and C APIs at this point.


Disabled UIToolbar appearance

A short tip on how to show that a UIToolbar is disabled.

Introduction

I wanted to disable a UIToolbar and then enable it later via an In App Purchase. So, you just loop over your bar buttons and disable them, right? Well, no.
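The obvious attempt, assuming a toolbar outlet:

```swift
if let items = toolbar.items {
    for item in items {
        item.enabled = false
    }
}
```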

That does indeed disable the buttons. They are not responsive if you press them. But there is no visual cue.

So, let’s go back to the button creation code. After you create the button, you can call setTitleTextAttributes on it and specify separate NSAttributedString attributes for the normal and disabled states.
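A sketch with per-state attributes (the button itself is hypothetical):

```swift
let button = UIBarButtonItem(title: "Play", style: .Plain, target: self, action: "play:")
let normalAttributes = [NSForegroundColorAttributeName: UIColor.blackColor()]
let disabledAttributes = [NSForegroundColorAttributeName: UIColor.lightGrayColor()]
button.setTitleTextAttributes(normalAttributes, forState: .Normal)
button.setTitleTextAttributes(disabledAttributes, forState: .Disabled)
```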

Great!

Doesn’t work. If you know why, let me know!


How to do it

So, I tried something different. How about setting the tint color?
I tried this for the disabled appearance.
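```swift
// The class-level appearance proxy.
UIBarButtonItem.appearance().tintColor = UIColor.whiteColor()
```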

It does look disabled. The white isn’t exactly what I want, but it does take effect.
Actually it takes effect too well. The toolbars for every view controller are now white. That’s how the appearance proxy on the class level works.

The fix is easy enough; just set the individual button’s tintColor.
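```swift
if let items = toolbar.items {
    for item in items {
        item.tintColor = UIColor.whiteColor()
    }
}
```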

Now about that color. Of course you can set up your own set of colors. But let’s say you want to stay with the default system color. How do you get that? Unfortunately there is no direct way to find it. You have to get it from an unaltered UIView.

Here is how I get the default (blue) color.
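```swift
// A plain UIView that hasn't been added to the hierarchy still has the default tint.
let defaultTintColor = UIView().tintColor
```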

So, to enable the toolbar:
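```swift
func enableToolbar() {
    if let items = toolbar.items {
        for item in items {
            item.enabled = true
            item.tintColor = defaultTintColor
        }
    }
}
```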

To disable the toolbar, I just set the alpha on the default color to something < 1.
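```swift
func disableToolbar() {
    let dimmed = defaultTintColor.colorWithAlphaComponent(0.5)
    if let items = toolbar.items {
        for item in items {
            item.enabled = false
            item.tintColor = dimmed
        }
    }
}
```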


Summary

One easy way to set a disabled appearance is to set the tintColor property of the buttons.
There may be a way to set the appearance through NSAttributedString attributes, but that didn’t work for me.


Scroll to UICollectionView header in Swift

Introduction

One of the things that the UICollectionView lacks is the UITableView’s index title capability. You have to roll your own with the UICollectionView. I’ll leave that UI frob to you, but this quick post will show you how to scroll to show a section header.


How to

Here is the first cut at scrolling to the second section (section 1). The problem is, it scrolls to the first item and the header is not shown. If that’s what you want, you can stop reading.
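```swift
// First cut: this scrolls to the section's first item, not its header.
let indexPath = NSIndexPath(forItem: 0, inSection: 1)
collectionView.scrollToItemAtIndexPath(indexPath,
    atScrollPosition: .Top, animated: true)
```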

As usual, the “answers” at StackOverflow are mostly in the weeds. I don’t want to get locations directly from the layout delegate for example.

What you want to do is to ask the collection view for the layout attributes for your header view.

You can do this with a single function call:
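```swift
let indexPath = NSIndexPath(forItem: 0, inSection: 1)
let attributes = collectionView.layoutAttributesForSupplementaryElementOfKind(
    UICollectionElementKindSectionHeader, atIndexPath: indexPath)
```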

You can then use the attributes’ frame property to determine the header’s location. With that in hand, you can call setContentOffset() to scroll.
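```swift
// Scroll so the header's frame is at the top, accounting for any content inset.
if let attributes = attributes {
    let offsetY = attributes.frame.origin.y - collectionView.contentInset.top
    collectionView.setContentOffset(CGPoint(x: 0, y: offsetY), animated: true)
}
```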


Summary

Don’t mess around with doing the math yourself. Ask the collection view for the header’s attributes and use that frame to scroll.

Here is a Gist.


Xcode 6.3 woes

This is a note to myself, but it may be of help to you.

I downloaded Xcode 6.3 final and also installed OS X 10.10.3. No problems with the installation.
The next day, I fired up Xcode and it froze. The spinning beachball of misery. It happened repeatedly. I had to force quit.

What to do?

How I did it

The first thing I did was to delete the derived data directories in
~/Library/Developer/Xcode/DerivedData. That didn’t fix it.

So, how to open Xcode and have it not reopen any projects that it had open?
It seems there are a few ways.

If Xcode is in your dock, you can option-shift click on it and it won’t open any projects.

Or, you can get rid of all your autosave info like this.
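Xcode’s saved window state lives under ~/Library/Saved Application State (at least in this release), so:

```
rm -rf ~/Library/Saved\ Application\ State/com.apple.dt.Xcode.savedState
```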

Finally, you can open Xcode from the command line with this incantation.
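```
open -a Xcode --args -ApplePersistenceIgnoreState YES
```

ApplePersistenceIgnoreState is the standard Cocoa defaults key for skipping state restoration.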


Summary

Open Xcode without opening any projects. That might fix the problem.
So far so good. I can get back to work now.


Swift and C API callbacks

Swift is supposed to have been designed for compatibility with existing Apple APIs, including C APIs such as CoreMIDI.

Introduction

Core MIDI is Apple’s C API for (surprise) MIDI. Apple provides no higher level bindings besides a player in AVFoundation. If you want to develop an app with MIDI capabilities, you need to use Core MIDI.

Like many C APIs, there are several callbacks:

  • MIDICompletionProc
  • MIDINotifyProc
  • MIDIReadProc

In context, the very first thing you’d do is to create a MIDI client. In the Swift 1.2 beta, MIDIClientCreate finally works! See my posts on the problem and the solution.

One of the parameters to MIDIClientCreate is a function pointer (MIDINotifyProc) to be called when MIDI inputs or outputs in the system have changed, such as plugging in a new device.

Another place where callbacks are used is reading MIDI input. A MIDIReadProc is called when data appears.


Creating Function Pointers

Ok, so how do you make a CFunctionPointer?
Here is my attempt.
It looks like CFunctionPointer needs a COpaquePointer which needs an UnsafeMutablePointer or UnsafePointer.

BTW, the CFunctionPointer source code has this amusing comment:
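It reads something like this (quoting from memory, so check your SDK):

```swift
/// Though not directly useful in Swift, `CFunctionPointer<T>` can be
/// used to safely pass a C function pointer, received from one C or
/// Objective-C API, to another C or Objective-C API.
```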

CFunctionPointer is a struct with inits that will take an UnsafeMutablePointer or an UnsafePointer.

So, it looks like we need to make an unsafe pointer, use that to create an opaque pointer, then use that to create a function pointer (and a partridge in a pear tree).

Will this work with a MIDIReadProc?
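My attempt, sketched: unsafe pointer, then opaque pointer, then function pointer.

```swift
func MyMIDIReadProc(pktlist: UnsafePointer<MIDIPacketList>,
    readProcRefCon: UnsafeMutablePointer<Void>,
    srcConnRefCon: UnsafeMutablePointer<Void>) {
    println("got a packet list")
}

typealias MIDIReadProcType = (UnsafePointer<MIDIPacketList>,
    UnsafeMutablePointer<Void>, UnsafeMutablePointer<Void>) -> Void

// UnsafeMutablePointer -> COpaquePointer -> CFunctionPointer.
let pointer = UnsafeMutablePointer<MIDIReadProcType>.alloc(1)
pointer.initialize(MyMIDIReadProc)
let opaque = COpaquePointer(pointer)
let readProc = CFunctionPointer<MIDIReadProcType>(opaque)
```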

It doesn’t work with Core MIDI, though.
And it doesn’t matter if MyMIDIReadProc is a func, a class func, or a global func.

Here is the love letter from Core MIDI:

Swift 2 update


They heard us. The Swift 2 beta has introduced changes in CoreMIDI and to function pointers in general.

Here are the new CoreMIDI callback signatures.

  • MIDICompletionProc
  • MIDINotifyProc
  • MIDIReadProc
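Roughly as imported in the beta (my transcription, so double-check against the header):

```swift
typealias MIDICompletionProc = @convention(c) (UnsafeMutablePointer<MIDISysexSendRequest>) -> Void

typealias MIDINotifyProc = @convention(c) (UnsafePointer<MIDINotification>,
    UnsafeMutablePointer<Void>) -> Void

typealias MIDIReadProc = @convention(c) (UnsafePointer<MIDIPacketList>,
    UnsafeMutablePointer<Void>, UnsafeMutablePointer<Void>) -> Void
```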


Summary

Swift support for C APIs is improving. Given this problem with function pointers, though, Swift 1.x was far from usable with Core MIDI; the Swift 2 beta finally addresses it.


Swift 1.2 beta2 and CoreMIDI

Someone is actually working on CoreMIDI with Swift.

The Problem

Prior to Swift 1.2, various CoreMIDI types defined in MIDIServices.h were architecture-dependent, like this:
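```c
// From MIDIServices.h (roughly):
#if __LP64__
typedef UInt32 MIDIObjectRef;
typedef MIDIObjectRef MIDIClientRef;
#else
typedef void *MIDIObjectRef;
typedef struct OpaqueMIDIClient *MIDIClientRef;
#endif
```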

This meant that you had to deal with nonsense like this:
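```swift
// The Swift 1.1-era dance: only do MIDI on 64-bit.
#if arch(arm64) || arch(x86_64)
    var midiClient = MIDIClientRef()
    // ... the rest of the MIDI setup
#else
    println("32-bit: no MIDI for you")
#endif
```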

I blogged about the problem with creating a MIDI client ref in a 32-bit architecture. (Hint: don’t try).

Now in the Swift 1.2 beta we get this:
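```swift
// The import, on every architecture:
typealias MIDIObjectRef = UInt32
typealias MIDIClientRef = MIDIObjectRef
typealias MIDIPortRef = MIDIObjectRef
typealias MIDIEndpointRef = MIDIObjectRef
```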

Those opaque structs are now history. Yay.

But…

Let’s look at the MIDI Services reference.

Here’s a gem. Note the return type.
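```swift
func MIDIGetNumberOfDevices() -> ItemCount
```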

You would use this like so:
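```swift
let numberOfDevices = MIDIGetNumberOfDevices()
for i in 0 ..< numberOfDevices {
    let device = MIDIGetDevice(i)
    println("device \(i): \(device)")
}
```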

ItemCount is defined in MacTypes. Or it was until now. Right now there is a comment for it but no definition.
D’oh!

Workaround: go ahead and define it yourself as it was pre-Swift 1.2.
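```swift
// As I recall, this is how it was imported before.
typealias ItemCount = UInt
```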

Or just don’t use it.


Summary

There is progress with using CoreMIDI from Swift.
There are still potholes though.
