Swift 2: AVFoundation to play audio or MIDI

There are many ways to play sound in iOS. Core Audio has been around for a while and it is very powerful. It is a C API, so using it from Objective-C and Swift is possible, but awkward. Apple has been moving towards a higher level API with AVFoundation. Here I will summarize how to use AVFoundation for several common audio tasks.

N.B. Some of these examples use new capabilities of iOS 8.

This is an updated version of my earlier Swift 1 blog post.

Playing an Audio file

Let’s start by loading an audio file with an AVAudioPlayer instance. There are several audio formats that the player will grok. I had trouble with a few MP3 files that played in iTunes or VLC, but caused a cryptic exception in the player. So, check your source audio files first.

If you want other formats, your Mac has a converter named afconvert. See the man page.
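
For example, something like this converts a WAV file to AAC in an M4A container (see the man page for the exact format flags you need):

    afconvert -f m4af -d aac input.wav output.m4a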

Let’s go step by step.

Get the file URL.
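
Something like this, assuming an MP3 named “sound.mp3” has been added to the app bundle (the file name is only an example):

    guard let fileURL = NSBundle.mainBundle().URLForResource("sound", withExtension: "mp3") else {
        fatalError("could not find the audio file in the bundle")
    }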

Create the player. You will need to make the player an instance variable. If you use a local variable, the player will be deallocated before you hear anything!

You can provide the player a hint for how to parse the audio data. There are several constants for file type UTIs you can use. For our MP3 file, we’ll use AVFileTypeMPEGLayer3.

Now configure the player. prepareToPlay() “pre-rolls” the audio file to reduce the start-up delay when you finally call play().
You can set the player’s delegate to track status.

To set the delegate you have to make a class implement the player delegate protocol. My class has the clever name “Sound”. The delegate protocol requires the NSObjectProtocol, so Sound is a subclass of NSObject.
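
Putting those steps together, a sketch of such a class might look like this (the fileURL is the one we looked up above):

    class Sound: NSObject, AVAudioPlayerDelegate {
        var player: AVAudioPlayer?

        func setup(fileURL: NSURL) {
            do {
                // create the player with the file type hint
                let player = try AVAudioPlayer(contentsOfURL: fileURL, fileTypeHint: AVFileTypeMPEGLayer3)
                player.delegate = self
                player.prepareToPlay()
                self.player = player
            } catch {
                print("could not create the player: \(error)")
            }
        }

        // one of the optional AVAudioPlayerDelegate callbacks
        func audioPlayerDidFinishPlaying(player: AVAudioPlayer, successfully flag: Bool) {
            print("finished playing, successfully: \(flag)")
        }
    }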

Finally, here are the transport controls, which can be called from an action.
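
For example:

    @IBAction func play(sender: AnyObject) {
        player?.play()
    }

    @IBAction func stop(sender: AnyObject) {
        player?.stop()
        player?.currentTime = 0
    }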

Audio Session

The Audio Session singleton is an intermediary between your app and the media daemon. Your app and all other apps (should) make requests to the shared session. Since we are playing an audio file, we should tell the session that this is our intention by requesting that its category be AVAudioSessionCategoryPlayback, and then make the session active. Do this in the code above, right before you call play() on the player.

Setting a session for playback.
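
Something along these lines, right before calling play():

    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(AVAudioSessionCategoryPlayback)
        try session.setActive(true)
    } catch {
        print("could not set up the audio session: \(error)")
    }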


Playing a MIDI file

You use AVMIDIPlayer to play standard MIDI files. Loading the player is similar to loading the AVAudioPlayer. You need to load a soundbank from a SoundFont or DLS file. The player also has a pre-roll prepareToPlay() function.

I’m not interested in copyright infringement, so I have not included either a DLS or SF2 file; do a web search for a GM SoundFont2 file. Either format is loaded in the same manner. I’ve tried the MuseScore SoundFont and it sounds OK. There is probably a General MIDI DLS on your OS X system already: /System/Library/Components/CoreAudio.component/Contents/Resources/gs_instruments.dls. Copy it into the project bundle if you want to try it.
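
A sketch, assuming a standard MIDI file named “song.mid” and a SoundFont named “soundfont.sf2” in the bundle (both names are just examples), with midiPlayer kept as an instance variable:

    guard let midiURL = NSBundle.mainBundle().URLForResource("song", withExtension: "mid"),
          bankURL = NSBundle.mainBundle().URLForResource("soundfont", withExtension: "sf2") else {
        fatalError("could not find the MIDI file or sound bank in the bundle")
    }

    do {
        let midiPlayer = try AVMIDIPlayer(contentsOfURL: midiURL, soundBankURL: bankURL)
        midiPlayer.prepareToPlay()
        self.midiPlayer = midiPlayer   // keep a strong reference, just as with AVAudioPlayer
        midiPlayer.play {
            print("finished playing the MIDI file")
        }
    } catch {
        print("could not create the MIDI player: \(error)")
    }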


Audio Engine

iOS 8 introduces a new audio engine which seems to be the successor to Core Audio’s AUGraph and friends. See my article on using these classes in Swift.

The new AVAudioEngine class is the analog to AUGraph. You create AVAudioNode instances and attach them to the engine. Then you start the engine to initiate data flow.

Here is an engine that has a player node attached to it. The player node is attached to the engine’s mixer. These are instance variables.
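
Something like this, with engine and playerNode as the instance variables:

    let engine = AVAudioEngine()
    let playerNode = AVAudioPlayerNode()

    // e.g. in init() or viewDidLoad()
    engine.attachNode(playerNode)
    engine.connect(playerNode, to: engine.mainMixerNode, format: nil)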

Then you need to start the engine.
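
For example:

    do {
        try engine.start()
    } catch {
        print("could not start the engine: \(error)")
    }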

Cool. Silence.

Let’s give it something to play. It can be an audio file, or as we’ll see, a MIDI file or a computed buffer.
In this example we create an AVAudioFile instance from an MP3 file, and tell the playerNode to play it.

First, load an audio file, or read the audio file into a buffer.
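
A sketch of both approaches, assuming the same bundled MP3 as before:

    do {
        let audioFile = try AVAudioFile(forReading: fileURL)

        // or read the whole file into a PCM buffer
        let format = audioFile.processingFormat
        let frameCount = AVAudioFrameCount(audioFile.length)
        let buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: frameCount)
        try audioFile.readIntoBuffer(buffer)
    } catch {
        print("could not load the audio file: \(error)")
    }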

Now we hand the buffer to the player node by “scheduling” it, then playing it.
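
Roughly, using the buffer from above:

    playerNode.scheduleBuffer(buffer, atTime: nil, options: .Loops, completionHandler: nil)
    playerNode.play()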

There are quite a few variations on scheduleBuffer. Have fun trying them out.


Playing MIDI Notes

How about triggering MIDI notes/events based on UI events? You need an instance of AVAudioUnitMIDIInstrument among your nodes. There is one concrete subclass named AVAudioUnitSampler. Create a sampler and attach it to the engine.
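
For example:

    let sampler = AVAudioUnitSampler()
    engine.attachNode(sampler)
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)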

In your UI’s action function, load the appropriate instrument into the sampler. The program parameter is a General MIDI instrument number; you might want to set up constants for these. Sound banks contain multiple banks of sounds, and you specify which one to use with bankMSB and bankLSB. I use Core Audio constants here to choose the “melodic” bank and not the “percussion” bank.
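
A sketch of such a load function, assuming a SoundFont named “soundfont.sf2” in the bundle (the name is only an example); the kAUSampler constants come from AudioToolbox:

    func loadPatch(program: UInt8) {
        guard let bankURL = NSBundle.mainBundle().URLForResource("soundfont", withExtension: "sf2") else {
            fatalError("could not find the sound bank in the bundle")
        }
        do {
            try sampler.loadSoundBankInstrumentAtURL(bankURL,
                program: program,
                bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
                bankLSB: UInt8(kAUSampler_DefaultBankLSB))
        } catch {
            print("could not load the instrument: \(error)")
        }
    }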

Then send a MIDI program change by calling our load function. After that, you can send startNote and stopNote messages to the sampler. You need to match the parameters for each start and stop message.
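
For example, to play and then stop middle C on channel 0:

    loadPatch(0)   // General MIDI program 0 is Acoustic Grand Piano

    sampler.startNote(60, withVelocity: 64, onChannel: 0)
    // ... later, e.g. on touch up
    sampler.stopNote(60, onChannel: 0)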


Summary

This is a good start I hope. There are other things I’ll cover soon, such as generating and processing the audio buffer data.


13 thoughts on “Swift 2: AVFoundation to play audio or MIDI”

  1. Hi

    Thank you for your explanations,
    I was greatly assisted by your excellent blog!

    I wanted to ask how I connect the AVMIDIPlayer to the sampler,
    i.e., if I want to apply MIDI commands (commands that exist in the sampler),
    such as noteOn or channel changes, to the file that is playing now in the AVMIDIPlayer,
    in real time.

    How do we do this?

    Thanks again
    Rebecca

    1. AVMIDIPlayer is a simple class that just plays a MIDI file through a soundbank.

      If you want to send MIDI messages, it is the wrong class to use.

      AVAudioUnitSampler has functions that it inherits from AVAudioUnitMIDIInstrument to send MIDI messages.

  2. Hi Gene De Lisa,
    I’m trying to convert a MIDI file to M4A using Swift 2 on iOS; it does seem doable via AVAudioEngine.
    My problem is: how do I load a MusicSequence into an AVAudioPCMBuffer?
    Thanks
    Z

    1. You can’t load a MusicSequence into an AVAudioPCMBuffer. You load samples into it either by computing a waveform or loading a sound file.

      1. Ok, good to know. I’ll try again. The MusicSequence starts as a MIDI file and is turned into audio somewhere along the way; presumably it is held in an AVAudioPCMBuffer?
        Is it possible to capture or tap the audio and save it as an M4A file?
        I have watched the WWDC 2014-2015 Core Audio videos and am still finding it tricky to get my head around it.
        Thank you again
        Z

        1. The engine renders the MIDI commands in the MusicSequence through the sampler. You probably have the sampler connected to the engine’s mixer. You can tap the mixer’s output and save that.
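
            A rough sketch of such a tap (it writes a CAF file here; an M4A would need AAC settings in the settings dictionary, and outputURL is whatever destination you choose):

            func startRecording(outputURL: NSURL) throws {
                let mixer = engine.mainMixerNode
                let format = mixer.outputFormatForBus(0)
                let outputFile = try AVAudioFile(forWriting: outputURL, settings: format.settings)
                mixer.installTapOnBus(0, bufferSize: 4096, format: format) { (buffer, when) in
                    do {
                        try outputFile.writeFromBuffer(buffer)
                    } catch {
                        print("could not write buffer: \(error)")
                    }
                }
                // when finished: mixer.removeTapOnBus(0)
            }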

          1. So, I can play a musicSequence through a musicPlayer, no problem there.
            And according to the Apple docs (in the Playing Audio section) there is a musicSequence variable available, described as “the music sequence that was previously attached to the audio engine.” How and where would I attach the MusicSequence or MusicPlayer to the audio engine?
            Thank you again.

  3. Gene,

    I’m trying to use your code above to play guitar notes and guitar chords. I am successfully using it for guitar notes, but chords are not working very well. I have tried playing multiple simultaneous notes like this….

    func gstart(noteNumber: UInt8, channel: UInt8) {
        loadPatch(gmNylonGuitar, channel: channel)
        self.sampler.startNote(40, withVelocity: 100, onChannel: 1)
        self.sampler.startNote(48, withVelocity: 100, onChannel: 2)
        self.sampler.startNote(52, withVelocity: 100, onChannel: 3)
        self.sampler.startNote(55, withVelocity: 100, onChannel: 1)
        self.sampler.startNote(60, withVelocity: 100, onChannel: 2)
        self.sampler.startNote(64, withVelocity: 100, onChannel: 3)
    }

    One or two notes work ok, but if I add more notes I start getting SIGABRT in loadPatch. I have also experimented with using multiple channels, but that doesn’t help. As you can probably tell, I don’t know much about MIDI.

    I am using Xcode 8.0, Swift 3, and running on iOS 10. I am also using gs_instruments.dls for the sound bank.

    Do you have any suggestions on how I can play chords on iOS, preferably using MIDI? These chords will have up to six notes playing simultaneously. If I have to, I can always record chords and play those sound files, but I would prefer to use MIDI.

    1. One particular use for sending a startNote message to a sampler is in response to a UI event like touch down, followed by a stopNote on touch up.

      For what you want, I suggest that you use an AudioToolbox MusicSequence and add notes to a MusicTrack with the same beat.


      var mess = MIDINoteMessage(channel: 0,
          note: 60,
          velocity: 64,
          releaseVelocity: 0,
          duration: 1.0)
      var status = MusicTrackNewMIDINoteEvent(track, beat, &mess)
      // create and add more notes at the same beat

  4. Hi Gene De Lisa,
    Is it possible to display text, karaoke style while playing a MIDI file in iOS with Swift?
    Could you point me in the right direction?
    Thanks again
    Z
