Thursday 21 October 2010

Using CoreMIDI in iOS (an example)

In iOS 4.2 the CoreMIDI framework is dropping into the standard operating system. MIDI connectivity using generic class-compliant USB MIDI hardware will be available to all.

It's not a big framework: just a handful of pure-C APIs. But its effect will be profound. What was previously only possible using Akai's AkaiConnect SDK (which is a very nice Objective-C API) or Line 6's MIDI Mobilizer SDK will be available to all.

From what I can see, the iOS CoreMIDI version is going to be the exact same API as the Mac OS desktop variant. I expect developer take-up of this API to be far faster than for the existing two (non-standard) iOS MIDI interface APIs. Pretty soon every iPhone synth app will support Apple's generic MIDI API.

That is, if they can work out how to.

CoreMIDI isn't the best documented API (this is all you get: a list of typedefs and functions). Fortunately, the headers are well-commented and the API is mercifully simple and sensible. However, with no compelling examples (that I could find) it takes a little digging to work out how to use CoreMIDI.

So here I present a very simple example project, with a reusable Objective-C MIDI interface class. With this code, you can get up to speed with CoreMIDI quickly. Indeed, if you just paste my MidiInput class into your synth app you'd be good to go.

Grab it from the GitHub project here. (Update note: the repo master used to be hosted on Gitorious here but we've moved to GitHub in these enlightened times)

Please let me know if you find it useful, or if you use it in your own projects.

Integrating CoreMIDI in your application

First, you need to decide if you want your app to support devices running iOS versions before 4.2. Since CoreMIDI was only introduced in 4.2, you have to jump through a few hoops to keep your app running on earlier versions:
  • Weakly link to the CoreMIDI framework, so your application will still launch on OS versions without it. (You do this by going to your application's target in the Xcode tree view, selecting "Get info", and in the "General" tab's "Linked libraries" section ensuring that CoreMIDI is set to "Weak", not "Required".)
  • Include CoreMIDI functionality conditionally. The best way to do this is to inspect the kCFCoreFoundationVersionNumber variable and only initialise your MIDI handling if the value represents iOS 4.2 or later. (See the iOS version detection header file in my example project for an elegant way to do this.)
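The version check in the second bullet can be sketched like this. Note this is an illustrative sketch, not code from my project: the 550.52 value and the constant name are assumptions from memory, so consult CFBase.h in your SDK (or the version detection header in my example project) for the real number.

```objc
#include <CoreFoundation/CoreFoundation.h>

// The constant name and value here are illustrative assumptions;
// check CFBase.h for the version number that corresponds to iOS 4.2.
#ifndef kCFCoreFoundationVersionNumber_iPhoneOS_4_2
#define kCFCoreFoundationVersionNumber_iPhoneOS_4_2 550.52
#endif

static BOOL CoreMidiIsAvailable(void)
{
    // Only touch CoreMIDI symbols when the OS actually provides them;
    // with weak linking the app still launches on older iOS versions.
    return kCFCoreFoundationVersionNumber >= kCFCoreFoundationVersionNumber_iPhoneOS_4_2;
}
```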

A demonstration of all of this is available in my example project.

Using CoreMIDI

CoreMIDI itself isn't too complex if you're happy to read the headers and work out what's going on. You need a basic grasp of how Mac OS's Core Foundation works, to understand lifetime management and to access string properties, and so on.

CoreMIDI has a few basic concepts: most importantly clients, devices (with endpoints) and ports. The C APIs let you enumerate these, and register a notification to keep abreast of changes in MIDI connection state.

MIDI inputs are "sources" in CoreMIDI. MIDI outputs are "destinations".

Given those basic facts, you should be able to read through my MidiInput class and figure out what's going on.
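As a flavour of those concepts in code, here is a minimal sketch (not lifted from the MidiInput class; the function names are my own for illustration) that creates a client and an input port, then connects the port to every source:

```objc
#include <CoreMIDI/CoreMIDI.h>

// CoreMIDI calls this on its own high-priority thread - keep it lightweight.
static void ReadProc(const MIDIPacketList *pktlist,
                     void *readProcRefCon, void *srcConnRefCon)
{
    // Hand the packet list off to your MIDI handling here.
}

static void AttachToAllSources(void)
{
    MIDIClientRef client = 0;
    MIDIClientCreate(CFSTR("MyMidiClient"), NULL, NULL, &client);

    MIDIPortRef inputPort = 0;
    MIDIInputPortCreate(client, CFSTR("Input"), ReadProc, NULL, &inputPort);

    // Enumerate every MIDI input ("source") and listen to it.
    ItemCount n = MIDIGetNumberOfSources();
    for (ItemCount i = 0; i < n; ++i)
    {
        MIDIEndpointRef source = MIDIGetSource(i);
        MIDIPortConnectSource(inputPort, source, NULL);
    }
}
```

The second parameter of MIDIClientCreate takes a notification callback, which is how you keep abreast of connection-state changes; I pass NULL here for brevity.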

Parsing MIDI

My example program just spits out a stream of binary MIDI data; I don't show any parsing of the MIDI data stream here. This parsing isn't rocket science, but it is another step you have to perform.

Contact me if you want to know more about parsing MIDI.
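To give a flavour of what that step involves, here is a minimal running-status parser in plain C. It is a sketch, not the code from my example project: it handles only channel-voice messages and ignores system messages entirely.

```c
#include <stdint.h>
#include <stdio.h>

/* MIDI bytes with the top bit set are status bytes; a status byte stays in
   effect ("running status") until replaced, so data bytes may follow with
   no status byte of their own. */
typedef struct {
    uint8_t status;  /* last status byte seen */
} MidiParser;

/* Number of data bytes for each channel-voice status (upper nibble). */
static int data_bytes_for_status(uint8_t status)
{
    switch (status & 0xF0) {
    case 0xC0: /* program change    */
    case 0xD0: /* channel pressure  */
        return 1;
    default:   /* note off/on, poly pressure, CC, pitch bend */
        return 2;
    }
}

/* Feed a buffer of raw MIDI bytes; prints note-on events and returns
   how many were seen. */
int parse_midi(MidiParser *p, const uint8_t *bytes, int len)
{
    int note_ons = 0;
    int i = 0;
    while (i < len) {
        if (bytes[i] & 0x80) {            /* new status byte */
            p->status = bytes[i++];
            continue;
        }
        int needed = data_bytes_for_status(p->status);
        if (i + needed > len) break;      /* incomplete message */
        if ((p->status & 0xF0) == 0x90 && bytes[i + 1] != 0) {
            /* note on with velocity 0 is conventionally a note off */
            printf("note on: %d velocity %d\n", bytes[i], bytes[i + 1]);
            ++note_ons;
        }
        i += needed;
    }
    return note_ons;
}
```

Feeding it {0x90, 60, 100, 64, 100} yields two note-ons: the second note reuses the 0x90 status byte via running status.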


CJ said...

Cool! You seem to be on a roll of coincidentally writing about topics that I am googling for.

And I want to up vote a request for a follow up article on parsing and generating midi data :)

Anonymous said...

Pete!!! Thank you for opening a cornucopia of mislead knowledge!!! You are the man!!! Thank you!!!Thank You!! Thank you! This example helped me beyond belief! Now the sending part...haha!

Anonymous said...

Wow, that works spectacularly well!
Trying it in my own app results in an error at the:
packet = MIDIPacketNext(packet);
line... Expected ')' before '&' token.
This kind of error is usually not literal...I'll keep investigating why your version of the same code compiles fine.
Thanks a lot!

Alex said...

I'm having exactly the same error at MIDIPacketNext. Adapting my target build settings to those of MidiMonitor didn't help. If anyone solves this, please let us know.

Pete Goodliffe said...

The problem is that your view controller source files are Objective-C (*.m), not Objective-C++ (*.mm).

Just rename the file suffix and you'll find that your project builds fine.

Omenie - home of the Ellatron said...

Massive thanks for this proof of concept code - hugely valuable! Check out Ellatron over CoreMIDI - and thanks again!

Pete Goodliffe said...

Cool, I'm glad you found it useful.

I enjoyed playing with Ellatron HD some time ago. Glad to be of assistance!

Unknown said...

Nice proof of concept.

Does anybody know, if it is possible to send MIDI data straight to the Mac?

I want to plug my iPad to my Mac (USB-Dock Connector Cable) and send MIDI signals to Ableton (for example).
Can i do this with the dock-connector cable?

With the current MIDI monitor source, OSX doesn't seem to recognize my iPad as a MIDI Device.

Pete Goodliffe said...

No, this is not currently possible. iOS doesn't work like that.

When you plug in the camera connection kit, the iPad acts as a USB master. When you plug the 30-pin cable to your Mac, it's being a USB slave.

Unknown said...

OK... thx.

So the only way to get MIDI to the Mac, without hardware boxes like iConnectMIDI, is to connect over WiFi and configure a new session in the Mac's Audio MIDI Setup. Is that right?

Pete Goodliffe said...

Yes, as far as I am aware.

Unknown said...

Ok... I will try to create a sample app like yours with a WiFi connection. Hopefully the result will be as much of a drop-in-and-go solution as yours. A nice project would be a wrapper around Apple's CoreMIDI for simple sending and receiving of MIDI data.

Dr. Michael Lauer said...

This is awesome, Pete -- thanks a lot! Works fine with my Novation Remote SL 37. Unfortunately it doesn't work on the simulator yet, but only on the device -- which is strange, since the keyboard itself works fine with Logic on the Mac. Thanks again!

Unknown said...

Thanks Peter, really helped me out.

Just some thoughts which might help others out...

Would the MidiInput class be more appropriate as a singleton? (i.e., shared instance)

Also there's a problem in the example's
-(void) midiInput:(MidiInput*)input midiReceived:(const MIDIPacketList *)packetList; method. The high-priority thread calling this clearly doesn't have an NSAutoreleasePool, so you need to be careful using certain Obj-C calls in there (e.g., @"MIDI received:"), or create and release a pool in the function (perhaps a bad idea in the long run).

Finally setting up a Wifi session is as simple as putting this in MidiInput -(id)init

MIDINetworkSession* session = [MIDINetworkSession defaultSession];
session.enabled = YES;
session.connectionPolicy = MIDINetworkConnectionPolicy_Anyone;

You need also to...

Unknown said...

final comment should have ended:

#import &ltCoreMIDI/MIDINetworkSession.h>


Unknown said...

final attempt!

#import <CoreMIDI/MIDINetworkSession.h>

Dan Rosenstark said...
This comment has been removed by the author.
Dan Rosenstark said...

0x4d52's comments are great to allow connections. Here's how to connect forward:

MIDINetworkSession* session = [MIDINetworkSession defaultSession];
MIDINetworkHost *host = [MIDINetworkHost hostWithName:@"bonjourName" address:@"" port:5004];
MIDINetworkConnection *connection = [MIDINetworkConnection connectionWithHost:host];
BOOL result = [session addConnection:connection];
NSLog(@"what happened %d", result);

Anonymous said...

from MIDINetworkSession we have access to both source and destination endpoints, but since we didn't create them, how do we set up a listening function such as with MIDIDestinationCreate(...readProc...)?

Unknown said...

I have several iPhone apps that implement what appears to be MIDI sequencing technology and routing to a software synthesizer of some sort. Since there is currently no native support for MIDI or software synths in iOS, does anyone have any idea what kind of strategies are being used to "fake" it? Some apps emulate MIDI-style sequencing amazingly well.

jamesdlow said...

I found this really useful for understanding midi data:

Now what I can't seem to do is get network data working. Basically I want to be able to use my app as a mini input device to my computer over network. Ala:

I've added this to the init:
MIDINetworkSession* session = [MIDINetworkSession defaultSession];
session.enabled = YES;
session.connectionPolicy = MIDINetworkConnectionPolicy_Anyone;

But it doesn't seem to show up in the list of devices. Any ideas?

jamesdlow said...

Ah, solved my own problem: the code needs to be at the start of init:

- (id) init
{
    if ((self = [super init]))
    {
        MIDINetworkSession* session = [MIDINetworkSession defaultSession];
        session.enabled = YES;
        session.connectionPolicy = MIDINetworkConnectionPolicy_Anyone;
    }
    return self;
}

jamesdlow said...

Oh and this link is good too for understanding midi:

Anonymous said...

thank you very much for this, I've been trying to get this midi business to work for an age, this has really cleared things up for me :)

Unknown said...

Great example brother.

Do you know if it's possible to get network MIDI working in the iOS Simulator?

Wyldcatz said...

I can't seem to find the download link on your project page?

pierre said...

Thanks Pete, it was really useful.
By the way, I first tested the code with a hardware controller connected and it worked well.
The only problem was that it was then impossible to get any logging, so I decided to go for the wireless solution. The original code wouldn't work, and I couldn't see any reason for that since you've taken care of enabling the network connection. I finally understood that because you init the PGMidi object before setting the delegate, no delegate methods would be called for the network connection. I've found the following workaround:

PGMidi* tempMidi = [PGMidi alloc];
tempMidi.delegate = self; // or any other delegate object
midi = [tempMidi init];
[midi enableNetwork:YES];

Doing this, the delegate methods are called for every device, including the network one.

Pete Goodliffe said...

Pierre, that's a bit of a clumsy trick :-) You're not guaranteed that the delegate property wouldn't be clobbered by the call to init (although, you can see the code, and be assured it won't in this particular case).

The correct approach would be to create the PGMidi object, attach a delegate, and then enumerate the CURRENT devices before any subsequent delegate events are sent.

You could iterate through all existing devices and call your delegate interface manually, if that's what you need.

Wyldcatz said...
This comment has been removed by the author.
Max said...
This comment has been removed by the author.
Max said...

Hi Pete and thank you for your code.
I can't receive MIDI... am I missing something? I can send data very well, but I can't receive.
thank you

pierre said...

"You could iterate through all existing devices and call your delegate interface manually, if that's what you need."

Actually I've only spent a couple of hours figuring out how I could integrate MIDI into my own C++ code, so I went directly for the quick & dirty solution.

Thanks for your answer anyway; I'll now go for the clean and pretty integration ;-)

Anonymous said...

Pete, first of all, thank you for doing this! Max, I'm having the same problem. I'm using a keyboard with a MIDI cable, and the camera connection kit to the iPad. The sample app can send data to my piano just fine, but I can't seem to receive any data from the keyboard. Other apps such as the MIDI monitor by Domestic Cat work just fine, so I know the cables and the camera connection kit are working. I wonder why?!? :| This sample would be so useful, if it could just receive data too.

Maurizio De Cecco said...


Many thanks for your Core MIDI example. The text you wrote on the page ("let me know if you are using the code in your project") makes me think that it is OK with you if we use your code in our applications. But the code itself is published with the standard Xcode header that says "All rights reserved". As such, we are not really allowed to reuse it. Have you considered some kind of open source licence, or an equivalent phrase added to the code, to state that the example can be reused?

I intend to write a MIDI message parser plugged into your code; I am willing to contribute it in some way once it works.

cnco said...

I'm having the same problem as a couple of folks above - sending works fine, but not receiving. The midiRead: method is being called, but the delegate method it's supposed to call doesn't get called.

Thanks for the demo. It's a big help.

Anonymous said...


Thanks very much for the code.

For receiving events I found the problem: the delegate is nil when the source connection is made. I did this:
A new init method with an argument for the delegate:

- (id) initWithDelegate:(id)delegateObject
{
    if ((self = [super init]))
    {
        self.delegate = delegateObject;
    }
    return self;
}

midi = [[PGMidi alloc] initWithDelegate:controller];

Take care


Anonymous said...

It is very very helpful to me. :)
I read post and comments to solve my problem.
Thank you!

Nono said...

Hello Pete (and the folks in here). Sorry if this sounds stupid. How do I just grab the file(s) without setting up GIT? The link to MidiInput.h fails. Thanks in advance.

Pete Goodliffe said...

You can grab it easily enough.

Go to the Gitorious project. Click the "source tree button". In the sidebar, use the "Download master as tar.gz". Voila.

Nono said...

Thank you! Just grabbed them. :-)

Pete Goodliffe said...

Please note that I have updated this post as the location of the PGMidi repo has changed (by popular request) to GitHub:

Jay Payte said...

How can I ignore incoming virtual MIDI connections from my own app? I've tried setting the session name, etc. Any suggestions?

m4yu said...

Pete, thanks for your great PGMidi; I'm using your code in my current project, iFretless Bass. The only problem is that I cannot see the SampleTank virtual port in the destinations list. Can you explain why?

joelp said...

So is there a CoreMIDI version of Ellatron HD out there ANYWHERE? I very badly want this app for my live rig; any info helpful.

Anonymous said...

What special things are there to do to support devices like iRig being plugged in/unplugged?

Do you have to call attachToAllExistingSources, or is there an event that gets fired? I don't have an iRig so I can't test, but turning my wifi on/off seems to be handled OK out of the box.

Anonymous said...

Never mind the above; I just noticed:

- (void) midi:(PGMidi*)midi sourceAdded:(PGMidiSource *)source
{
    source.delegate = self;
}

Unknown said...

IK Multimedia's SampleTank doesn't show up in the list of MIDI destinations. Do you know how to send MIDI events to it? MIDIBridge has a way to do it, but I can't guess what method the developer uses.

Anonymous said...

PGMidi doesn't handle virtual midi ports very well at the moment. I've found!searchin/open-music-app-collaboration/virtual$20mididestinationcreate/open-music-app-collaboration/LgMyw0tDVEQ/PrBakmbIqMoJ to be a helpful resource.

Darren said...

Very cool example! What do you do if it doesn't work? There's no way to connect hardware and the debugger at the same time. Is there a "verbose" mode so that more app events are written to the iOS display? If not, any suggestions for the easiest way to implement one? My device appears and then the app appears to freeze; no further interaction is possible.

Pete Goodliffe said...

No, there's no easy way to debug over wifi like this. However, you can redirect your app's internal logging output to a file for later harvest. You can even redirect NSLog itself, if that helps you. Or make a custom UITextView that displays your logging. (I have done all of these at some point)

Unknown said...

Pete, the link you provided to your MIDI class doesn't work. Could you please provide another link?

Anonymous said...

Hi, I don't know much English (I'm using an online translator).
Can you help me?
Is CoreMIDI the framework/library to use?
How do I connect a button on the iPad to a synthesizer?
How do I route the streams?
How should it all look?

Anonymous said...

Just downloaded and tried to compile project in Xcode 5.0.1 and iOS 7.0 :-(

9 Issues... :-(

Attribute Unavailable - Full Screen at Launch on iOS versions prior to 3.2

Lexical or Preprocessor Issue - Variadic macros are a C99 feature

Parse Issue - Compound literals are a C99 specific feature

Semantic Issue - Variable length arrays are a C99 feature

Apple Mach-O Linker Error - Linker command failed with exit code 1


Mike Gao said...

I am getting dropped simultaneous notes, but only if I am, for example, playing a simultaneous chord coming out of my DAW.
If I am playing chords on a keyboard going into the app, there is no problem.
It doesn't seem to be a thread issue (I use performSelectorOnMainThread for GUI stuff) and I am NSLogging the incoming notes, which don't come in / seem to be dropped.

0x121412 said...

My two cents in this great blog: if you want a deeper understanding of how MIDI networking (aka AppleMIDI or RTP-MIDI) works with iOS and OSX, take a look here:

Jimmy said...

Hey Mike, make sure you are processing all MIDI events in each packet. You can get more than one MIDI event in one packet. If the size of a packet is, say, 6 bytes, you probably have more than one event.
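[Ed.] Jimmy's point can be sketched like this. HandleMidiBytes is a hypothetical parsing function standing in for your own handler; the packet-walking shape itself is standard CoreMIDI:

```objc
#include <CoreMIDI/CoreMIDI.h>

void HandleMidiBytes(const Byte *bytes, UInt16 length); // your parser

static void ReadProc(const MIDIPacketList *pktlist,
                     void *readProcRefCon, void *srcConnRefCon)
{
    const MIDIPacket *packet = &pktlist->packet[0];
    for (UInt32 i = 0; i < pktlist->numPackets; ++i)
    {
        // packet->data holds packet->length bytes, which may contain
        // several MIDI messages - parse them all, not just the first.
        HandleMidiBytes(packet->data, packet->length);
        packet = MIDIPacketNext(packet);
    }
}
```

(As noted earlier in this thread, MIDIPacketNext can trigger a compile error in plain *.m files on some SDKs; renaming the file to *.mm fixes it.)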