The state of MIDI on iOS

The state of music apps on iOS has been evolving at a steady pace over the last year or so. iOS 4.2 brought the CoreMIDI API to iOS. Previously, MIDI communication was only possible using proprietary hardware and proprietary software, which limited adoption. Now, with CoreMIDI, a wide variety of hardware is supported through the Camera Connection Kit and through MIDI accessories for the iPhone and iPad. CoreMIDI also brings standard support for MIDI communication between apps running on the same device and for Network MIDI over Wi-Fi and Bluetooth. Supporting proprietary SDKs will become less and less important as CoreMIDI emerges as the new standard.

Over the past year, many apps have started implementing CoreMIDI and Network MIDI for MIDI sync and for control. Having CoreMIDI means that music apps on iOS can start to be taken more seriously by real musicians; I’ve heard many musicians dismiss iOS apps because they don’t fit into existing workflows or because quality is not up to desktop standards. Those people may never be the core users of iOS apps, but the music app space has gotten much better in terms of quality, control, and integration.

Advantages of CoreMIDI

The biggest advantage of CoreMIDI is the ability to synchronize MIDI events going to different destinations. Imagine how difficult it would be to synchronize MIDI events traveling over a Wi-Fi network with unstable latency, to hardware interfaces, and to other apps, all at the same time. CoreMIDI handles all of this behind the scenes. All the developer needs to do is prepare MIDI event streams a couple hundred milliseconds in advance and give the events proper timestamps; the OS handles the rest. CoreMIDI also makes it very easy to create virtual ports for communicating between applications; these ports work just like hardware ports, meaning there is not much extra work for the developer.
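
To make this concrete, here is a minimal sketch in Swift (the port and function names are my own invention; only the CoreMIDI calls themselves are the real API) of sending a note-on stamped 100 ms into the future, along with creating a virtual source for inter-app use:

    import CoreMIDI
    import Darwin

    var client = MIDIClientRef()
    var outPort = MIDIPortRef()
    MIDIClientCreate("MyApp" as CFString, nil, nil, &client)
    MIDIOutputPortCreate(client, "MyApp Out" as CFString, &outPort)

    // A virtual source makes us look like a hardware port to other apps.
    // (To emit from it, use MIDIReceived instead of MIDISend.)
    var virtualSource = MIDIEndpointRef()
    MIDISourceCreate(client, "MyApp Virtual Out" as CFString, &virtualSource)

    // Send a note-on stamped ~100 ms in the future; CoreMIDI delivers it
    // on time, whether the destination is hardware, network, or another app.
    func sendTimestampedNoteOn(to destination: MIDIEndpointRef) {
        var timebase = mach_timebase_info_data_t()
        mach_timebase_info(&timebase)
        let nanosAhead: UInt64 = 100_000_000                      // 100 ms
        let ticksAhead = nanosAhead * UInt64(timebase.denom) / UInt64(timebase.numer)
        let when: MIDITimeStamp = mach_absolute_time() + ticksAhead

        var packetList = MIDIPacketList()
        let packet = MIDIPacketListInit(&packetList)
        let noteOn: [UInt8] = [0x90, 60, 100]  // note-on, middle C, velocity 100
        MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                          packet, when, noteOn.count, noteOn)
        MIDISend(outPort, destination, &packetList)
    }

    // e.g. sendTimestampedNoteOn(to: MIDIGetDestination(0))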

The emerging inter-app MIDI standard (Open Music App Collaboration)

Ever since the first iOS music apps were released, users have asked how they could synchronize two apps so they could be used alongside each other. Using CoreMIDI, this is now achievable in a standard, reliable way, without any hacks or special work needed from users. The standard emerging from the Open Music App Collaboration (OMAC) group is built around each app publishing virtual MIDI ports that other apps on the same device can connect to, and continuing to respond to MIDI while running in the background.
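
As a sketch of what the port side looks like (the port name here is invented, and MIDIDestinationCreateWithBlock is a block-based convenience added in later CoreMIDI versions; the original MIDIDestinationCreate with a C callback works the same way):

    import CoreMIDI

    // Publish a virtual destination so other apps on the same device can
    // send MIDI to us; 'client' comes from MIDIClientCreate as above.
    func publishVirtualDestination(client: MIDIClientRef) -> MIDIEndpointRef {
        var virtualIn = MIDIEndpointRef()
        MIDIDestinationCreateWithBlock(client, "MyApp MIDI In" as CFString,
                                       &virtualIn) { packetList, _ in
            // Each MIDIPacket in the list carries timestamped raw MIDI
            // bytes; hand them to the app's MIDI parser here.
        }
        return virtualIn
    }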

Apps should also begin supporting the following features so that music creation on iOS can become fully integrated:

  • Support for MIDI Volume and Pan CCs
  • Support for MIDI channels: allow your app to listen only to specific channels, or support multiple patches on multiple channels
  • Support for Program Change messages where appropriate

By supporting all these features, we could have one application acting as a sequencer for multiple synthesizer or drum machine applications. This is one of my goals with Genome.
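
To make this concrete, here is a rough sketch of the receiving side of those features (the handler and the setVolume/setPan/loadPatch hooks are hypothetical names, not from any real app): filtering by MIDI channel and responding to Volume (CC 7), Pan (CC 10), and Program Change.

    // Hypothetical hooks into the app's sound engine.
    func setVolume(_ value: Float) { /* ... */ }
    func setPan(_ value: Float) { /* ... */ }
    func loadPatch(number: Int) { /* ... */ }

    // Handle one parsed MIDI message, listening on a single channel (0-15).
    func handle(message: [UInt8], listenChannel: UInt8) {
        // Ignore system messages and anything without a valid status byte.
        guard let status = message.first, (0x80..<0xF0).contains(status) else { return }
        guard status & 0x0F == listenChannel else { return }  // MIDI channel filter

        switch status & 0xF0 {
        case 0xB0 where message.count >= 3:       // Control Change
            switch message[1] {
            case 7:  setVolume(Float(message[2]) / 127)  // CC 7: Volume
            case 10: setPan(Float(message[2]) / 127)     // CC 10: Pan
            default: break
            }
        case 0xC0 where message.count >= 2:       // Program Change
            loadPatch(number: Int(message[1]))
        default:
            break
        }
    }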

iOS MIDI: Possible Pitfalls

CoreMIDI holds great promise for expanding the creative potential of iOS; however, there are still a number of pitfalls that developers need to work out. Routing MIDI between apps has the potential to be a confusing experience for users. With every app able to communicate with every other app, MIDI loopbacks can occur, and there is no single ‘top down’ way to manage connections between applications. I have also seen developers take many different routes in how they present MIDI connection dialogs to the end user. Arriving at a standard way of doing things will take time, but it’s something that needs to happen.
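
One defensive measure, sketched here as my own suggestion rather than an established convention, is for an app to skip its own virtual ports when auto-connecting, so its output is never fed straight back into its input:

    import CoreMIDI

    // Connect an input port to every source except our own virtual ones,
    // to avoid a MIDI loopback.
    func connectExternalSources(inputPort: MIDIPortRef, ownPrefix: String) {
        for i in 0..<MIDIGetNumberOfSources() {
            let source = MIDIGetSource(i)
            var name: Unmanaged<CFString>?
            MIDIObjectGetStringProperty(source, kMIDIPropertyName, &name)
            let sourceName = (name?.takeRetainedValue() as String?) ?? ""
            if sourceName.hasPrefix(ownPrefix) { continue }  // skip our own ports
            MIDIPortConnectSource(inputPort, source, nil)
        }
    }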

Another issue is multitasking. Early iOS versions did not have multitasking at all, and it is still a slow, frustrating experience for users; other OSes, like webOS, do a better job with it. I’ve heard many people say that trying to do anything involving several apps is unproductive and difficult, and I couldn’t see myself using more than 3-4 music apps at once because it would take too long to switch between them just to tweak a setting or two. Hopefully Apple will offer better ways to switch between apps in the future.

On the same front, we will also have to deal with apps playing nicely with each other when they were originally written to be the sole app running. Some apps may take up too much CPU, or do ‘impolite’ things like changing system or device settings in ways that mess up other apps. Again, I think time and standardization will fix these issues.

Developers also have a big hurdle to clear in educating users. We currently have three standards (not just one): CoreMIDI is the lowest level, giving the ability to send MIDI through the Camera Connection Kit or CoreMIDI accessories. Next is Network MIDI, the ability to send MIDI over a network. Then we have Open Music App Collaboration (OMAC), which allows apps on the same device to talk to each other. Just because an app supports CoreMIDI does not mean it automatically supports Network MIDI; developers need to do extra work to support Network MIDI and OMAC, and this must be communicated in store descriptions, on websites, etc.
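
As an example of that extra work: plain CoreMIDI support does not turn on Network MIDI. On iOS, an app has to enable the network session explicitly (a minimal sketch using the real MIDINetworkSession API):

    import CoreMIDI

    // Opt into Network MIDI: enable the device's RTP-MIDI network session.
    // Sources and destinations from connected hosts then show up like any
    // other CoreMIDI endpoint.
    let session = MIDINetworkSession.default()
    session.isEnabled = true
    session.connectionPolicy = .anyone  // or .hostsInContactList for safety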


Looking to the future

The next few months look bright for iOS music making. For the first time, users can make complex songs using several apps together, or use a controller app to play a synthesizer app. iPads and iPhones can now play in the same ecosystem as desktop synths and MIDI hardware. Stay tuned ;)

For a list of MIDI-enabled music apps we recommend, see here.

And of course, there’s our own app: Genome MIDI Sequencer :)