
Replace Apple MainStage with a custom Max/MSP implementation (Part 1)

I have a large live keyboard rig with 7 keyboards (including a wireless MIDI keytar), several MIDI pedalboards, as well as an Eigenharp and various other control surfaces. I also have several synth modules in a rack that also contains two MOTU 8-port MIDI interfaces and a MOTU 828mkIII/8Pre combo fed by an SSL X-Patch, so that I have complete control over how audio is routed from place to place.

Historically I have used Apple MainStage to control my rig, but although I considered it brilliant in conception, it was never (and sadly still is not) 100% reliable. I have never gotten through a single rehearsal (never mind a performance) without several glitches, such as plugins randomly stopping and occasional stuck notes. Although I stopped using audio plugins and added a Muse Research Receptor to handle such things, even MIDI routing failed to work reliably. The Receptor itself was surprisingly flaky too, subject to many reboots and occasional failures to respond to MIDI.

After considering the available alternatives, including switching to a Windows box (there are a couple of interesting options there), last weekend I decided instead to bite the bullet and develop my own MIDI routing environment. The main criterion was that it had to implement the MainStage indirection mechanism, where you can define devices (keyboards, knobs, pedals and so forth) that respond to incoming MIDI but can then control other devices that want different MIDI values, without your having to know the actual MIDI data all the time (a key differentiator that separated MainStage from other systems with similar functionality). It also had to be very easy to add new “patches” representing new songs.

I’ve been familiar with the Max programming environment almost since it was originally developed by Miller Puckette, and it has become an extremely powerful tool after being further developed by David Zicarelli. Indeed, it is now integrated into Ableton Live, and people are building all sorts of amazing devices with it.

While a full explanation of MIDI and Max is beyond the scope of this blog, there are numerous articles and a few books available if you want to get into the deep details. One excellent book I found recently is “Electronic Music and Sound Design: Theory and Practice with Max/MSP – Vol 1” by Cipriani and Giri. Highly recommended.

One unexpected but extremely gratifying consequence of the switch away from MainStage is that, with the exception of a known problem with the G-Force VSM plugin, my Receptor has also been glitch-free. I have not seen a single failure since the change.

Defining MIDI input devices

Normally I’m a big fan of top-down development, but it turns out that in the Max environment you mostly have to develop bottom-up, as it’s not really practical to create placeholders. The first goal here is to define a Max patch that represents a generic MIDI input device and then leverage that to build higher-level patches representing specific instruments.


This is an example of a generic input device patcher. The inlet at the top is used to define the actual MIDI source (see below). The midiin object receives MIDI data from a source and passes it into a midiparse object, where it is separated into different kinds of events. For example, note on/off events come out through the first outlet. The other connected outlets (left to right) send aftertouch, pitchbend, CC and program change events respectively. You can also access polytouch and the MIDI channel, but I didn’t need those for now so I left them unconnected. Note that the values that actually come out through these outlets consist of a list of values that does NOT include the MIDI status byte.
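To make that splitting concrete, here is a rough Python sketch of what midiparse does with a raw byte stream. The function name, event names and return shape are my own illustration, not any Max or MIDI library API; note how the status byte is consumed and only the data bytes are passed along.

```python
def parse_midi(stream):
    """Return (kind, channel, data) tuples from a list of raw MIDI bytes."""
    events = []
    i = 0
    # Status-byte high nibble -> (event kind, number of data bytes)
    kinds = {
        0x8: ("note_off", 2),
        0x9: ("note_on", 2),
        0xA: ("polytouch", 2),
        0xB: ("cc", 2),
        0xC: ("program", 1),
        0xD: ("aftertouch", 1),
        0xE: ("pitchbend", 2),
    }
    while i < len(stream):
        status = stream[i]
        kind, n = kinds[status >> 4]
        channel = (status & 0x0F) + 1          # MIDI channels number 1-16
        data = stream[i + 1 : i + 1 + n]       # data bytes only, no status byte
        events.append((kind, channel, data))
        i += 1 + n
    return events

# A note-on (status 0x90) for middle C at velocity 100 on channel 1:
print(parse_midi([0x90, 60, 100]))  # [('note_on', 1, [60, 100])]
```

This is exactly why the outlets above carry lists like [60, 100] rather than full three-byte messages.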


Now the reuse begins. Once the above is saved as a named patch, it becomes available to be used over and over again as an object in its own right. Here is a new patch that defines my Yamaha AN1x keyboard, which I only use as a controller.

When this patch is loaded, a message is automatically sent to the MIDI-Input-Device telling it to listen to the MIDI port called AN1x Controller. Apart from CC events, all other events are simply mirrored to outlets that will be routed to output devices later. The interesting piece of this patch is of course the handling of CC events. The AN1x has 8 knobs that (in my configuration) transmit CC messages 41 through 48 respectively. There is also a sustain pedal that generates CC 64 messages, a modwheel that generates CC 1 messages, and an expression pedal that produces CC 7 messages. For that last one, it’s very important to understand that the fact that CC 7 typically represents volume is not relevant here.

The route object receives a message consisting of two numbers, any CC number and its associated value, coming from the MIDI-Input-Device. The output of the route object is just the value of the CC message, but the outlet through which it appears depends on which CC event was received. For example, the values of CC 41 events appear at the first outlet and the values of CC 42 events at the second. The values of CC 7 messages appear at the second-to-last outlet. Values for CC events that are not in the list of numbers after the object name appear at the last outlet, and we don’t care about them.
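The selection behavior can be sketched in Python as follows. This is my own illustrative analogue of route, not Max code: the first element of each [cc, value] message picks an outlet, matched messages are stripped of their selector, and unmatched messages pass through whole on the last outlet.

```python
def route(selectors, message):
    """Return (outlet_index, output). Unmatched messages use the last outlet."""
    head, *rest = message
    if head in selectors:
        return selectors.index(head), rest   # matched: selector is stripped
    return len(selectors), message           # unmatched: whole message passes

# The CC numbers from the AN1x patch: knobs 41-48, sustain 64, modwheel 1,
# expression pedal 7 (my ordering, matching the description above).
knob_ccs = [41, 42, 43, 44, 45, 46, 47, 48, 64, 1, 7]

print(route(knob_ccs, [41, 100]))  # (0, [100])   first outlet: knob 1
print(route(knob_ccs, [7, 90]))    # (10, [90])   second-to-last: expression
print(route(knob_ccs, [10, 64]))   # (11, [10, 64])  last outlet: ignored
```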

The outputs go to send objects. A send object allows a message to be sent to some other object, called (duh) a receive object, without using a physical connection. The receive objects don’t have to be in the same patcher. Send objects take a single parameter that you can think of as a radio frequency: any receive object whose parameter is set to the same name (frequency) will respond to transmitted messages.
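Those radio semantics amount to a simple publish/subscribe registry, which can be sketched in Python like this (the registry, function names, and the AN1x-knob5 name are my own illustration):

```python
from collections import defaultdict

# name ("frequency") -> list of listening handlers
_receivers = defaultdict(list)

def receive(name, handler):
    """Register a handler that listens on the given name."""
    _receivers[name].append(handler)

def send(name, value):
    """Deliver value to every receiver tuned to the same name."""
    for handler in _receivers[name]:
        handler(value)

heard = []
receive("AN1x-knob5", heard.append)   # e.g. a volume input on some device
receive("AN1x-knob5", heard.append)   # any number of receivers may listen
send("AN1x-knob5", 100)               # one send reaches them all
print(heard)  # [100, 100]
```

The key property, as in Max, is that sender and receivers never hold references to each other; they only share a name.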

MIDI Output Devices

Now let’s switch over to defining target devices such as synthesizers that respond to MIDI.


This patcher is essentially the opposite of the MIDI-Input-Device. It is a generic MIDI output device that accepts individual incoming MIDI events of different kinds (notes, pitchbend, CC, etc.) and formats them, through the midiformat object, into a single stream that can be sent to an external device. Now let’s implement a single channel of a real MIDI output device using this patcher.
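As the inverse of the earlier parsing sketch, here is roughly what midiformat does: it takes a typed event plus a channel and rebuilds the raw bytes, reattaching the status byte that midiparse stripped off. Again, the names here are my own illustration.

```python
# Event kind -> base status byte (channel bits are OR'd in below)
STATUS = {
    "note_off": 0x80, "note_on": 0x90, "polytouch": 0xA0,
    "cc": 0xB0, "program": 0xC0, "aftertouch": 0xD0, "pitchbend": 0xE0,
}

def format_midi(kind, data, channel=1):
    """Build the raw bytes for one event: status byte plus data bytes."""
    status = STATUS[kind] | (channel - 1)
    return [status] + list(data)

print(format_midi("cc", [7, 100], channel=1))       # [176, 7, 100]
print(format_midi("note_on", [60, 90], channel=2))  # [145, 60, 90]
```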


This patcher represents channel 1 of my Muse Research Receptor, which is essentially a Linux box that can run Windows VSTs with high-quality audio output. I have similar patchers defined for the Receptor on different MIDI channels. The first few inputs are the usual ones: notes, pitchbend, aftertouch and so forth. The last one is of course the one of interest. Each of these blocks can receive a single stream of values. Looking at the second one (for example), the pack object combines the value 7 with whatever value is received and sends that pair into the CC inlet of MIDI-Output-Device, thereby generating MIDI CC 7 events on channel 1 of the Receptor. Now remember that we defined some knobs on the AN1x (earlier above) that produce a single stream of values as a knob is turned. Those knobs pass their values into named send objects, and the matching receive objects deliver them here.
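The pack step is tiny but it is what bridges the two worlds: a knob produces bare values, while the output device wants [cc, value] pairs. A hypothetical Python equivalent (the names are mine):

```python
def pack(cc_number):
    """Return a function that tags each bare value with a fixed CC number."""
    return lambda value: [cc_number, value]

volume_pack = pack(7)      # the "7" typed into the pack object in the patcher
print(volume_pack(100))    # [7, 100] -> goes to the CC inlet
print(volume_pack(0))      # [7, 0]
```

The knob never knows it is controlling CC 7; the output-device patcher decides what the bare stream of values means.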

Creating our first song patch

Here’s where the fun starts. Having created the items above, we can now create a Max patcher that represents a MainStage patch. Here is a trivial (yet working) example of a complete patch.

The first connection from the AN1x to the Receptor passes incoming notes played on the AN1x into the Receptor on channel 1. The second connection enables pitchbend messages to be sent as well. The third connection causes the Receptor to change its volume as knob5 on the AN1x is turned.
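Pulling the pieces together: a song patch in this scheme is really just a table of connections from input-device streams to output-device inlets. This Python sketch of those three connections uses names I made up for illustration:

```python
# One song patch = which input streams feed which output inlets.
connections = {
    # source stream           -> destination inlet
    "AN1x/notes":     "Receptor-ch1/notes",
    "AN1x/pitchbend": "Receptor-ch1/pitchbend",
    "AN1x/knob5":     "Receptor-ch1/volume",
}

def dispatch(source, value, patch):
    """Forward a value from an input stream to its connected destination."""
    dest = patch.get(source)
    return (dest, value) if dest else None

print(dispatch("AN1x/knob5", 100, connections))
# ('Receptor-ch1/volume', 100): turning knob 5 changes the Receptor's volume
print(dispatch("AN1x/knob3", 100, connections))  # None: not wired in this song
```

Adding a new song is then just writing a new table, which is exactly the "easy to add new patches" criterion from the start of the post.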


In part 2, I will address functionality such as automatically sending program change information to multiple devices when a song patch is opened. We will also explore some user interface features that let you create front panels providing much the same information as MainStage.

For example, here is a picture of the display I created showing the view of a Yamaha AN1x, a Prophet ’08, and an Akai MPK61 controller after opening a Max song patch called Red Rain.
