
Archive for July, 2011

Replace Apple MainStage with a custom Max/MSP implementation (Part 2)

July 26th, 2011

In part 1 of this article, I introduced some of the basic Max patchers I created, such as the generic MIDI input and output devices and the instrument-specific versions built on them. After a few days of testing, I ran into some bizarre behavior where Scorecerer would not change pages via MIDI remote control, but only for some songs.

I finally tracked the problem down, and it turned out there were two bugs, one in Scorecerer and one in my Max implementation, that conspired to cause the problem. When I pressed the button on my MPK61 to go to the next page in Scorecerer, Scorecerer didn’t respond at all, but the (highly recommended) MIDI Monitor app I was running reported that two copies of the CC event were being sent to Scorecerer. I’ll get back to why two copies were being sent in a moment, as that is what caused me to change the original design of the MIDI device patches.

The reason Scorecerer didn’t respond at all (as opposed to changing the page twice, which is what should have happened) is that I was (stupidly) using the length of the incoming packet to decide whether I was getting a CC event (3 bytes) or a Program Change event (2 bytes). In this particular case, however, the packet length was 6 (two CC events) and so the packet was simply ignored. I fixed this by switching on the actual status byte instead. I’m glad I found this issue myself; if it had shown up for customers but we couldn’t reproduce it, it would have been very difficult to track down.
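For the curious, here is a minimal C sketch of the fix, outside of Max: walk the buffer by status byte rather than trusting the packet length. The handler names are mine, purely for illustration.

#include <stddef.h>
#include <stdint.h>

/* Hypothetical handlers -- stand-ins for whatever happens downstream. */
void handle_cc(uint8_t channel, uint8_t number, uint8_t value);
void handle_program_change(uint8_t channel, uint8_t program);

/* Walk the buffer by status byte instead of judging by packet length.
 * A 6-byte packet holding two CC events now yields two events instead
 * of being ignored for having an unexpected length. */
void parse_midi_packet(const uint8_t *buf, size_t len)
{
    size_t i = 0;
    while (i < len) {
        uint8_t status  = buf[i] & 0xF0; /* upper nibble: message type */
        uint8_t channel = buf[i] & 0x0F; /* lower nibble: MIDI channel */

        switch (status) {
        case 0xB0: /* Control Change: status + 2 data bytes */
            if (i + 2 < len)
                handle_cc(channel, buf[i + 1], buf[i + 2]);
            i += 3;
            break;
        case 0xC0: /* Program Change: status + 1 data byte */
            if (i + 1 < len)
                handle_program_change(channel, buf[i + 1]);
            i += 2;
            break;
        default:   /* anything else: skip a byte and resync */
            i += 1;
            break;
        }
    }
}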

Let’s get back to the reason Max was (sometimes) sending the event twice. Max has a very nice feature whereby, rather than having to use a single instance of a particular object, you can instantiate it multiple times and place each instance wherever it’s convenient. For example, suppose you want an incoming note to be played on two synthesizers, with one of them playing the note transposed up by 7 semitones. You could use the following simple patch.

This works absolutely fine, but if you decide you want to perform more sophisticated operations along the way, the view gets complicated by too many crossing connections. A better solution is to use the following patch.

In this version, even though each noteout object refers to a different output device (not shown here), both notein objects can refer to the same input device. As you will see when I show a patcher for a complete song further down, this separation can make it much easier to view and understand the patcher.
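If it helps to see the behavior of that first layering patch in code, here is a rough C analog (the function names are invented for the sketch, not Max objects):

/* Hypothetical note sender -- one per synth in the real patch. */
void note_out(int device, int note, int velocity);

/* One incoming note fans out to two devices, the second transposed
 * up a perfect fifth (7 semitones). */
void on_note_in(int note, int velocity)
{
    note_out(0, note, velocity);         /* synth A: as played      */
    if (note + 7 <= 127)                 /* stay inside MIDI range  */
        note_out(1, note + 7, velocity); /* synth B: up 7 semitones */
}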

Now, if you recall from part 1, I had defined a generic MIDI In patcher and then created keyboard-specific devices with it. In particular, those devices had some explicit outlets (representing notes, pitchbend and aftertouch) and used multiple send objects to make incoming CC events with different numbers easily available to multiple receivers. Therein lies the problem. Take a look at the following patcher, representing the input of my Yamaha AN1x.

As you can see, it has outlets for notes, pitchbend and aftertouch, and it also has a collection of send objects used to send out data streams for the knobs and pedals. Now consider what happens if you create two or more instances of this patcher. There’s no impact on wired connections (if you don’t connect anything to the outlets, nothing happens), but any time a CC event occurs, it will be sent out as many times as there are patcher instances. Some of my song patchers had the instrument device instantiated more than once (for the reasons described earlier) and that’s what led to the problem.

After finally figuring out what was happening, I changed the design slightly to address the problem. The first thing I did was get rid of the explicit outlets completely and replace them with send objects. The AN1x input patcher now looks as follows:

Now every item of interest uses the send/receive mechanism. The second design change was to create a patcher containing a single instance of each of my input devices, called PhysicalInputDevices. Here it is.

As you can see, this patcher contains one object for each MIDI input device of interest. A single instance of this patcher is created, after which every output of interest from all of these devices is immediately available wherever it’s needed: just insert a receive object into any patcher and give it the desired argument name.

Songs

Now that the basic construction has been explained, let’s take a look at an actual song patcher. I’ll include an entire song patcher here so you can see the whole thing, and then focus on one piece at a time to explain how it works. A song patcher provides the same functionality as an individual MainStage patch, so there are multiple song patchers, each typically representing the functionality required for a specific song. Here is the patcher I use with my band to perform All I Need Is a Miracle (Mike and the Mechanics). Take a few moments to look around it.

Phew! OK — let’s look at this thing piecemeal and all will become clear.

The loadbang object is triggered automatically when a Max patcher is opened. The loadbang is connected to a send object with an argument called BigBang, and to a print object; the latter is just used for debugging and writes information to a console window. Various receive objects, both within this song patcher and in some auxiliary patchers (such as the FrontPanelDisplay, which provides a GUI similar to a MainStage layout showing which knobs and sliders are active), are triggered as a consequence, allowing various initializations to occur, as we will see shortly.

Sending program changes to external devices

Every song patcher contains a SystemProgramChange patcher to make it easy to send program changes to all devices in one go.

Here we have six messages, each consisting of a single number. Those numbers happen to be the MIDI Program Change values for my external devices for the Miracle song. The SystemProgramChange patcher encapsulates my external devices and exposes just the ProgramChange parameter of each one. Here’s what it looks like under the covers.

Note that because some synthesizers use 0 for their first patch and others use 1, I’ve configured some of the device patchers to subtract 1 from the incoming value. That lets me use the same values the devices themselves display, regardless of whether they count from 0 or from 1.
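In code terms, the adjustment is a one-line conditional. Here is a hedged C sketch of the idea, with a hypothetical midi_send transport standing in for the real output path:

#include <stdint.h>

void midi_send(const uint8_t *msg, int len); /* hypothetical transport */

/* Shift from the 1-based numbering some synth panels display to the
 * 0-based program number that MIDI actually carries on the wire. */
void send_program_change(int channel, int displayed, int device_counts_from_one)
{
    int program = device_counts_from_one ? displayed - 1 : displayed;
    uint8_t msg[2] = {
        (uint8_t)(0xC0 | (channel & 0x0F)), /* status: PC + channel */
        (uint8_t)(program & 0x7F)           /* 7-bit program number */
    };
    midi_send(msg, 2);
}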


From input to output – our first sound

Let’s now take a look at one of the playblocks (my term for a collection of objects that actually lets you play one of your keyboards and get some sounds out of it).

This is the closest analog to a MainStage external MIDI channel strip combined with its MIDI input inspector. It is where you define a routing from a keyboard (MIDI input device) to a synth (MIDI output device). Here we are using the AN1x to control channel 4 of the Receptor, which has already received a program change (see above) so that it produces a “pad” sound. On the right, we have connected Knob1 to the Volume inlet of the Receptor channel, and next to it we have connected the pitchbend wheel of the AN1x to the pitchbend inlet of the Receptor (MIDI channel 4). Unlike MainStage, where some connections happen automatically unless you explicitly block them (great for simple setups but not so good for anything complex), in this Max environment, if you don’t make a connection, you don’t get any result.

The chunk on the left is a little more interesting. You can see that we are receiving incoming notes from the AN1x, but they are not going directly to the Receptor. Instead, they are going through another custom object called KeyboardLayer. Let’s look at that object.

All the objects here are built-in Max objects; please refer to the Max documentation for the full details. The left inlet takes a list of two values representing a note (the note number and the velocity). The right inlet takes a list representing the lowest allowed key, the highest allowed key and a transpose amount. The unpack object breaks an incoming list into separate values, with optional arguments defining the types of those values (the default is two integers), so the note number and velocity are extracted from the incoming note. We don’t care about the velocity value, so it simply gets passed on to the pack object where, along with the note number once it arrives, it is combined back into a note and sent out.

The split object determines what range of values is allowed through its left outlet. The default (as defined by the arguments) is to allow all MIDI note numbers through. However, if you send min and max values to its two right inlets, it will only pass notes that fall within that range. Any note number that gets through then has the transpose value (if provided) added to it.
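Putting the unpack/split/pack behavior together, here is a rough C equivalent of the KeyboardLayer object (the names and the struct are mine, not Max’s):

/* The allowed key range and transpose amount, as fed into the right inlet. */
typedef struct {
    int low;       /* lowest allowed note  */
    int high;      /* highest allowed note */
    int transpose; /* semitones to add     */
} KeyboardLayer;

void note_out(int note, int velocity); /* hypothetical downstream synth */

void layer_note(const KeyboardLayer *kl, int note, int velocity)
{
    if (note < kl->low || note > kl->high)
        return;                               /* outside the zone: drop it */
    int transposed = note + kl->transpose;
    if (transposed >= 0 && transposed <= 127) /* keep within MIDI range    */
        note_out(transposed, velocity);
}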

Referring back to that playblock, it should now be clear that all notes on the AN1x are sent to the Receptor but transposed up by 12 semitones. The BigBang message, triggered by the loadbang when the song is loaded, is used to initialize the desired range and transpose for the specific KeyboardLayer object.

Splits and layers

If you look at the playblocks for the Polymoog and Piano in the song patcher above, you can see that separate sections of the Akai keyboard are used for each of those sounds, i.e., MIDI notes 36 to 53 inclusive are sent to a piano sound (transposed up one octave) and MIDI notes 84 to 96 inclusive are sent to the Polymoog sound. By adjusting the ranges, you can configure any splits and/or layers you need.
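Expressed with the KeyboardLayer sketch from the previous section, that Akai split might look something like this (illustrative only):

KeyboardLayer piano    = { 36, 53, 12 }; /* lower zone, up one octave */
KeyboardLayer polymoog = { 84, 96, 0 };  /* upper zone, untransposed  */

void on_akai_note(int note, int velocity)
{
    /* Each layer filters its own range, so splits and layers are just
     * multiple calls; the real patch routes each one to its own device. */
    layer_note(&piano, note, velocity);
    layer_note(&polymoog, note, velocity);
}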

In Part 3, we will introduce a simple mechanism for velocity mapping, and we’ll also look at the consolidated front panel that shows at a glance which knobs and sliders are associated with which parameters on the various devices.


New version of Scorecerer for iPad has MIDI support

July 21st, 2011

We have just submitted a new release of Scorecerer iPad to the App Store. Version 5 adds support for CoreMIDI and three new functions:
1) Selecting a song can send a program change to your DAW or live rig manager
2) Scorecerer iPad can open a song automatically upon receipt of a program change

Typically, you would use either 1 or 2 depending on whether you want Scorecerer to be the master or the slave in a live environment.

3) Scorecerer iPad can respond to MIDI CC events and change to the next or previous page of a song. This lets you use buttons on a keyboard or a pedal attached to a keyboard to change the page.

I’ve been using these features with my band for a while now and they are very effective.

Hopefully, Apple will approve the new update quickly.


Replace Apple MainStage with a custom Max/MSP implementation (Part 1)

July 20th, 2011

I have a large live keyboard rig with 7 keyboards (including a wireless MIDI keytar), several MIDI-capable pedalboards, an Eigenharp and various other control surfaces. I also have several synth modules in a rack that also contains two MOTU 8-port MIDI interfaces and a MOTU 828mkIII/8Pre combo fed by an SSL X-Patch, so I have complete control over how audio is routed from place to place.

Historically I have used Apple MainStage to control my rig, but even though I consider it brilliant in conception, it was never (and sadly still is not) 100% reliable. I have never gotten through a single rehearsal (never mind a performance) without several glitches, such as plugins randomly stopping and occasional stuck notes. Although I stopped using audio plugins and added a Muse Research Receptor to handle such things, even plain MIDI routing failed to work reliably. The Receptor itself was surprisingly flaky too, subject to many reboots and the occasional failure to respond to MIDI.

After considering the available alternatives, including switching to a Windows box (there are a couple of interesting options on that side), last weekend I decided instead to bite the bullet and develop my own MIDI routing environment. There were two main criteria. First, it had to implement the MainStage indirection mechanism, where you define devices (keyboards, knobs, pedals and so forth) that respond to incoming MIDI but can then control other devices expecting different MIDI values, without your having to think about the actual MIDI data all the time; that indirection is the key differentiator that separated MainStage from other systems with similar functionality. Second, it had to be very easy to add new “patches” representing new songs.

I’ve been familiar with the Max programming environment almost since it was originally developed by Miller Puckette, and it has become an extremely powerful tool after being further developed by David Zicarelli. Indeed, it is now integrated into Ableton Live, and people are building all sorts of amazing devices with it.

While a full explanation of MIDI and Max is beyond the scope of this blog, there are numerous articles and a few books available if you want to get into the deep details. One excellent book I found recently is “Electronic Music and Sound Design: Theory and Practice with Max/MSP – Vol 1” by Cipriani and Giri. Highly recommended.

One unexpected but extremely gratifying consequence of the switch away from MainStage is that, with the exception of a known problem with the G-Force VSM plugin, my Receptor has also been glitch-free. I have not seen a single failure since the change.

Defining MIDI input devices

Normally I’m a big fan of top-down development, but it turns out that in the Max environment you mostly have to develop bottom-up, as it’s not really practical to create placeholders. The first goal here is to define a Max patch that represents a generic MIDI input device and then leverage it to build higher-level patches representing specific instruments.

Midi-Input-Device

This is an example of a generic input device patcher. The inlet at the top is used to define the actual MIDI source (see below). The midiin object receives MIDI data from a source and passes it into a midiparse object, where it is separated into different kinds of events. For example, note on/off events come out through the first outlet. The other connected outlets (left to right) send aftertouch, pitchbend, CC events and program changes respectively. You can also access polytouch and the MIDI channel, but I didn’t need those for now so I left them unconnected. Note that the values that actually come out through these outlets consist of a list of values that does NOT include the MIDI status byte.

Yamaha-AN1x-C1

Here’s where the fun starts. Once the patcher above is saved under a name, it becomes available for reuse over and over again as an object in its own right. Here is a new patch that defines my Yamaha AN1x keyboard, which I use only as a controller.

When this patch is loaded, a message is sent automatically to the MIDI-Input-Device telling it to listen to the MIDI port called AN1x Controller. Apart from CC events, incoming events are simply mirrored to outlets that will be routed to output devices later. The interesting part of this patch is, of course, the handling of CC events. The AN1x has 8 knobs that (in my configuration) transmit CC messages 41 through 48 respectively. There is also a sustain pedal that generates CC 64 messages, a modwheel that generates CC 1 messages and an expression pedal that produces CC 7 messages. For that last one, it’s very important to understand that the fact that CC 7 conventionally represents volume is not relevant here.

The route object receives a message consisting of two numbers, a CC number and its associated value, coming from the MIDI-Input-Device. Its output is just the value of the CC message, but the outlet through which that value appears depends on which CC number was received. For example, the values of CC 41 events appear at the first outlet, the values of CC 42 events at the second outlet, and the values of CC 7 messages at the second-to-last outlet. Values for CC numbers not listed in the arguments after the object name appear at the last outlet, and we don’t care about them.
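In C terms, the route object amounts to a switch on the CC number. The destination names below are invented for the example; send_to stands in for Max’s send mechanism, sketched after the next paragraph.

void send_to(const char *name, int value); /* stands in for Max's send */

/* Dispatch on the CC number and pass only the value through to the
 * matching named destination; unlisted CCs fall through and are ignored. */
void route_cc(int cc_number, int value)
{
    switch (cc_number) {
    case 41: send_to("AN1x-Knob1", value);    break;
    case 42: send_to("AN1x-Knob2", value);    break;
    /* ... knobs 43 through 48 elided ... */
    case 64: send_to("AN1x-Sustain", value);  break;
    case  1: send_to("AN1x-ModWheel", value); break;
    case  7: send_to("AN1x-Pedal", value);    break;
    default: break; /* the equivalent of route's rightmost outlet */
    }
}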

The outputs go to send objects. A send object allows a message to be delivered to some other object, called a receive object (duh), without using a physical connection. The receive objects don’t even have to be in the same patcher. A send object takes a single argument that you can think of as a radio frequency: any receive object whose argument is set to the same name (frequency) will respond to transmitted messages.
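As a mental model (not Max’s actual implementation), send/receive behaves like a tiny string-keyed publish/subscribe registry. A toy C version:

#include <string.h>

#define MAX_RECEIVERS 64

typedef void (*receiver_fn)(int value);

static struct {
    const char *name; /* the "frequency" the receiver is tuned to */
    receiver_fn fn;
} receivers[MAX_RECEIVERS];
static int receiver_count = 0;

/* Like creating a [receive name] object. */
void receive(const char *name, receiver_fn fn)
{
    if (receiver_count < MAX_RECEIVERS) {
        receivers[receiver_count].name = name;
        receivers[receiver_count].fn   = fn;
        receiver_count++;
    }
}

/* Like a [send name] object: every receiver tuned to the name hears it. */
void send_to(const char *name, int value)
{
    for (int i = 0; i < receiver_count; i++)
        if (strcmp(receivers[i].name, name) == 0)
            receivers[i].fn(value);
}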

MIDI Output Devices

Now let’s switch over to defining target devices such as synthesizers that respond to MIDI.

Midi-Output-Device

This patcher is essentially the opposite of the MIDI-Input-Device. It is a generic MIDI output device that accepts individual incoming MIDI events of different kinds (notes, pitchbend, CC, etc.) and, through the midiformat object, formats them into a single stream that can be sent to an external device. Now let’s implement a single channel of a real MIDI output device using this patcher.

Receptor-Rx1

This patcher represents channel 1 of my Muse Research Receptor, which is essentially a Linux box that runs Windows VSTs with high-quality audio output. I have similar patchers defined for the Receptor’s other MIDI channels. The first few inlets are the usual ones: notes, pitchbend, aftertouch and so forth. The last one is of course the one of interest. Each of these blocks can receive a single stream of values. Looking at the second one (for example), the pack object combines the value 7 with whatever value is received and sends that pair into the CC inlet of MIDI-Output-Device, thereby generating MIDI CC 7 events on channel 1 of the Receptor. Now remember that we defined some knobs on the AN1x (earlier above) that produce a single stream of values as a knob is turned; those knobs pass their values into named send objects, which is exactly where these blocks pick them up.
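The net effect of that pack-into-CC-inlet block, written out in C (again with a hypothetical midi_send transport), is simply:

#include <stdint.h>

void midi_send(const uint8_t *msg, int len); /* hypothetical transport */

/* Pair the fixed controller number 7 with each incoming value and emit
 * a complete Control Change event on channel 1. */
void receptor_ch1_volume(int value)
{
    uint8_t msg[3] = {
        0xB0 | 0x00,            /* status: CC on channel 1   */
        7,                      /* controller 7, volume here */
        (uint8_t)(value & 0x7F) /* the knob value, 7 bits    */
    };
    midi_send(msg, 3);
}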

Creating our first song patch

Here’s where it all comes together. Having created the items above, we can now create a Max patcher that represents a MainStage patch. Here is a trivial (yet working) example of a complete patch.

The first connection from the AN1x to the Receptor passes incoming notes played on the AN1x into the Receptor on channel 1. The second connection enables pitchbend messages to be sent as well.  The third connection causes the Receptor to change its volume as knob5 on the AN1x is turned.


In part 2, I will address functionality such as automatically sending program change information to multiple devices when a song patch is opened. We will also explore some user interface features that let you create front panels providing much the same information as MainStage.

For example, here is a picture of the display I created showing the view of a Yamaha AN1x, a Prophet ’08, and an Akai MPK61 controller after opening a Max song patch called Red Rain.


‘Else’ is not the counterpart of ‘If’

July 9th, 2011

I can’t stand it anymore. I honestly don’t understand why some developers (particularly those coming from the C world) continue to vertically line up else underneath if.

Nobody would ever write *

switch (expression) {
   case true:
      DoSomething;
      break;

   case false:
      DoSomethingElse;
      break;
}

So write

if (expression)
   then DoSomething;
   else DoSomethingElse;

If you’re writing in C or C++, use

#define then 

so you can use the keyword. If your language doesn’t allow macros, just indent as if there were a then keyword there anyway.

if (expression)
        DoSomething;
   else DoSomethingElse;

That last might look a little odd by itself but once you need multiple statements inside the then and/or else, it makes total sense again.

if (expression)
       {
          DoLotsOfThings;
       }
   else
       {
          DoLotsOfOtherThings;
       }

* By the way, I would normally indent everything under the switch as well but I’m ignoring that here to avoid an orthogonal issue about block structures. In case you’re wondering:

switch (expression)
   {
      case true:
         DoSomething;
         break;

      case false:
         DoSomethingElse;
         break; 
   }

Upgrading my live keyboard rig – the good, the bad, and the ugly

July 2nd, 2011

I use a pretty sophisticated live keyboard rig with my band, No Sleep Tonite, and have just finished a major overhaul whose goal was to significantly reduce the setup/teardown time, from about 2 hours down to about 10 minutes.

For those of you interested in music technology, my live rig consists of six keyboards: a Korg Oasys, a Roland VK-88, a Minimoog XL, a Prophet ’08, a Yamaha AN1x (used only as a controller) and an Akai MPK61 controller. I also have a Roland AX-1 keytar, which I use occasionally with a wireless MIDI setup, and more recently I’ve started playing the Eigenharp (but that’s a whole ’nother story). Of course, I also use Scorecerer on an iPad (shameless plug) to view my sheet music, notes and setlists. There are also about 9 footpedals in the rig, for audio volume and for MIDI CC control of volume, expression, sustain and so forth.

Everything is managed through Apple MainStage although I use very few AU plugins with it and instead added a Muse Research Receptor for that. MainStage is mostly responsible for MIDI control (it sends out all program changes) along with routing, layering and splits. It also enables knobs, sliders and buttons on several of my boards to control parameters of any sound in real-time as needed.

Until the overhaul, the VK-88 audio was hardwired into a Line 6 M13 effects processor and the Minimoog audio into a Boss RE-20 Space Echo. All the audio outputs were sent to a 20-channel A&H mixer, and a two-channel mix went to FOH. Connecting everything up was always a headache, with a maze of cables. The plan was to build harnesses to carry power, audio (in and out) and MIDI (in and out) from a new rack to each device (including pedals/effects, which would be mounted on pedalboards).

Mike Vegas, a rather amazing and experienced musician/engineering craftsman, took a look at my rig and made a couple of fascinating suggestions, the most important of which was to eliminate the A&H mixer (and a 12U rack) and replace it with a MOTU 828mkIII/8Pre combination in conjunction with an SSL X-Patch. All audio devices and my effects units would be connected directly to the X-Patch, the outputs from the X-Patch would go into the 828/8Pre combo, and finally a stereo feed would go from the 828 to FOH. A couple of MOTU 128s would handle all the MIDI connections.

So a few weeks ago, with more than a little trepidation, I disconnected everything and sent my rig to Mike’s workshop. Last Thursday evening, Mike called to say everything was ready for me to test, so off I went. Note that Mike had tested all the physical connections, but my laptop was needed to actually configure everything.

Since the Receptor and the X-Patch both use Ethernet, Mike also installed a wireless router inside the rack so that the system has its own private subnet. The Mac, Receptor and X-Patch are connected by cable, and the iPad can connect over wifi. Wifi on the Mac itself is disabled during “play time”, as it’s not clear how well wifi behaves while all the other real-time stuff is going on. However, because I still want the Mac to be able to access the internet over wifi, and most wifi routers are configured to use 192.168.1.x, I set up the rack router to use a different subnet, 192.168.3.x, and that, it turns out, is what led to the first “show stopper” problem.

I downloaded the X-Patch remote configuration app and ran it. It immediately detected the X-Patch, displayed its firmware version and let me select it. The next step was to configure networking. I noticed that it was set to DHCP by default, so I figured I was done, as the router was configured to hand out DHCP addresses. I then went to the channel configuration section, where I’m supposed to enter the names of all the devices connected to the X-Patch. I entered the name of the first device into the first box and then clicked on the next box to enter the second name. As soon as I did that, the name I had typed into the first box disappeared.

To cut a long story short, it turned out that nothing I entered anywhere would stick. I wondered whether, in spite of the app indicating that the X-Patch was online, the two were not actually communicating. So I checked the active DHCP leases on the router and noticed that only my Mac and my iPad were showing up (the Receptor was not turned on); the X-Patch DHCP client was apparently not working. I went back into the X-Patch app, clicked on “Static IP” and defined an address, subnet mask and gateway on the .3 network. After doing that, the app told me to power-cycle the X-Patch and then restart the app.

I did that and... nothing! Stuff I typed in still disappeared as soon as I switched to another field, and when I went back to the networking section, it was set back to DHCP and the static IP information I had previously defined was gone.

I was now the proud owner of a keyboard rig that made absolutely no sounds. John Cage would have been proud! I went home that night pretty discouraged. I did manage to get hold of someone at SSL the next morning, but their only real suggestion was that the router was probably faulty. I found that hard to believe, since other devices connected via that router were working just fine.

When I went back to the workshop, I had one idea I wanted to try. I disconnected my Mac from the external network (which was on the .1 subnet) and reconfigured the rack router to use the .1 subnet instead of .3, and lo and behold, the X-Patch application worked perfectly. Great feelings of relief followed, along with some annoyance at how much time had been wasted on the issue. I’d have figured the problem out in 5 minutes if the X-Patch application had simply said it couldn’t connect to the unit, and I’m not sure what someone without network troubleshooting experience would have done.

So the fundamental problem was that even though it looked like the X-Patch application was connecting to the hardware (it said the hardware was “online”), it really wasn’t, and the only way to make it work was to reconfigure the router to suit the X-Patch. I have no idea why it couldn’t pick up a DHCP lease in the first place, unless the X-Patch ships from the factory with a static IP address as the default; and since the application couldn’t actually connect, it couldn’t even tell me that.

By the way, now that I have it working, it’s quite a wonderful device. There’s a tiny audible click when I switch patches, but nothing to be concerned about, and it’s delightful to be able to use effects pedals on different instruments at different times without being committed to a single permanent routing.
