Archive for the ‘Software Development’ Category

Ground rules for being a Gig Rack beta tester

September 10th, 2016

Our beta testing program is a great opportunity for us to discover and fix bugs and identify important features, and for our beta testers to get to know and influence the creation of a new software product before it is released to the general public.

Here are some basic guidelines on how to help us make The Gig Rack the best application in its class.

  1. Make sure you are using the latest beta version available. Always check for updates.
  2. Please include as much information as you can about any problem you encounter. We need detailed information so that we can track down the problem. For example,
    • What you were trying to do when it crashed
    • What error messages (if any) appeared – be very specific. The exact content of an error message can be extremely helpful.
  3. Can you reproduce the error? Tell us in detail the steps we need to follow to reproduce the problem. If the problem is intermittent, how often does it happen? Can you see any patterns in how you were using the program when it crashed?
  4. Details are critical. For example,
    • Does your problem occur with a specific plugin or with any plugin?
    • Is it a VST or AU plugin?
    • Is it an old plugin or the latest version? If it’s an older version, what happens if you update to the latest version?
    • Is your plugin legal? We’re not going to spend time debugging a problem caused by a cracked plugin.
  5. Tell us about your environment. For example,
    • Confirm what version of The Gig Rack you are running
    • How much RAM is in your computer
    • What version of the OS are you running
    • What kind of audio interface are you using
    • Are your keyboards connected by MIDI or USB
  6. What can we do to improve the Gig Rack to make it even better for your needs? For example,
    1. Are there features we include that you would like to see done differently? If so, how?
    2. What features would you like to see that are currently not available?


My keyboard rig for the Security Project

August 16th, 2012

The Security Project was created early in 2012 to perform the early music of Peter Gabriel. A key feature of this project was the inclusion of musicians who performed with, or were otherwise involved with, Peter Gabriel the first time around, about 30 years ago. I was invited to join the project a couple of months later and we just performed our first show at B.B. King in NYC on August 11th.

Update (Feb 12th, 2013): The Security Project just completed a short tour of the Northeast, playing in four states and ending with a second performance at B.B. King in NYC.

Update (April 16th, 2014): The Security Project just completed another tour that included about 12 shows in the Northeast and a couple of shows in Canada (Montreal and Quebec City). A highlights video from that tour can be found here.

A number of people have asked me to describe the keyboard environment I am using for this project. Given that there is in fact much more going on than can be seen from just looking at it from a distance, I figured it was time to write it up.


I am using four physical keyboards, set up in an L shape. An important point to understand is that there is no correspondence between any particular keyboard and any particular sound that is heard.

On my right is an Akai MPK88 weighted MIDI controller underneath a Yamaha AN1x. The Akai is nominally used for piano parts but does get used for other parts where necessary. For example, in Family/Fishing Net, I am controlling the low blown bottle sound that appears at the very beginning, the flute loop that also appears at the beginning, and then later the piccolo. I’m also playing bass on it at the points where Trey plays solos. The AN1x, although nominally a full synth, is used only as a MIDI controller and its internal audio is not connected. The sliders on the Akai and the knobs on the AN1x are used to control volume and/or other real-time effects as needed.

On my left is a single manual Hammond XK3-C underneath a Korg Kronos. The Hammond is often used solely as a MIDI controller and only occasionally is its internal sound engine used, for example in the early part of Fly On A Windshield and on Back in NYC. The Korg Kronos is mostly used as a synth engine and the sounds it produces are often being played from some of the other controllers. Occasionally, I play the Kronos keyboard itself but as often as not, in that mode, the sounds that are heard are actually coming from somewhere else. (I’ll get to “somewhere else” in a moment)

Update (December 15th, 2012): The Hammond XK3-C has now been replaced by a Nord C2D (a dual-manual organ), which provides much more flexibility in terms of organ playing and also effectively gives me two MIDI controllers, which is very useful.

Update (July 7th, 2014): While I am still using five keyboards, the Kronos 61 and Nord C2D have been replaced by three Roland A800 Pro controllers. The bottom two are (by default) routed to the GSi VB3 Hammond emulator plugin on my laptop. The Yamaha AN1x has been replaced by a fourth Roland A800 Pro, and the Akai weighted controller has been replaced by a Kronos X 88, which now takes on both the roles of synth engine where needed and weighted controller.


On the floor, under each pair of keyboards, is a Roland FC300 MIDI pedal controller. As well as five foot switches, there are also two built-in expression pedals. Several extra expression pedals are also plugged into the unit. These pedals perform different operations, depending on the song. For example, in Rhythm Of The Heat, foot switches are used to turn on or off the background single string note that is played throughout much of the song as well as the deep percussive Pitztwang note that comes in at the end of many vocal phrases. In Humdrum, the same footswitches are used to emulate the deep Taurus bass notes that are heard in the last section.
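The per-song pedal remapping described above can be sketched as a simple lookup table. This is purely illustrative (the CC numbers, song names and action names are all invented; the real mapping lives inside the MaxMSP environment described later):

```python
# Hypothetical sketch of per-song footswitch remapping, in the spirit of
# the FC300 setup described above. CC numbers and action names are
# invented for illustration.

PEDAL_MAPS = {
    "Rhythm Of The Heat": {
        80: "toggle_drone_note",   # background single-string note on/off
        81: "trigger_pitztwang",   # percussive hit ending vocal phrases
    },
    "Humdrum": {
        80: "taurus_bass_C",       # same physical switch, different role
        81: "taurus_bass_G",
    },
}

def handle_footswitch(song, cc_number):
    """Return the action bound to a footswitch for the current song."""
    return PEDAL_MAPS.get(song, {}).get(cc_number, "unassigned")
```

The point of the table-driven approach is that the physical hardware never changes; only the currently loaded mapping does.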


The Eigenharp is a highly sensitive controller that transmits high-speed OSC data as keys are played. The keys on the Eigenharp detect motion in three dimensions and are extremely sensitive, allowing guitar- or violin-style vibrato to be played easily. In San Jacinto, I am controlling the volume of the three marimba loops. I am also playing the string orchestra part in the middle (the actual sound was coming from the Korg Kronos) and then the Moog bass and steampipe sounds at the end. Those last two are produced by two soft synths running on my laptop; more on this later.

iPads and iPhone

An iPhone and one of my iPads are used to run Lemur, an app that implements a touch-sensitive programmable control surface (with sliders, buttons, knobs and so forth). These are used for real-time control of various sounds. The iPhone was used to start the whole show from the back of the room, where the touch of a button triggered the Rhythm Of The Heat loop that is present through most of the song. It was also used to generate the same Pitztwang sound when I was not behind the keyboards and able to reach the pedal.

A second iPad runs Scorecerer, a product developed by my company that displays sheet music and annotations as well as sending commands back to the computer to change all settings as we move from one song to the next in the set list.

Update (November 2014): A third iPad has now been added whose sole purpose is to display a small portion of the laptop display, the part showing current knob assignments. The laptop itself is no longer raised up and so its screen is less visible.


On the floor between the two sets of keyboards (and under the computer stand) is a rack containing two MOTU 828 MkIII audio interfaces, a MOTU MIDI Express XT (an 8-port MIDI interface), the base station for the Eigenharp, a network ethernet/wifi router, and a power supply for the entire system. Power, audio, MIDI and USB connections (as required) from the keyboard controllers and pedals run directly into this rack. The MOTU 828s allow me to route each sound generator (e.g., VST instruments, VST effects, Max audio, external audio from the organ and Kronos) to independent audio output channels that go to FOH. The interface is also connected to an audio receiver that returns a feed of all the instruments (except my keyboards) from the monitor mix. That feed is then mixed back in with the keyboards so that I can control how much of my rig I hear relative to the rest of the band. The router allows the iPhone and iPads to communicate with the computer. The reason for the many audio outputs is so that different kinds of sounds can be EQ’d separately and so that the volume of sequences can be controlled relative to other keyboard sounds for monitoring by other band members.

Update (Feb 1st, 2013): The latest version of the CueMix software that controls the MOTU 828s now responds to OSC for remote control. Consequently, I have added a third iPad (iPad Mini) that runs TouchOSC with templates for the MOTU hardware. That allows me to easily adjust the volume of the band mix I’m receiving without having to interact directly with the computer. Ultimately, I’ll take some time to reverse-engineer the actual OSC data that’s being sent out, after which I will be able to use MaxMSP (see next section) to send the same data, thereby allowing me to control that volume from a slider on one of my keyboards rather than needing a third iPad.
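Raw OSC messages have a simple, well-specified binary layout, so once the addresses are known from the reverse-engineering step, building the packets by hand is straightforward. Here's a minimal Python sketch; the /mix/volume address is invented for illustration, since the real CueMix addresses are exactly what would come out of sniffing the traffic:

```python
import struct

def osc_message(address, value):
    """Build a raw OSC 1.0 message carrying one float32 argument.

    OSC strings are NUL-terminated and padded to a 4-byte boundary;
    float arguments are big-endian IEEE 754 singles.
    """
    def pad(s):
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    return pad(address) + pad(",f") + struct.pack(">f", value)

# A made-up address for illustration only.
packet = osc_message("/mix/volume", 0.5)
```

The resulting bytes can then be sent over UDP to the target device, which is essentially what TouchOSC is doing under the hood.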

Update (Jan 12th, 2014): The two MOTU 828s have been replaced with two RME UCX audio interfaces. The benefits are improved sound quality as well as reduced latency and improved driver performance. The CueMix software has been replaced by RME’s software called TotalMix for which there is also OSC support and an iPad app so everything else remains unchanged.


The entire system is completely controlled by a Macbook Pro running custom software I developed using the MaxMSP programming environment. A description of the custom software can be found in other blog entries on this site and the extensions I developed to integrate the Eigenharp into MaxMSP can be found here.

Everything that happens is processed through this software environment. When a song request is received (from the iPad running Scorecerer), MaxMSP will load all the required soft synths, set up the appropriate MIDI routings (i.e., which parts of which keyboards play which sounds), route the audio to the appropriate outputs on the MOTU device, and respond to real-time controllers (knobs, sliders, buttons and pedals as well as the Eigenharp) to control volume and other parameters (e.g., filter cutoff, attack, decay, reverb or whatever is needed) for the specific song. MaxMSP is also responsible for generating the real-time loops, both MIDI and audio, and in some cases is also adding extra harmonies depending on what I’m playing.
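The "everything keys off the song request" idea can be illustrated with a small sketch. This is not the actual Max code; it's a hypothetical Python rendering of the concept, with all song data and names invented, where each song is a declarative description and a single function applies it:

```python
# Illustrative only: each song carries its own synths, MIDI routings and
# controller bindings; switching songs means applying that description.

SONGS = {
    "San Jacinto": {
        "synths": ["Kontakt", "UltraAnalog"],
        "routing": {"eigenharp": "moog_bass", "keyboard_1": "marimba_loops"},
        "controllers": {"slider_1": "string_volume"},
    },
    "Humdrum": {
        "synths": ["Minimonsta"],
        "routing": {"keyboard_1": "piano"},
        "controllers": {"pedal_1": "taurus_volume"},
    },
}

def switch_song(name):
    """Apply a song's configuration; returns the list of actions taken."""
    config = SONGS[name]
    actions = [("load", synth) for synth in config["synths"]]
    for source, target in sorted(config["routing"].items()):
        actions.append(("route", source, target))
    for ctrl, param in sorted(config["controllers"].items()):
        actions.append(("bind", ctrl, param))
    return actions
```

The attraction of this structure is that adding a song to the set list means adding data, not code.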

Here is an example of some of the configuration created in MaxMSP for San Jacinto.

Soft synths and effects

While some of the sounds heard are coming from the Korg Kronos and Hammond organ (even if they’re not being played from their respective keyboards), a variety of audio plugins are in use. The most important of these are Native Instruments Kontakt and Reaktor, the GForce Oddity and Minimonsta (yep, Arp Odyssey and Minimoog emulators respectively), and the AAS UltraAnalog. Effect processing is mostly done with Native Instruments GuitarRig and IK Multimedia AmpliTube. However, some effects are done directly with MaxMSP. I used to use Arturia plugins (I love their Moog Modular and Arp 2600) but I had too much grief with their copy-protection scheme and so had to drop them.


Jerry, Trey, Michael and Brian are amazing!

Larry Fast provided me with samples and loops for some of the songs, and his guidance and insights as to how some sounds were originally created and performed were absolutely critical to recreating the experience.

Jim Kleban provided me with great information as well as RMI and ARP Prosoloist samples for the Genesis songs we performed from the Lamb album.

I’d also like to thank the support team at Cycling74 as well as many users on the Cycling74 support forums. MaxMSP was a core component in making this project workable.

Norman Bedford gets a prize for organizational expertise.

Finally, my heartfelt thanks to Scott Weinberger for inviting me into this project and giving me the opportunity to achieve one of my lifelong dreams.


I hope this information is of interest. Please feel free to submit comments and questions; I’ll do my best to respond as time permits.

Using Max with the Eigenharp Alpha

April 26th, 2012


A new website has been created that focuses specifically on using Max with the Eigenharp. Please visit it for more details as well as for the Max patchers that support the Eigenharp.

Easy solution to installing Python Imaging Library on a Macintosh

August 10th, 2011

To make it easier for our customers to extract images out of PDF files without using our cloud-based PDF conversion process, I decided to investigate the use of the Python Imaging Library to build a utility program.

After downloading the source code, I tried to build it and soon ran into errors such as the following:

lipo: can't open input file: /var/folders/.........(No such file or directory)
error: command 'gcc' failed with exit status 1


lipo: can't open input file: /var/tmp//ccptvQXL.out (No such file or directory)

Turns out this was happening because my system doesn’t have the ppc binaries (nor do I want them) and one of the options being passed to the gcc compiler is

 -arch ppc

I looked around on the web and found a few pages that described this error.


However, the suggested solutions were all rather painful or awkward. I decided to trace into the file to see where it was getting the compiler options. To cut a long story short, I discovered very quickly that there’s an environment variable called ARCHFLAGS which, if defined, is used before the other mechanisms.

So I simply typed the following into the terminal window

export ARCHFLAGS="-arch i386 -arch x86_64"
python setup.py install

and everything just built and installed with no problem whatsoever.

If you do this kind of thing often, it might be worth setting the environment variable in your .bashrc or .profile
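Since child processes inherit the environment, the same fix can also be applied from a Python wrapper rather than the shell. A minimal sketch of the idea (the build invocation is shown commented out; it would need to run from the PIL source directory):

```python
import os
import subprocess

# Setting ARCHFLAGS in this process has the same effect as the shell
# `export` above: any subprocess launched from here inherits it.
os.environ["ARCHFLAGS"] = "-arch i386 -arch x86_64"

# subprocess.check_call(["python", "setup.py", "install"])  # run in the PIL source dir
```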


‘Else’ is not the counterpart of ‘If’

July 9th, 2011

I can’t stand it anymore. I honestly don’t understand why some developers (particularly those coming from the C world) continue to vertically line up else underneath if.

Nobody would ever write *

switch (expression) {
   case true:

case false:

So write

if (expression)
   then DoSomething;
   else DoSomethingElse;

If you’re writing in C or C++, use

#define then 

so you can use the keyword anyway. If your language doesn’t allow macros, just indent as if there were a then keyword.

if (expression)
   DoSomething;
   else DoSomethingElse;

That last might look a little odd by itself but once you need multiple statements inside the then and/or else, it makes total sense again.

if (expression)
   then {
      DoSomething;
      AndThisToo;
   }
   else {
      DoSomethingElse;
      AndThatToo;
   }

* By the way, I would normally indent everything under the switch as well but I’m ignoring that here to avoid an orthogonal issue about block structures. In case you’re wondering:

switch (expression)
      case true:

      case false: