Replace Apple MainStage with a Custom Max/MSP Implementation (Part 4)

February 2nd, 2012

Well, it has been a few months since I wrote the previous articles on my development of a custom alternative to Apple MainStage. After using that system for a few months, I started to figure out what else I really needed, and as of today I am comfortable that I have created an update worthy of being called version 2 (or I suppose I should call it version 2012!).

It’s going to take a while to finish this article and I’ve decided to make it visible along the way, otherwise I can’t say how long it will be before I finish it. Of course, having it visible before it’s finished will most likely act as a catalyst to get it finished ASAP.

The key new features of this version are:

  • Persistence
  • Audio mixer
  • External audio routing
  • Better encapsulation

1) Persistence

The one feature that MainStage (and other similar systems) had that was missing from my library was the ability to remember the state of parameters, so that the next time a song was loaded, any parameters that had been changed in real time while the song was previously loaded would be automatically recalled.

2) Audio mixer

I use a MOTU 828 MkIII + MOTU 8Pre and up until now I was just using MOTU’s CueMix application to do basic volume control. However, CueMix has two flaws: a) no remote control over the mixer and b) no way to process audio through software (e.g., VST effects).

I am now explicitly routing all audio through Max objects. As you will see, there have been some nice benefits to this design. Further, I have not observed any latency issues, something that was of concern to me. The mixer has channel strips with the ability to send signals to VST effects.

3) External audio routing

The new mixer supports my external audio devices (keyboards, synths, guitars, vocals) and those can also be processed with VST effects. This will in fact allow me to get rid of quite a few external stomp pedals.

4) Better encapsulation

Among other things, I have taken the [GenericVST] object and wrapped it inside various new objects that automatically route audio to the desired mixing channel or effects send bus. It is also much easier to configure parameters of a VST and to initiate a VST edit/save cycle.

Consolidated front panel

Here’s a picture of the latest version of my consolidated front panel. You will note that it has changed significantly from the original version I described when I first started this effort. (Click the image to see a larger view)

Apart from all the extra stuff in there (compared to my original version), the key change is that the bpatcher-based panels representing the control surfaces for some keyboards (Akai and AN1x), as well as the panel representing the mixer (external, VST and effects), are constructed from more primitive objects, at the bottom of which are UI objects whose state can be stored and reloaded centrally.

Persistence

While Max comes with several mechanisms for managing persistence (saving and restoring the state of a coll, the pattrstorage system, the new dict objects in Max 6), the first was too simplistic, the second was too complicated, and the third was not available for Max 5.1.9 and so was not evaluated. However, my persistence needs were quite specific. If you turn a dial or change a slider position, directly or via MIDI from a control surface, I want the current value of that dial or slider to be associated with the currently loaded song, so that when the song is reloaded later, the value is restored and sent out to whatever parameter is being controlled.

It turns out that the easiest way to do this was to use a [js] Max object. The [js] object is a way to write Javascript code that lives inside Max, complete with input and output ports and a way to respond to arbitrary incoming Max messages. Javascript trivially supports associative arrays, so all I needed was a scheme to uniquely name every slider and dial, no matter where it lives. To see why this is the case, I need to explain how the panels and mixer controls are actually implemented. Let’s examine a simple case first: the 8 dials that make up the AN1x control surface. First, here’s that section of the consolidated front panel again.

This is actually being displayed through a [bpatcher] object, as indicated by the light blue line around it. (That line is only visible when the patcher is in edit presentation mode.) If I drill in (i.e., open up this bpatcher), then one of the items is just the 8 dials.

Similarly, drilling into this bpatcher, we can see that there are in fact 8 separate bpatchers as well as a comment field to which we will return later.

Drilling into any one of these, we find the underlying dial and integer in an object called pdial (persistent dial). This is where the fun starts:

There is also a pslider (persistent slider) object and a plabel (persistent label) object, both of which are implemented in the same manner. Note that at its core, the dial can receive values addressed to the name #2_pdial, and it can send its value out as a packed name/value pair to something called #1.

Now, #1 and #2 represent the values of two arguments that are passed into the object when it is instantiated. The first argument (whose value will be the same for all objects used in a given environment) will be the name associated with a [receive] object that will pass all incoming values into a dictionary of some kind. We will see what that dictionary looks like later.

Creating unique names for user interface objects

The second argument is much more interesting. We need a mechanism where we can end up with unique names for different dials or sliders, each of which should have little or no knowledge of what else is there. To cut a long story very short, the solution I chose was to define at every level a name consisting of whatever is passed in from the parent level concatenated with whatever makes sense for the object in question. This is easier to demonstrate than to describe.

The second dial in the AN1x console receives messages addressed to a receiver with the name

AN1x1_AN1xConsole_2_pdial

Similarly, the 4th channel strip of the MOTU mixer receives messages addressed to

MainMixer_MOTUKeyboardsMixer_4_ExternalChannelStrip_ChannelStripVol_pslider

and the lower SEND dial for the 5th channel strip is addressed by

MainMixer_MOTUKeyboardsMixer_5_ExternalChannelStrip_ChannelStripEFX2_EffectsSend_pdial

If you parse that last one, reading right to left, it says that there is a persistent dial in an EffectsSend object which represents the second EFX of a channel strip living in an ExternalChannelStrip, the 5th channel strip contained in the MOTUKeyboardsMixer which lives (along with other things) in the MainMixer patch.
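
Because the scheme is nothing more than string concatenation, it can be illustrated with a few lines of Javascript. To be clear, this is only an illustration; in the actual patchers the concatenation is done with #1/#2 bpatcher arguments rather than code:

// Illustration only: each level appends its own segment to whatever
// name its parent passed down.
function childName(parentName, localName) {
    return parentName + "_" + localName;
}

var n = "MainMixer";
n = childName(n, "MOTUKeyboardsMixer");
n = childName(n, "5");                     // 5th channel strip
n = childName(n, "ExternalChannelStrip");
n = childName(n, "ChannelStripEFX2");      // the second EFX send
n = childName(n, "EffectsSend");
n = childName(n, "pdial");
// n is now the full receiver name shown above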

The nice thing about this is that the only objects that have to know this full name are the bottom level persistent dials, sliders and labels themselves.

Connecting with the dictionary

A Dictionary object is inserted into every top level patcher that represents a song. (Remember that each such patcher contains the MIDI routings and VSTs that are needed to play a particular song.) Here’s a view of a Dictionary as contained in a song patcher.

The single parameter is the name that will be used by all persistent objects when they send their values. It can be anything you want. In my system, this name is DHJConsole.

Here’s the contents of the dictionary patcher. (Click to open full view in a separate window)

The core that makes this work is the little Javascript object near the bottom (called NVPair) that is connected to a [forward] object. NVPair implements an associative array with functionality to insert name/value pairs, retrieve a value given a name, operations to save or restore the array to and from a file, and a function called dumpall that sends out all name/value pairs.
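
To give a feel for how little code this requires, here is a minimal sketch of what NVPair.js could look like. This is not the actual source: the message names (insert, dumpall, save, reload) follow the description above, but the file format and other details are my own illustrative assumptions.

// NVPair.js (sketch): an associative array living inside a Max [js] object.
inlets = 1;
outlets = 1;

var table = {};

// "insert <name> <value>": store a pair, updating any existing entry
function insert(name, value) {
    table[name] = value;
}

// "dumpall": send every pair out as "<name> <value>" so that a [forward]
// object can route each value to its identically named [receive]
function dumpall() {
    for (var name in table) {
        outlet(0, name, table[name]);
    }
}

// "save <filename>": write one "name value" line per entry
function save(filename) {
    var f = new File(filename, "write");
    for (var name in table) {
        f.writeline(name + " " + table[name]);
    }
    f.close();
}

// "reload <filename>": rebuild the table from a previously saved file
// (assumes numeric values for simplicity)
function reload(filename) {
    table = {};
    var f = new File(filename, "read");
    while (f.position < f.eof) {
        var parts = f.readline().split(" ");
        table[parts[0]] = parseFloat(parts[1]);
    }
    f.close();
}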

Let’s go through the steps that occur when you turn a pdial (persistent dial) named AN1x1_AN1xConsole_2_pdial to the value 6:

  1. The pdial sends out a list containing two values (AN1x1_AN1xConsole_2_pdial 6) through a [send DHJConsole] object
  2. The dictionary receives this list through the [Receive #1] (half-way down on the left hand side). Remember that the #1 will have been replaced with DHJConsole when the dictionary is actually instantiated.
  3. The text insert is prepended to the list and so the message
    insert AN1x1_AN1xConsole_2_pdial 6
    is sent into the NVPair.js object.
  4. The NVPair object creates an entry called AN1x1_AN1xConsole_2_pdial in an associative array and sets its value to 6. This is known as a name/value pair. Note that if there was already such a name in the associative array, then the previous value would just get updated to the new value, 6.

Steps like these occur whenever any persistent object is modified. The NVPair object also has functions that can save and reload all names and values.

When a patcher containing a dictionary is closed, the contents of the associative array are saved to a file whose name is the same as the patcher (but with a different extension). When a patcher containing a dictionary is opened, the contents are reloaded into the associative array and then the following steps occur automatically after the patcher has finished loading any VSTs (those are the only objects that can take quite a few seconds to load).

  1. The dumpall message is sent to the NVPair
  2. This message causes the NVPair to iterate through the array, sending out every entry as a name/value pair. This is where the fun starts.
  3. For each entry, the value will be sent out first and it gets stored in a temporary variable (the [pv value] object)
  4. The [forward] object is a variant of the [send] object that allows the name to be changed. So the name that will be used is the name that comes out of the NVPair.
  5. Therefore, the value will be sent to any receiver whose name is the same as the name that came from the NVPair.
  6. Each name will exist in one single [receive] object corresponding to the persistent object that was created, as described in the “Creating unique names for user interface objects” section above. That is how each individual user interface element is updated.
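
Incidentally, steps 3 through 5 could be collapsed into one line of [js] using Max’s messnamed() function, which delivers a message directly to any named [receive]. The actual patcher uses the [pv]/[forward] pair instead, so treat this purely as an equivalent sketch:

// Sketch: deliver each dumped pair straight to the [receive] object
// that shares its name, replacing the [pv value] + [forward] combination.
function list(name, value) {
    messnamed(name, value);
}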

In upcoming articles, we will talk about the audio mixer, routing, and encapsulation.

Categories: Music Technology

Replace Apple MainStage with a Custom Max/MSP Implementation (Part 3)

September 24th, 2011

The earlier articles are

Part 1

Part 2

I know I was supposed to write about the front panel user interface in part 3 but I got very distracted after deciding it was time to add VST support to my implementation. Up until now, Max has only been involved in MIDI routing to external synths and I was really keen to be able to leverage all the soft synths that I had acquired over time to use with MainStage.

MainStage only supports AudioUnits and Logic Studio proprietary synths, and Max officially only supports VSTs (although a beta au~ Max object is available here), but the good news is that pretty much all third party softsynths are provided in both AU and VST formats.

Max has an object called [vst~] but it’s quite complicated to configure. For example, take a look at the example from the Max documentation.

I’m not going to explain this here as the point of the article is to encapsulate all that stuff into a more usable form. If you want to know more about the underlying operation, see this documentation. The only thing important to note here is that all (and I mean all) control messages are sent through the first inlet.

My goal was to make it really easy to create “instruments” and use them in song patchers the same way as external MIDI synths are already supported (see my original article).

GenericVST

The key object I created is called GenericVST and it takes one argument, the name of the VST that you want to use. Here’s one that loads the free Crystal VST. By the way, if your VST has spaces in the name, then the entire name should be enclosed in quotes.

This object has 9 inlets and 2 outlets. The two outlets are Audio L and Audio R respectively and can be connected directly to the system audio output (see below) or can be connected to the audio inputs (the last two inlets of the object) of another GenericVST object for effects processing.

Inlets (from left to right)

  1. Patchname
  2. Notes
  3. Aftertouch
  4. Pitchbend
  5. MIDI Channel (defaults to 1)
  6. VST Param (see below)
  7. Quickedit the VST GUI (send a Bang in here)
  8. Audio L in
  9. Audio R in

Here’s an example patcher that uses the above object.

When this patcher loads, the following steps occur:

  1. The VST called Crystal is loaded
  2. A previously stored synth patch (SadStrings) is loaded into the VST

Four inlets are exposed, used to send MIDI notes (note number/velocity pairs), aftertouch, pitchbend, and MIDI expression (CC 1) values into the VST. Understand that these inlets represent how the VST will respond to incoming values. For example, you don’t actually have to send the aftertouch value from your keyboard into this inlet. If you connect a slider on your control surface into the aftertouch inlet, then the slider will cause the VST to respond to whatever effect is associated with aftertouch.

Now, why does the fourth inlet have the extra object (dhj.vst.midi.CC) in the path? Well, remember I mentioned earlier that all messages to a VST are sent through a single inlet of the [vst~] object. Among other things, that means you can’t just send raw MIDI data into a [vst~] the way you can send it into an object representing an external MIDI synth.

They have to be wrapped into a special message that consists of the symbol midievent followed by the appropriate number of bytes for the desired MIDI event. For MIDI notes, aftertouch, and pitchbend, this is done inside the GenericVST (as we will see in a future article). However, because there can be many different CC numbers (128, in fact), it’s not practical to create an inlet for each one, particularly since only a couple are ever likely to be used in practice. We will come back to this object later.
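
To give an idea of what the wrapping involves, here is a sketch of how an object like dhj.vst.midi.CC could be written with [js]. The actual object may well be an ordinary Max abstraction, so treat the details (the argument handling in particular) as illustrative:

// Sketch: wrap a raw controller value in the "midievent" message that
// [vst~] expects. The CC number and MIDI channel come from the object's
// arguments.
inlets = 1;
outlets = 1;

var ccnum = jsarguments.length > 1 ? jsarguments[1] : 1;    // default: CC 1
var channel = jsarguments.length > 2 ? jsarguments[2] : 1;  // default: channel 1

function msg_int(value) {
    // 0xB0 (176) is the control change status byte; add the zero-based
    // channel, then append the controller number and controller value.
    outlet(0, "midievent", 176 + (channel - 1), ccnum, value);
}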

GenericVST Presentation Mode

Let’s first look at what happens if you double-click on the GenericVST in a live patch.

We will examine parts of the internals of this object later. The buttons let you quickly edit the VST, save the (possibly edited) settings, and reload settings. By default, the textfield contains the name of whatever patch was initially loaded, but you can change the name to something else before you save. If you are actually creating a derivative sound, as opposed to just modifying the sound for the main patcher, then you will want to copy the instrument patcher to a new file and then change the patchname message inside it to match your new name.

The parameter fields show you what changes when you adjust parameters in the VST editor. This information is used in conjunction with another object called [VSTParam] so that you can easily associate any slider, knob or button on your control surface with any parameter in the VST. We will examine the [VSTParam] object shortly.

If we click the button “Edit the GUI”, the VST’s GUI editor is displayed.

Now, here’s the key concept to understand. Every parameter in a VST is identified by a unique integer index, and the value of each parameter ranges from 0.0 to 1.0, and that’s all. This is true whether you are changing the position of a slider, selecting a different value from a drop-down combo box, or adjusting one of the points in a graphic envelope. Anything that changes the actual state of the VST behaves this way. Buttons and menus that just show you a different view or page of the VST have no effect.

As you change parameters, you can see the index and new value in the Max window. Here are some examples.

Moved Voice 1 all the way to the left

Moved Voice 2 all the way to the right

Moved the yellow point to the right

Mapping MIDI CC data to VST values

MIDI control change values consist of a controller number between 0 and 127 and a value for that controller, also between 0 and 127. A VST with 1800 parameters (say) will have parameter indices from 0 to 1799 and values between 0 and 1. How then do we arrange for a slider on your control surface to control the Voice 1 parameter in the Crystal VST?

The VSTParam object

Let’s look at the purpose of one specific inlet of the GenericVST object by hovering the mouse cursor over it.

As you can see, the inlet expects a MIDIEvent or a VSTParam. OK, so now, here’s a patch that lets you control the Voice 1 parameter from the first slider of an Akai surface.

The AkaiSurface object encapsulates a collection of sliders, knobs and button controls, and each outlet sends out a single value between 0 and 127 representing the position of the corresponding control. Note that this is NOT a full MIDI message, just a single value. So how does a single value between 0 and 127 get converted into the desired parameter index and a value between 0.0 and 1.0, as required?

First of all, note that the VSTParam object takes a single argument that represents the parameter index, which for the Voice 1 parameter is 55. Let’s look inside the instantiated VSTParam by double-clicking on it.

The first two parameters of the [scale] object represent the minimum and maximum values of the incoming data. The second two parameters represent the minimum and maximum values of the outgoing data, and incoming values get mapped linearly into the required output value. For example, an incoming value of 64, representing the half-way position of a slider, would come out as 0.5, which is half way between 0 and 1. The [sprintf] object then formats the parameter index and value into a new message which is sent out (and therefore sent into the VST). Note that the [sprintf] object already has 55 in it. That shows up because of how we created the [VSTParam] in the first place, with 55 as its sole argument.
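
For comparison, the same scale-and-format step can be expressed in a few lines of [js], taking the parameter index as an argument just as [VSTParam] does. This is a sketch, not the actual implementation, which is built from [scale] and [sprintf] as described above:

// Sketch: scale a 0..127 controller value to 0.0..1.0 and pair it with
// the parameter index supplied as the object's argument (55 here).
inlets = 1;
outlets = 1;

var paramIndex = jsarguments.length > 1 ? jsarguments[1] : 0;

function msg_int(v) {
    // 0 -> 0.0, 127 -> 1.0, 64 -> (roughly) the half-way point
    outlet(0, paramIndex, v / 127.0);
}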

So if you want to control the filter cutoff frequency, resonance, LFO frequency and perhaps the filter decay from an ADSR, then all you need to do is “wiggle” those parameters on the VST, watch what value is displayed in the parameter index, then create a few [VSTParam] objects with the associated parameter indices and connect them up. For example:

To encapsulate this into a standalone synth that exposes just these parameters but can be controlled from different surfaces (say), just replace the inputs to those VSTParams (and the other inputs of the GenericVST itself) with inlets as follows.

I saved this patcher with the name LeadSaw and now I can create a new patcher that just uses this one.

This new instrument can now be used just like any external MIDI instrument by connecting keyboards and surfaces as desired. For example, here is a complete patch that uses my Yamaha AN1x to control this synth, where the modwheel is used to control the envelope decay and my expression pedal controls the filter cutoff frequency, which is the essence of a wah-wah pedal. The first three ports (outlets to inlets) are notes, aftertouch and pitchwheel, which is my standard for all objects representing keyboards.

In part 4, we will dissect the GenericVST object itself.

Categories: Music Technology

Why we created Scorecerer – a brief history of our local universe

September 21st, 2011

While there have been a few devices around to display music notation, the problem was that it took far too long for me to actually get my sheet music into those devices. As a technologist and a serious amateur musician, I always wanted to use one of those devices, and so many years ago, with great excitement, I bought a MusicPad tablet. However, I found that the process of pulling in my sheet music was painfully slow: I had to use external image processing software such as Photoshop to clean up the scanned images, and so forth. Consequently, it took about 5-10 minutes to import each page of sheet music. No way could I afford to spend that amount of time.

So I ended up not really using the tablet for anything serious.

Years later, along came the Kindle DX from Amazon. This device had the ability to display images and PDF files, and I thought, if only I had a way to get my sheet music into it efficiently, this could be really cool. So my partners and I created Deskew Technologies, LLC. By the way, the word “deskew” (pronounced dee-skew) means to straighten. Amazingly, the domain deskew.com was not taken, so we grabbed it.

We built the first version of the desktop application, which allowed you to just drag scanned images into it. It didn’t care too much about the resolution, gray-scale, color or other “stuff” that one normally worries about with scanners. Nor did you have to worry too much about getting the paper perfectly aligned or straight. Just throw the page on the scanner and save the result on your computer. It also understands most popular image file formats, so you can use whatever format your scanner is currently configured for; there is no need to make sure it’s suitable for Scorecerer. Scorecerer would then straighten the music automatically and remove empty borders so as to maximize the available screen space. It could then “publish” the music directly to the Kindle DX, optimized to display nicely and as fast as possible. All of these operations were designed so that you didn’t have to spend any time “futzing” to get it right. So with Scorecerer, the 5-10 minutes became 5-10 seconds, or at worst just slightly longer than your scanning speed (because it takes a second or two of your time to actually drag the scanned images into Scorecerer Desktop!)

It worked very well, but the speed at which the Kindle DX could turn the page was not so great, so although it was usable, it wasn’t wonderful.

We also built Scorecerer Player, an application designed just for displaying Scorecerer-processed sheet music on a laptop. We figured that Windows-based tablets would arrive, and perhaps even a Mac OS X based tablet was on the cards. Scorecerer Player is still included as part of the Scorecerer package, and it also supports remote MIDI so you can turn the page by pressing a button or a pedal on your MIDI rig.

Then we heard that the iPad was coming. We signed on to the Apple Developer Program, got the iPad SDK and built a “Player” for it. Scorecerer Desktop could now publish in a format optimized for the iPad, and boy, were we happy with the page turning speed. We submitted Scorecerer iPad to the App Store in time for the launch.

We recently added wifi-based publishing so you don’t have to depend on iTunes to transfer sheet music. This is particularly worthwhile for those people who have hundreds or thousands of songs. There’s also a quick search built into Scorecerer iPad so you can quickly find the song you need.

We’re still adding new features, based on feedback we’re getting from users. Feel free to contact us through our support center with your own requests.

Categories: Musings

Easy solution to installing Python Imaging Library on a Macintosh

August 10th, 2011

To make it easier for our customers to extract images out of PDF files without using our cloud-based PDF conversion process, I decided to investigate using the Python Imaging Library to build a utility program.

After downloading the source code, I tried to build it and soon ran into errors such as the following:

lipo: can't open input file: /var/folders/.........(No such file or directory)
error: command 'gcc' failed with exit status 1

or

lipo: can't open input file: /var/tmp//ccptvQXL.out (No such file or directory)

It turns out this was happening because my system doesn’t have the ppc binaries (nor do I want them), and one of the options being passed to the gcc compiler is

 -arch ppc

I looked around on the web and found a few pages that described this error.

For example:

http://www.p16blog.com/p16/2008/05/appengine-installing-pil-on-os-x-1053.html
http://passingcuriosity.com/2009/installing-pil-on-mac-os-x-leopard/
http://www.kelvinwong.ca/tag/python-imaging-library/

However, the suggested solutions were all rather painful or awkward. I decided to trace into the setup.py file to see where it was getting the compiler options from. To cut a long story short, I discovered very quickly that there’s an environment variable called ARCHFLAGS and, if that is defined, it is used before other mechanisms.

So I simply typed the following into the terminal window:

export ARCHFLAGS="-arch i386 -arch x86_64"
python setup.py install

and everything just built and installed with no problem whatsoever.

If you do this kind of thing often, it might be worth setting the environment variable in your .bashrc or .profile

Categories: Software Development

Replace Apple MainStage with a custom Max/MSP implementation (Part 2)

July 26th, 2011

In part 1 of this article, I introduced some of the basic Max patchers I created such as generic MIDI input and output devices and then instrument-specific versions. After a few days of testing, I ran into some bizarre behavior where Scorecerer would not change pages via MIDI remote control, but only for some songs.

I finally tracked the problem down, and it turned out that there was a ‘double bug’, one in Scorecerer and one in my Max implementation, and the two conspired to cause the problem. What was happening was that when I pressed the button on my MPK61 to go to the next page in Scorecerer, Scorecerer didn’t respond at all, but the (highly recommended) MIDI Monitor app I was running reported that two copies of the CC event were being sent to Scorecerer. I’ll get back to the reason two copies were being sent in a moment, as that caused me to change the original design of the MIDI device patches.

The reason Scorecerer didn’t respond at all (as opposed to changing the page twice, which is what should have happened) is that I was (stupidly) using the length of the incoming packet to decide whether I was getting a CC event (3 bytes) or a ProgramChange event (2 bytes). However, in this particular case the packet length was 6 (two CC events), so the packet was just being ignored. I fixed this by switching on the actual status byte, and I’m glad I found this issue: if it had been reproduced by customers but not by us, it would have been very difficult to track down.
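
To make the fix concrete, here is a sketch of what switching on the status byte looks like, written in Javascript with hypothetical handler names (Scorecerer’s actual source is different). Because each event is classified by its own status byte, a 6-byte packet holding two CC events produces two page turns instead of being ignored:

// Hypothetical names throughout; an illustration of the fix only.
function handleCC(controller, value) { /* e.g., turn the page */ }
function handleProgramChange(program) { /* e.g., select a song */ }

function handlePacket(bytes) {
    var i = 0;
    while (i < bytes.length) {
        var status = bytes[i] & 0xF0;
        if (status === 0xB0) {            // control change: 3 bytes
            handleCC(bytes[i + 1], bytes[i + 2]);
            i += 3;
        } else if (status === 0xC0) {     // program change: 2 bytes
            handleProgramChange(bytes[i + 1]);
            i += 2;
        } else {
            i += 1;                       // skip anything unrecognized
        }
    }
}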

Let’s get back to the reason Max was (sometimes) sending the event twice. Max has a very nice feature where, rather than having to use a single instance of a particular object, you can instantiate it multiple times and place it conveniently. For example, suppose you want an incoming note to be played on two synthesizers, with one of them playing the note transposed up by 7 semitones. You could use the following simple patch.

This will work absolutely fine but if you decide that you want to perform more sophisticated operations along the way, the view will get complicated due to too much entanglement. A better solution is to use the following patch.

In this version, even though each noteout object refers to a different output device (not shown here), both notein objects can refer to the same input device. As you will see when I show a patcher for a complete song further down, this separation can make it much easier to view and understand the patcher.

Now, if you recall from part 1, I had defined a generic MIDI In patcher and then created keyboard-specific devices with it. In particular, those devices had some explicit outlets (representing notes, pitchbend and aftertouch) and used multiple send objects to make incoming CC events with different numbers easily available to multiple receivers. Therein lies the problem. Take a look at the following patcher, representing the input of my Yamaha AN1x.

As you can see, it has some outlets for notes, pitchbend and aftertouch, and it also has a collection of send objects used to send out data streams for the knobs and pedals. Now consider what happens if you create two or more of these patchers. There’s no impact on wired connections (if you don’t connect anything to the outlets, nothing happens), but anytime a CC event occurs, it will be sent out as many times as there are patcher instances. Some of my song patchers had the instrument device instantiated more than once (for the reasons described earlier) and that’s what led to the problem.

After finally figuring out what happened, I changed the design slightly to address this problem. The first thing I did was get rid of the explicit outlets completely and replace them with send objects. So the AN1x input patcher now looks as follows:

Now, every item of interest uses the send/receive mechanism. The second design change was to create a patcher that contains a single instance of all my input devices. This is called PhysicalInputDevices. Here it is.

As you can see, this patcher contains one object for each MIDI input device of interest. A single instance of this patcher is created, after which all outputs of interest from all of these devices are immediately available wherever they are needed, just by inserting a receive object into any patcher and using the desired argument name.

Songs

Now that the basic construction has been explained, let’s take a look at an actual song patcher. Although I’ll include an entire song patcher here just so you can see the whole thing, I’ll then focus on one piece at a time to explain how it works. A song patcher represents the same functionality as an individual MainStage patch, and so there are multiple song patchers, each (typically) representing the required functionality for a specific song. Here is a patcher that I use with my band to perform the song All I Need is a Miracle (Mike and the Mechanics). Take a few moments to just look around the patcher.

Phew! OK — let’s look at this thing piecemeal and all will become clear.

The loadbang object is triggered automatically when a Max patcher is opened. The loadbang is connected to a send object with an argument called BigBang and to a print object. The latter is just used for debugging and writes information to a console window. Various receive objects, both within this song patcher and in some other auxiliary patchers (such as the FrontPanelDisplay, which provides a GUI similar to a MainStage layout showing what knobs and sliders are active), will be triggered as a consequence, allowing various initialization to occur, as we will see shortly.

Sending program changes to external devices

Every song patcher contains a SystemProgramChange patcher to make it easy to send program changes to all devices in one go.

Here, we have six messages, each of which consists of a single number. Those numbers happen to be the MIDI Program Change values for my external devices for the Miracle song. The SystemProgramChange patcher encapsulates my external devices and exposes just the ProgramChange parameter of each of those external devices. Here’s what it looks like under the covers.

Note that because some synthesizers use 0 for their first patch and others use 1, I’ve enabled some of the devices to subtract 1 from the incoming value. That allows me to use the same values as displayed by the devices regardless of whether they start from 0 or from 1.

From input to output – our first sound

Let’s now take a look at one of the playblocks (my term for a collection of objects that actually lets you play one of your keyboards and get some sounds out of it).

This is the closest to a combination of a MainStage external MIDI channel strip coupled with the MIDI input section inspector. This is where you can define a routing from a keyboard (MIDI input device) to a synth (MIDI output device). Here we are using the AN1x to control channel 4 of the Receptor which has received a program change (see above) so that it produces a “pad” sound. On the right, we have connected Knob1 to the Volume inlet of the Receptor channel. Next to it, we have connected the pitchbend wheel of the AN1x to the pitchbend inlet of the Receptor (MIDI channel 4). Unlike in MainStage where some connections happen automatically unless you explicitly block them (great for simple setups but not so good for anything complex), in this Max environment, if you don’t make the connection, you don’t get any result.

The chunk on the left is a little more interesting. You can see that we are receiving incoming notes from the AN1x, but they are not going directly to the Receptor. Instead, they are going through another custom object called KeyboardLayer. Let’s look at that object.

All the objects here are built-in Max objects; please refer to the Max documentation for the deep details of what’s going on here. The left inlet takes a list of two values representing a note (the note number and the velocity value). The right inlet takes a list representing the lowest allowed key, the highest allowed key and a transpose amount. The unpack object breaks up an incoming list into separate values, and its optional arguments define the types of those values (the default is two integers). So the note number and velocity are extracted from the incoming note. We don’t care about the velocity value, so it simply gets passed on to the pack object where (along with an eventually arriving note number) it will be combined back into a note and sent out.

The split object determines what range of values is allowed through its left outlet. The default (as defined by the arguments) is to allow all MIDI note numbers through. However, if you send min and max values to the two right inlets, then it will only allow through notes that are within that min and max range. Any note number that gets through then has the transpose value, if provided, added to it.
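
For readers who prefer code to patch cords, the same logic can be sketched in a [js] object. The real KeyboardLayer is built from the Max objects just described, so this is only an equivalent illustration:

// Sketch of the KeyboardLayer logic: notes inside the allowed range pass
// through transposed; anything outside the range is dropped.
inlets = 2;   // left: note/velocity pairs; right: low/high/transpose config
outlets = 1;

var lowKey = 0, highKey = 127, transpose = 0;

function list() {
    var a = arrayfromargs(arguments);
    if (inlet == 1) {
        // Right inlet: "low high transpose" (sent by BigBang at load time)
        lowKey = a[0];
        highKey = a[1];
        transpose = a[2];
    } else {
        // Left inlet: "note velocity"
        var note = a[0], velocity = a[1];
        if (note >= lowKey && note <= highKey) {
            outlet(0, note + transpose, velocity);
        }
    }
}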

Referring back to that playblock, it should now be clear that all notes on the AN1x are sent to the Receptor but transposed up by 12 semitones. The BigBang message, triggered by the loadbang when the song is loaded, is used to initialize the desired range and transpose for the specific KeyboardLayer object.

Splits and layers

If you look at the playblocks for the Polymoog and Piano in the song patcher above, you can see that separate sections of the Akai keyboard are being used for each of those sounds, i.e., MIDI notes 36 to 53 inclusive are sent to a piano sound (transposed up one octave) and MIDI notes 84 to 96 inclusive are sent to the Polymoog sound. By adjusting the ranges, you can configure any splits and/or layers that you need.

In Part 3, we will introduce a simple mechanism to implement velocity mapping and also talk about the consolidated front panel, through which one can quickly see which knobs and sliders are associated with which parameters on the various devices.

Categories: Uncategorized