Here are the numbers for all defined MIDI controllers.
The first 32 numbers (0 to 31) are coarse adjustments for various parameters, and the numbers from 32 to 63 are their respective fine adjustments (i.e., add 32 to the coarse adjust number to get the fine adjust number), except for the General Purpose Sliders, which have no fine adjust equivalents. For example, the coarse adjustment for Channel Volume is controller number 7; the fine adjustment is controller number 39 (7 + 32). Many devices use only the coarse adjustments and ignore the fine adjustments.
0 Bank Select (coarse)
1 Modulation Wheel (coarse)
2 Breath controller (coarse)
4 Foot Pedal (coarse)
5 Portamento Time (coarse)
6 Data Entry (coarse)
7 Volume (coarse)
8 Balance (coarse)
10 Pan position (coarse)
11 Expression (coarse)
12 Effect Control 1 (coarse)
13 Effect Control 2 (coarse)
16 General Purpose Slider 1
17 General Purpose Slider 2
18 General Purpose Slider 3
19 General Purpose Slider 4
32 Bank Select (fine)
33 Modulation Wheel (fine)
34 Breath controller (fine)
36 Foot Pedal (fine)
37 Portamento Time (fine)
38 Data Entry (fine)
39 Volume (fine)
40 Balance (fine)
42 Pan position (fine)
43 Expression (fine)
44 Effect Control 1 (fine)
45 Effect Control 2 (fine)
64 Hold Pedal (on/off)
65 Portamento (on/off)
66 Sostenuto Pedal (on/off)
67 Soft Pedal (on/off)
68 Legato Pedal (on/off)
69 Hold 2 Pedal (on/off)
70 Sound Variation
71 Sound Timbre
72 Sound Release Time
73 Sound Attack Time
74 Sound Brightness
75 Sound Control 6
76 Sound Control 7
77 Sound Control 8
78 Sound Control 9
79 Sound Control 10
80 General Purpose Button 1 (on/off)
81 General Purpose Button 2 (on/off)
82 General Purpose Button 3 (on/off)
83 General Purpose Button 4 (on/off)
91 Effects Level
92 Tremolo Level
93 Chorus Level
94 Celeste Level
95 Phaser Level
96 Data Button increment
97 Data Button decrement
98 Non-registered Parameter (fine)
99 Non-registered Parameter (coarse)
100 Registered Parameter (fine)
101 Registered Parameter (coarse)
120 All Sound Off
121 All Controllers Off
122 Local Keyboard (on/off)
123 All Notes Off
124 Omni Mode Off
125 Omni Mode On
126 Mono Operation
127 Poly Operation
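To make the coarse/fine pairing concrete: a Control Change message is a status byte (0xB0 ORed with the channel number) followed by the controller number and a 7-bit value. The helper below is a hypothetical sketch (the function name is mine, not from any library) that builds the two messages needed to send a full 14-bit Channel Volume setting using controllers 7 and 39:

```python
def volume_14bit(channel, value):
    """Build the two Control Change messages for a 14-bit
    Channel Volume value (0..16383) on the given channel (0..15).
    Controller 7 carries the coarse (most significant) 7 bits;
    controller 39 (7 + 32) carries the fine (least significant) 7 bits.
    """
    coarse = (value >> 7) & 0x7F
    fine = value & 0x7F
    status = 0xB0 | (channel & 0x0F)
    return [status, 7, coarse, status, 39, fine]

# Full volume on channel 0: coarse 127, fine 127
print(volume_14bit(0, 16383))  # [176, 7, 127, 176, 39, 127]
```

A device that ignores fine adjustments would simply act on the controller 7 message and discard the controller 39 message.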
Hardware and software MIDI controllers
The following are classes of MIDI controller:
The human interface component of a traditional instrument redesigned as a MIDI input device. The most common type of device in this class is the keyboard controller. Such a device provides a musical keyboard and perhaps other actuators (pitch bend and modulation wheels, for example) but produces no sound on its own. It is intended only to drive other MIDI devices. Percussion controllers such as the Roland Octapad fall into this class, as do guitar-like controllers such as the SynthAxe and a variety of wind controllers.
Electronic musical instruments, including synthesizers, samplers, drum machines, and electronic drums, which are used to perform music in real time and are inherently able to transmit a MIDI data stream of the performance.
Pitch-to-MIDI converters, such as those used in guitar synthesizers, which analyze a pitch and convert it into MIDI data. Several devices do this for the human voice and for monophonic instruments such as flutes, for example.
Traditional instruments such as drums, pianos, and accordions which are outfitted with sensors and a computer which accepts input from the sensors and transmits real-time performance information as MIDI data.
Sequencers, which store and retrieve MIDI data and send the data to MIDI enabled instruments in order to reproduce a performance.
MIDI Machine Control (MMC) devices such as recording equipment, which transmit messages to aid in the synchronization of MIDI-enabled devices. For example, a recorder may have a feature to index a recording by measure and beat. The sequencer that it controls would stay synchronized with it as the recorder’s transport controls are pushed and corresponding MIDI messages transmitted.
MIDI controllers in the data stream
Modifiers such as modulation wheels, pitch bend wheels, sustain pedals, pitch sliders, buttons, knobs, faders, switches, ribbon controllers, etc., alter an instrument’s state of operation, and thus can be used to modify sounds or other parameters of music performance in real time via MIDI connections. The 128 virtual MIDI controllers and their electronic messages connect the actual buttons, knobs, wheels, sliders, etc. with their intended actions within the receiving device.
Some controllers, such as pitch bend, are special. Whereas the data range of most continuous controllers (volume, for example) consists of 128 steps ranging in value from 0 to 127, pitch bend data may be encoded with over 16,000 data steps. This produces the illusion of a continuously sliding pitch, as in a violin's portamento, rather than the series of zippered steps heard when a guitarist slides a finger up the frets of a guitar's neck. Thus, the pitch wheel on a MIDI keyboard may generate large amounts of data, which can lead to a slowdown of data throughput. Many sequencers can "thin" pitch-bend or other continuous controller data by keeping only a set number of messages per second, or by keeping only messages that change the controller by at least a certain amount.
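The two ideas above can be sketched in a few lines. Pitch bend is its own message type (status 0xE0 for channel 0) carrying a 14-bit value split across two data bytes, giving 16,384 steps; and "thinning" can be as simple as dropping any value too close to the last one kept. The function names and the 14-bit split-into-LSB/MSB layout below follow the MIDI 1.0 wire format, but the thinning threshold is an arbitrary illustration:

```python
PITCH_BEND_STATUS = 0xE0  # Pitch Bend status byte for channel 0

def encode_bend(value):
    """Split a 14-bit pitch bend value (0..16383, centre = 8192)
    into the LSB-then-MSB data bytes of a Pitch Bend message."""
    return [PITCH_BEND_STATUS, value & 0x7F, (value >> 7) & 0x7F]

def thin(values, min_change=64):
    """Keep only values differing from the last kept value by at
    least min_change -- one simple data-thinning scheme."""
    kept = []
    for v in values:
        if not kept or abs(v - kept[-1]) >= min_change:
            kept.append(v)
    return kept

print(encode_bend(8192))                          # [224, 0, 64] -- centred wheel
print(thin([8192, 8200, 8300, 8290, 9000], 100))  # [8192, 8300, 9000]
```

A real sequencer would also keep the final value of a bend gesture so the wheel doesn't appear to stop short, but the core idea is the same.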
The original MIDI spec included 128 virtual controller numbers for real-time modification of live instruments or their audio. MIDI Show Control (MSC) and MIDI Machine Control (MMC) are two separate extensions of the original MIDI spec, expanding the protocol far beyond its original intent.
Over the course of the past few years, I've been attempting to help a guitarist rig up a MIDI studio with devices that would be controlled by a MIDI guitar. A MIDI guitar is simply a guitar fitted with electronic circuits which convert the guitarist's picking and fretting movements into appropriate MIDI messages.
In a "normal" guitar, the "sound" of the strings is detected by magnetic pickups, and this electrical signal is amplified to the point that humans can readily hear it (usually well into the next city). Now, this analog signal is just fine for feeding into an amplifier, but of course, MIDI is a digital signal. You can't send the output of those pickups over a MIDI cable and expect a MIDI device to respond, just as you can't take the audio output of an electronic keyboard and run it into another unit's MIDI jack. In an electronic keyboard, when a musician depresses a key, this is detected and immediately converted to a MIDI message. This is relatively easy to do because each key makes its own "contact" with the circuitry that determines which note is being held. There is only one middle C on a keyboard; the rest of the keys are other notes. Contrast this with a guitar. Certain notes can be played in several different "places" on a guitar (i.e., different frets on different strings), so the situation is a lot more ambiguous. Nevertheless, it's possible to put a pickup on every single fret of a guitar so that some electronic circuit can determine exactly what fret the musician is holding. Then, some circuit would need to detect which of the six strings are vibrating so it can be determined which notes on that fret are actually being played. This would have proven to be a reliable and efficient method of producing MIDI codes. So, is this what the music manufacturers did? No. Most decided to take a much cheaper, but infinitely inferior, approach.
The inferior method that the manufacturers chose involved "pitch-to-voltage conversion". Quite simply, instead of having the circuitry analyze what the musician was doing with his fretting and picking directly from the frets and strings, the circuitry got this information second-hand from the output of the magnetic pickups. Unlike on an electronic keyboard, where physically pressing the key generates the MIDI code, on a MIDI guitar the circuitry has to analyze the "guitar sound" (i.e., the pitch of the note being played) in order to generate the MIDI code for it. It takes a long time to do this (the circuitry has to analyze at least one entire cycle of the waveform), long enough that the human ear hears a delay between picking the note and hearing the results of the generated MIDI code. This is very disconcerting. It's the primary reason that MIDI guitars have been almost universally shunned by guitarists. The manufacturers killed their own potential market with a bad design flaw, but this was probably destined to be, because it's questionable whether guitarists would have supported a more expensive and less familiar, yet better, idea. (We keyboard players DID support technical gizmos in the mid-seventies, some of them rather unreliable, with the net result that we eventually got the best that could be offered. Nowadays, if you want to get involved in electronic music, you have to have some keyboarding skills.) NOTE: A few esoteric and EXPENSIVE MIDI guitar systems were developed that didn't rely on pitch-to-voltage conversion, notably the SynthAxe. Again, these systems weren't purchased in enough quantity to warrant further development and price reduction.
When a guitarist plays, sometimes unintended vibrations of a string occur (i.e., a string next to the one that is picked also vibrates due to being physically hit by the pick or hand, or through vibrations of the neck or adjacent strings, or the string continues to vibrate some even after the finger is removed from the fret). This really isn't too distracting when you listen to the audio output of the guitar. Usually, you'll hear a very brief and/or soft pitch that is masked by the intended pitches. It's part of the character of a guitar performance, although excessive amounts of unintended string vibration usually result in a guitarist's performance being described as "sloppy" (or "heavy metal" if you're a teenager). Another disadvantage of pitch-to-voltage conversion is that these unintended pitches ARE converted to MIDI notes. It would take major processing power to harmonically analyze the output of the pickups and figure out whether a note might be intentional or unintentional. (NOTE: These unintentional MIDI codes are often referred to as "glitches".) The delay of converting GOOD pitches to voltages is long enough; further processing of the pickups would just delay things to utterly unacceptable lengths. So, MIDI guitars don't bother doing this. The net result is that unless you have AN INCREDIBLY CLEAN PLAYING STYLE (i.e., you can keep the pick and your hand from touching unintended strings, and you don't pick or fret so hard that you cause vibrations), you are going to generate lots of MIDI glitches.
So, how do you adapt your playing style to render a clean MIDI output? Learn to play a keyboard. Seriously. But if you don't want to throw away your guitar technique, then you may have to modify it to suit the "pitch-to-voltage" MIDI guitars. First, in analyzing Mike's glitches, I noticed that the most common cause was string vibrations when he removed his fingers from the frets. Either the open string would vibrate (softly, but indefinitely until he touched the string again), or the string would produce a brief "retriggering" of the note that he had just fretted (i.e., would play the same fretted pitch as a staccato note) when he released it. Most of these glitches can be eliminated by NOT REMOVING YOUR FINGERS FROM THE STRINGS WHEN YOU RELEASE THE NOTES. In other words, you allow the string to rise off the fret, but you keep your fingertips touching the strings. This will mute any string vibrations as you release the strings.
Another source of glitches is hitting unintended strings with your hand or the pick. Clean up your playing style. Develop a "light" picking style. Mike was one of those guys who was taught to give the string a hard "twang" with the pick, and he inevitably would cause the guitar neck to vibrate, or brush the pick against an adjacent string.
One last thing to consider is the delay between picking and when the MIDI note is generated. You have to learn to anticipate each note (i.e., play each a fraction of a second before the beat). The net result is that the MIDI note will be generated ON the beat, where it should be. I remember reading a Robert Fripp interview in which he mentioned adapting his playing style to do this. Fripp is the only guitarist with a clean enough style to play a "pitch-to-voltage" MIDI guitar live without numerous glitches. (Holdsworth played the SynthAxe, which is an entirely different triggering system.)
By adopting the preceding guidelines, you can eliminate most glitches via your playing style.
Now, if you don't have the patience or desire to "learn" to play a MIDI guitar without glitches, and you don't intend to play it live (i.e., you just want to use it for sequencing), then there are certain features that you'll want to look for in sequencer software. These features concern "filtering" MIDI data (i.e., removing only events with certain characteristics). The idea is that Mike would play his part as best he could, with glitches, and then use the sequencer's filtering routines to weed out the glitches from his recorded data. Mike is using a sequencer program that I myself wrote, and in the course of helping him overcome his glitch problems, I've discovered certain things about the MIDI output of pitch-to-voltage guitars, as well as what filtering algorithms work well to remove glitches.
First, in order to eliminate the recorded delays caused by pitch-to-voltage conversion, it's necessary to quantise the data. (See my article on sequencing for an explanation of quantising.) Now, I don't like robotic, computer-perfect performances much. If you have a sequencer with high resolution (i.e., > 240 ppqn), I recommend using quantising routines that correct only the most severe timing deviations, while leaving in some human nuances. Not only did this eliminate the delays, it also got rid of the "strumming" effect that you get from a pick (i.e., it yielded what you'd get if you finger-picked the strings), which is desirable if you're sending the MIDI data to something like a piano/strings/organ/synth patch. Keyboard players don't strum chords. So, look for a sequencer program that has quantise features, but to avoid robotic music, use one that offers flexible quantising.
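One common form of flexible quantising is a "strength" setting that moves each event only partway toward the grid. This is a minimal sketch of that idea (the function and parameter names are my own, not from any particular sequencer), assuming times measured in clock pulses at 240 ppqn:

```python
def quantize(time, grid=60, strength=0.5):
    """Move an event time partway toward the nearest grid line.
    At 240 ppqn, a grid of 60 clocks is a sixteenth note.
    strength=1.0 snaps exactly (robotic); smaller values correct
    the worst deviations while keeping some human feel."""
    nearest = round(time / grid) * grid
    return round(time + (nearest - time) * strength)

# A note played 20 clocks late, half-corrected to only 10 clocks late:
print(quantize(260, grid=60, strength=0.5))  # 250
```

Running the same routine twice with a moderate strength corrects severe deviations heavily while barely touching notes that were already close to the beat.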
Next, I noticed that there were two common things about the MIDI data of a glitch. Sometimes only one of these two things was present; sometimes both. In any event, most glitches exhibited these characteristics while most "good" notes didn't. Some of the glitches were short, staccato notes. Most likely, these were the glitches caused by retriggering of notes as he released strings from the frets. When I looked at the duration of a glitch (i.e., the number of clock pulses between its Note On and its subsequent Note Off), it was usually only a few clocks. These glitches sounded like short bursts of sound, "blips", as Mike described them. (Hey, he's a guitarist, so that's the best that he can do.) Normally, no good note would be held for this short a time. At 240 ppqn, a duration of a few clock pulses means that the note was something on the order of a 128th note. Not even Allan Holdsworth would tackle that. So, Mike applied a sequencer feature that allowed him to filter out all MIDI events that were, let's say, 5 or fewer clock pulses in duration. This eliminated a huge amount of his glitches.
But there still were more glitches. These glitches weren't "blips". They weren't staccato. They sounded for a long time. Mike called these "screeches" (probably inspired by what he would do every time the sequencer program crashed before he could save his data). Most likely, these were glitches caused by open strings ringing as he picked or released a chord. One thing that I noticed is that the velocities of these notes were all rather low. The glitches were quieter than the good notes (but still noticeable). So, Mike applied a sequencer feature that allowed him to filter out all MIDI notes with a velocity of less than, let's say, 31. It just so happens that his Yamaha (his preferred guitar) transmits MIDI notes with a limited set of preset velocities (i.e., it doesn't produce the full range of 0 to 127, but rather velocities of 0, 7, 31, 64, etc.). Mike struck his good notes hard, so their velocities were 120, but sympathetic vibrations caused MIDI notes with much lower velocities, typically the 7 or 31 values. This filtered out virtually all of the remaining glitches.
Look for a sequencer program that allows you to filter notes that are of very low velocities, and also filter notes that are of very short durations.
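The two filters just described can be sketched as a single pass over the recorded notes. This is an illustration of the logic, not any sequencer's actual code; the note representation and thresholds are assumptions, and (as discussed below) suspected glitches are set aside rather than deleted outright:

```python
def filter_glitches(notes, min_duration=6, min_velocity=32):
    """Split recorded notes into keepers and suspected glitches.
    Each note is (start_clock, duration_clocks, pitch, velocity).
    Notes shorter than min_duration clocks ("blips" from retriggered
    strings) or quieter than min_velocity ("screeches" from ringing
    open strings) go to the glitch list instead of being thrown
    away, so they can be reviewed before deletion."""
    keepers, glitches = [], []
    for note in notes:
        _, duration, _, velocity = note
        if duration < min_duration or velocity < min_velocity:
            glitches.append(note)
        else:
            keepers.append(note)
    return keepers, glitches

notes = [(0, 120, 60, 120),   # good note
         (2, 3, 64, 120),     # 3-clock "blip": a retrigger glitch
         (10, 200, 67, 7)]    # quiet "screech": an open string
keepers, glitches = filter_glitches(notes)
print(len(keepers), len(glitches))  # 1 2
```

The thresholds mirror the values in the story above: durations of 5 or fewer clocks and velocities below 31 are treated as suspect.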
He could then easily delete (by editing the data) the remaining few glitches that managed to not have either of the two telltale characteristics.
One drawback we found was that occasionally some of his good notes would also have low velocities. Maybe he didn't pick those strings quite as hard as he should have. When he filtered by velocity, he might rip out some good notes, with the net result that certain chords would sound "empty" (i.e., missing intended notes). So, an important feature is that Mike can SAVE his filtered events to a separate sequencer track to check for good notes before he deletes that data. A lot of programs just throw the filtered stuff away. To make it even easier to "recover" those good notes from the velocity-filtered track, you could create a CakeWalk CAL file (i.e., macro) whereby you enter which chords fall upon which beats (i.e., a fake sheet of your song, if you will) and let the sequencer remove only those notes that truly don't belong in your chord changes. You can then remerge the good notes in the velocity-filtered track with your "good" track.
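The logic of that CAL macro can be sketched in a few lines. This is a hypothetical illustration in Python, not CAL itself: the chord chart maps a beat number to a set of allowed pitch classes (0 = C, 4 = E, 7 = G, and so on), and only the filtered notes whose pitch class belongs to the charted chord are recovered:

```python
def recover_chord_tones(filtered_notes, chord_chart, ppqn=240):
    """From a track of velocity-filtered notes, keep only those
    whose pitch class belongs to the chord charted for their beat.
    chord_chart maps a beat number to a set of pitch classes;
    each note is (start_clock, pitch, velocity)."""
    recovered = []
    for start, pitch, velocity in filtered_notes:
        beat = start // ppqn
        allowed = chord_chart.get(beat, set())
        if pitch % 12 in allowed:
            recovered.append((start, pitch, velocity))
    return recovered

# Beat 0 is charted as a C major chord (C, E, G):
chart = {0: {0, 4, 7}}
notes = [(10, 64, 25),   # E: a good note that was filtered as too quiet
         (15, 61, 25)]   # C#: doesn't belong in C major; a real glitch
print(recover_chord_tones(notes, chart))  # [(10, 64, 25)]
```

The recovered notes would then be remerged with the "good" track.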
So, although the part that Mike played was very ugly, full of glitches, he could invoke these filtering options on his recorded data and, within minutes, end up with a part without glitches.
Of course, there was one last thing. Since the Yamaha had preset velocity values for its MIDI notes, all of his good notes came out with the same velocity of 122 or 127. This sounded very artificial. If I played the same part on a MIDI keyboard, there would no doubt be more variety in the velocity values. So, a randomizing feature for note velocity is useful to add variety.
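A velocity randomizer can be as simple as adding a small random offset to each note and clamping to the legal range. A minimal sketch, with the function name and spread value chosen for illustration:

```python
import random

def humanize_velocities(velocities, spread=8, seed=None):
    """Add a small random offset to each note velocity to break up
    the identical preset values a pitch-to-voltage guitar emits,
    clamping to the legal Note On velocity range of 1..127.
    A fixed seed makes the result repeatable."""
    rng = random.Random(seed)
    return [max(1, min(127, v + rng.randint(-spread, spread)))
            for v in velocities]

# Every note came out at 122; after humanizing, they vary slightly:
print(humanize_velocities([122, 122, 122, 122], spread=8, seed=1))
```

A smarter "intelligent" randomizer might bias accented beats louder, but even this uniform jitter removes the machine-gun sameness.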
When shopping for a sequencer program to be used with a MIDI guitar, these are some features that you might want to look for. Some programs don't filter by duration, some don't filter by velocity, some don't save filtered data so that you can further edit and filter it, and some don't have "human" quantising features or intelligent randomizing functions. Some don't have a scripting language for creating a routine to weed out notes that don't "belong" in the chord changes.
Various patch editors and librarians are also available for PCs. These programs allow the user to edit sounds away from the synthesizer, often in a much friendlier environment than the synthesizer's own interface offers. The more advanced librarians permit groups or banks of sounds to be edited, stored on disk, or moved back and forth from the synthesizer's memory. They also allow for rearranging sounds within banks or groups of banks for customized libraries. These programs are generally small and can be incorporated into some sequencing packages for ease of use. On the other hand, each synthesizer requires a different editor/librarian, since internal data formats are unique to each. Some packages offer editor groups for a specific manufacturer's line, as some of the internal data structure may be similar between the units. However, there is not yet a universal librarian that covers all makes and models of sound modules; it would simply be too large.