
Introductions, part 1

Well, there's just one introduction. It's the most unified section of the piece. It will actually have a standard score with a time signature, bar lines, and synchronized parts. I knew the kind of texture I wanted; it came out of an improvisation with Impulse back before the holidays. The primary gesture is based on the physicality of playing the piano in the low register, thumbs together, alternating hands playing "random" notes within a generally fixed range in a fast, regular pattern. The pitches aren't important except in that they should not overly emphasize any particular pitch. Of course, as we know from the history of serial music, it requires some kind of non-intuitive system to make an even distribution of pitches sound just right. In my improvisation a couple of months ago I felt like I was getting the right texture intuitively, but when it came to deciding which pitches to put down on paper, I felt I needed to go to the computer to generate a texture closer to my improvisation. Besides, while my improvisation felt right, without a recording I can't be objective enough about it (not to mention I can't transcribe what I played). Intuition is a dangerous place to spend too much time :)

I went to Max because it's very flexible. I built a very rudimentary patch that outputs MIDI information directly to Sibelius. Here's a picture:

The toggle in the upper left turns on the patch. The metro object bangs the toggle below it, resulting in alternating 0s and 1s. The 0s go to the "left hand" side of the patch, which generates the left hand notes, and the 1s go to the "right hand" side. The two sides are essentially parallel, each generating numbers between 48 (the MIDI number for C3) and 48+6 (or 54, F-sharp3). The left hand side is then lowered 7 semitones, producing the range of F2 to B2. The result is alternating left hand and right hand notes, each hand covering the range of a tritone, which fits very comfortably under the hand.
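Since the patch itself isn't reproduced here, a minimal sketch of the same logic in Python (the pitch numbers come from the description above; everything else, including which hand takes the 0s, is just an assumption for illustration):

    import random

    def alternating_hands(num_notes):
        # Alternate left and right hand; each hand picks a "random" pitch
        # inside a tritone. Both sides draw from the same 48-54 range;
        # the left-hand side is then lowered 7 semitones (F2 to B2).
        notes = []
        for i in range(num_notes):
            pitch = random.randint(48, 54)   # C3 up to F-sharp3
            if i % 2 == 0:                   # the toggle's 0s: left hand
                pitch -= 7
            notes.append(pitch)
        return notes

    print(alternating_hands(16))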

Once I decided the length of the gestures, I added the objects on the right side of the patch to add a curve to the gesture (more on the length of the gestures below). The center pitch above is MIDI note 48 (C3), or 21 semitones above E-flat1, a tritone above the lowest note on the keyboard (i.e. as low as possible without the left hand running off the keyboard). During my original improvisation I moved gradually to the bottom of the keyboard, and I wanted to recreate that gesture here. I tried descending by semitone every measure for 21 measures, but found the descent too regular for my liking. By connecting the itable object to the transposition factor (see figure above), I could control the rate of descent. I simply drew the curve that I wanted with my mouse (of course, I had to set the parameters of the itable first--in the example above I knew I needed 377 notes, so the x-axis was set to 377). The transposition factor adjusts the center pitch, which is 48 by default, lowering all the pitches along with it. When the curve reaches the bottom of the itable, the transposition is 21 semitones down, putting the gesture at the bottom of the keyboard.
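To keep track of how the transposition works, here is the same sketch with a stand-in for the hand-drawn itable curve (the exponent `bend` is only an assumption for illustration; the real curve was drawn by hand):

    import random

    def transposition(i, total=377, depth=21, bend=2.0):
        # Stand-in for the drawn itable curve: 0 semitones at the first note,
        # a full `depth` semitones down by the last. bend > 1 keeps the
        # descent shallow at first and steeper toward the end.
        x = i / (total - 1)
        return -round(depth * x ** bend)

    def descending_gesture(total=377):
        notes = []
        for i in range(total):
            pitch = random.randint(48, 54)
            if i % 2 == 0:                   # left hand
                pitch -= 7
            notes.append(pitch + transposition(i, total))
        return notes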

The form of the introduction

The introduction is around 2:20 in length, but it gradually dissipates into the main body of the piece, making the ending of this section ambiguous. It represents no more than 10% of the entire work, and probably a little less. I first thought of it as a stand-alone, unrelated section, but now I think of it as crucial to the development of the three component pieces: in the beginning the three are integrated into one gesture, but over the course of the introduction they begin to foreshadow their distinctive behaviors and come apart from one another. If the idea for the entire work is three separate pieces, the introduction tells the story of how they became separate.

The first 30 seconds or so is an extended reproduction of the improvisatory piano gesture I described above. Percussion and the electric guitar join in unison or octaves, dynamically coloring the piano's timbre. This is notated as 377 sixteenth notes. After one sixteenth rest, the same gesture is played again, but shorter this time--233 sixteenth notes. Then another sixteenth rest precedes a third gesture lasting 144 sixteenth notes. There are twelve gestures like this, each getting shorter according to the Fibonacci series down to a one-sixteenth-note gesture. The rests between the gestures (#thevoid) get progressively longer according to the same series.
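For bookkeeping, the two series in sixteenth notes (exactly how the gesture and rest counts line up at the short end is left open here):

    # Gesture lengths run down the Fibonacci series from 377;
    # the rests between them (#thevoid) climb back up the same series.
    gestures = [377, 233, 144, 89, 55, 34, 21, 13, 8, 5, 3, 2, 1]
    rests    = [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]

    for g, r in zip(gestures, rests):
        print(f"play {g:3d} sixteenths, rest {r:3d}")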

These rests between each piano/percussion/guitar gesture are filled in by harmonic series chords in the winds, strings, and sopranos. Conceptually, I just wanted static surface texture to contrast with the active sixteenth-note surface. However, as the piano's active texture is colored by the percussion and electric guitar, the static-texture interruption is also elaborated somewhat. The primary static material is found initially in the bassoon and clarinet (though these may change later in the introduction--it's not finished yet). The first static gesture is only one sixteenth note, so in order to avoid it blending too much into the piano/percussion/guitar texture, I orchestrated the event with some higher-frequency resonance. This resonance is found in the flute and string harmonics, and it is sustained somewhat longer than the single sixteenth note played by the bassoon and clarinet. The singers, too, project this idea of resonance with even longer (approximately two measures) passages of unisons and close-voiced harmonies that slowly change.

As the static gestures get longer they come to dominate the surface of the music. As a practical matter, the resonances must either get shorter (because the time between gestures is getting shorter) or begin to wash over the beginning of the next gesture. I will play with this, probably alternating between abrupt changes with no resonance and resonances that become asynchronous with the rhythm of the static events (think of waves crashing irregularly on a beach). The soprano parts, at first harmonically dynamic resonances, will change to more static material that eventually loses prominence to the strings and winds, which gradually become more active. The sopranos' movement toward stasis will foreshadow the beginning of the sopranos' large-scale gesture, which begins quite statically. The winds' and strings' growing prominence will signal, by the end of the introduction, the beginning of the isorhythmic texture that will dominate those instruments' large-scale gesture. The piano/percussion/guitar part, with its curves in pitch space, foreshadows the tempo curves that will dominate the behavior of those instruments later.

Tempo maps

This is going to get kind of technical, so feel free to skip. I just need to keep track of what I was thinking.

I've decided to begin in the middle. The big wave in the background is a conceptual design for the piece--no specifics right now except that it's related to yin and yang--two opposed but complementary forces/sections of the piece. The rectangle in the middle is the beginning of the second force/section. It will rise gradually out of the first.

Inside the rectangle are six tempo maps that cover about five minutes of the transition. Each map represents a pulse that changes gradually over time according to various curve functions. They progress from fast to slow (bottom to top). The possible ratios are listed in the upper right corner: 3:2 is the base, and they progress through a wave in ~75 and ~50 seconds, respectively.

The various kinds of curves are represented in the upper center portion. Only accelerando curves are shown, but ritardando curves would be the same shape but inverted.

The lower left corner contains partial contents of Max messages for the six tempo maps (more on that below).

This is the basic Max patch I'm using to output the clicks for each tempo map. My current plan is to output around five minutes of (metronome) clicks for each of the six maps, then layer them in Logic. This is mostly to give me a visual sense of the ebb and flow of the various tempo streams. I'm considering the possibility of using Logic to output these streams to headphones worn by six performers in the performance. I'm generally opposed to headphones with click tracks, and if I can simulate the various interactions between tempo streams through another means, I won't use them, but for now it's an option.

This patch receives a list of three floats: the target tempo, how long (in milliseconds) it will take to get there, and the shape of the curve (i.e. the shape of change in tempo).
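The internals of that patch aren't written out here, so this is only a rough model of what one of those three-float messages does; the way the curve value shapes the ramp is an assumption (a simple exponent), not necessarily how the actual patch interprets it:

    def click_times(start_bpm, target_bpm, ramp_ms, curve=1.0):
        # Ramp the pulse from start_bpm to target_bpm over ramp_ms
        # milliseconds and return the click onsets (in ms). curve is an
        # exponent bending the ramp: 1.0 is linear, higher values save
        # most of the tempo change for the end.
        clicks, t = [0.0], 0.0
        while t < ramp_ms:
            x = t / ramp_ms                          # position in the ramp, 0-1
            bpm = start_bpm + (target_bpm - start_bpm) * x ** curve
            t += 60000.0 / bpm                       # ms until the next click
            clicks.append(t)
        return clicks

    # e.g. accelerate from 60 to 90 bpm (a 3:2 ratio) over ~75 seconds
    print(len(click_times(60, 90, 75000, curve=2.0)))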

This patch is a test environment for sending messages to the patch above. Each stream has an "actualcurve" patcher and a level slider. (The patcher impulse_design is where the user can draw the timbre for the click on each stream.) In the lower left corner there is a bit of code that stores a list of all the targets, times, and curves in a coll object and outputs the next one in line after each target tempo is reached.
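A minimal stand-in for that coll logic: a stored list of (target tempo, time, curve) messages that hands out the next one each time the previous target is reached. The message values below are made up for illustration, not the ones in the patch.

    from collections import deque

    class TempoMapQueue:
        # Plays the role of the coll object: keep the messages in order
        # and output the next one when asked.
        def __init__(self, messages):
            self.queue = deque(messages)     # each item: (target_bpm, ms, curve)

        def next_message(self):
            return self.queue.popleft() if self.queue else None

    # Hypothetical contents for one stream
    stream = TempoMapQueue([(90.0, 75000.0, 2.0), (60.0, 50000.0, 0.5)])
    msg = stream.next_message()
    while msg is not None:
        target_bpm, ramp_ms, curve = msg     # would be sent to the ramp patch
        # ...when the ramp reports the target reached, ask for the next one:
        msg = stream.next_message()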

More to flesh out. Again, this is primarily to help me remember what I was thinking when I have to write a paper next spring.

©2017 Joshua Harris