Tuesday, 26 August 2014

Piano technique in the works of J.S. Bach

I've been thinking about this for a while and finally decided it was worth writing about...

The technique required to play Bach on the piano is completely different to the technique required to play, say, Chopin or Rachmaninov. There are some obvious reasons for this: pianos as we know them didn't exist in Bach's time (even during the classical period they were relatively new and very different to modern pianos) and fingerings were very different to what they are today - for example, the thumb was hardly ever used!

I consider this last point to be all-important. It goes a long way towards explaining why even the most technically proficient pianists often still struggle with early repertoire, despite being able to breeze through far more complex 19th and 20th century pieces without any difficulty. It probably also explains my own technical battle with Bach's music.

I tend to think of the development of piano technique as we know it today as starting with Chopin; however, there were still composers writing for the piano in a relatively conservative style during Chopin's lifetime (the first half of the 19th century). Chopin was innovative in many respects - his 24 etudes in particular bear testimony to this - and his works count among the many that are especially suited to playing on a modern piano, with more or less 'modern piano technique'. Such pieces are commonly referred to as highly pianistic. A composer I consider to have attained the pinnacle of 'pianisticness' (no, that's not a word, I just made it up) is Rachmaninov, and any reasonably proficient pianist who has played his music will understand why.

There are ingrained fingerings that modern pianists tend to fall back on when sight-reading or playing a piece they haven't learnt intensively. These are the fingerings that occur most frequently in romantic-era piano music, and which therefore become automatic after someone has been studying the piano for long enough. Even when the pianist has memorised an entire sonata or concerto they are still likely to be falling back on these 'defaults' a lot of the time, without even realising. This is especially true in highly pianistic music, where the expected 'defaults' are rarely broken.

Fingering defaults are perfectly adequate for a lot of piano music written in the 19th century, and for simpler music of other eras. However, they begin to fail when the music doesn't fit into the common spatial patterns the defaults were designed for. A lot of baroque and classical music doesn't fit in; neither does a lot of 20th century music. In fact, there's a relatively narrow range of music in which the pianist can rely on these generic fingerings to get by. Extra work is required to play pieces that fall outside that range convincingly.

Since an enormous amount of work is required to get a piece to performance standard anyway, this shouldn't be a problem. However, I've discovered that for me, at least, there are some issues with memorising pieces in which I can't rely on fingering defaults.

Some really unusual (by today's standards) fingerings are required to play Bach's music on the piano. In my score of his D minor concerto, almost every note is fingered. It's absolutely necessary; I would get totally lost without having meticulously figured out the fingering for every passage. Standard fingerings are useless in Bach's music, and even more so in his highly contrapuntal works (i.e. fugues) than in the relatively simple, mostly 2-part texture of the D minor concerto.

I've always learned new pieces of Bach using the 'metronome increment' technique: perfect a passage at a certain BPM, increment the metronome by a small amount, perfect the passage at the new speed, and so on until I can play it perfectly at performance speed. This amount of repetition, combined with the sheer oddness of the fingerings required, has always enabled me to memorise Bach very quickly. However, until I noticed how quickly I forgot pieces of Bach that I thought I'd memorised to perfection, I didn't realise how much I fall back on fingering defaults when playing from memory.
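The routine itself is simple enough to sketch in a few lines of Python (the tempos and step size below are purely illustrative, not the values I actually use):

```python
def practice_tempos(start_bpm, target_bpm, step=4):
    """Generate the metronome settings for the 'metronome increment'
    routine: perfect the passage at each tempo before moving on."""
    tempos = list(range(start_bpm, target_bpm, step))
    tempos.append(target_bpm)  # always finish at performance speed
    return tempos

print(practice_tempos(60, 80))  # [60, 64, 68, 72, 76, 80]
```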

Fingering defaults act like hints when playing from memory. If you're not quite sure what comes next, well, chances are your fingers already know the pattern and will take care of it for you. That is, if the fingering required to play the passage matches those generic patterns. In Bach this isn't the case; the fingering patterns are unexpected, almost unnatural to a modern pianist. You can no longer depend on generic fingerings - they simply won't work.

Perhaps learning historic fingerings that keyboardists actually used in, say, the baroque era would enable one to develop a new set of fingering defaults that make playing music of that era much easier. This is something I'm very curious about. However, it's important to keep in mind that historical fingerings were not intended for the piano, or at least not the piano as we know it today, and different fingerings are probably necessary to play early music on today's pianos - which have far heavier keys - without developing RSI.

Mac reminiscences

note: this was a post I wrote about 6 months ago and never published.
Ever since deciding to pursue a computer science/IT degree, I've been trying to remember what my first encounters with computers were like. I've been using Mac computers all my life and since it's the Mac's 30th birthday this year, I thought it'd be a good time to try and write about my Mac usage over the years.

My parents had a lot of historic Mac computers. In fact, even the computer that got used on a daily basis probably became historic after a while, considering the amount of time it took before OS X was adopted in our household (I don't know the year exactly, but I can estimate).
One of our historic computers still turned on. It was a Macintosh LC and we referred to it as 'the Dinosaur'. I have no idea what I actually did on it, but I remember the computer itself vividly - especially the purple stripes on the screen when it failed to boot (which was most of the time) and the creaking noises it made.

There's a long list of computer failures in my memory: dead backlights, video cards, hard drives, power cords, you name it. I experienced several iBook deaths and I still remember the events leading to them in detail. One iBook started making a horrible grinding noise; I'm pretty sure the hard drive got replaced shortly afterwards. One refused to boot after I'd been carrying it around all day in 40 degree weather. One particularly bizarre and prolonged death involved behaviour similar to a broken record: everything would freeze, weird graphics glitches would appear and the last few seconds of any playing audio or video would loop endlessly.

I think the iBook that replaced that one was my last, and it was a great little computer. After my mum gave me her old MacBook, I gradually stopped using the iBook, which became more and more lethargic and eventually died peacefully from loneliness.

About a year ago, the trackpad on the MacBook became very slow and unresponsive and eventually got 'stuck', forcing me to turn it off and use a mouse instead. The trackpad functionality intermittently returned, and currently seems stable. Meanwhile, during the last couple of months something inside the laptop has started making a gentle growling noise, leading to the nickname "ol' growly".

That MacBook is 8 years old now, still running Tiger, and it's going the same way as the last iBook. I have a policy of turning it on every now and then to keep it alive, but it's hard to find a reason to do that. Its replacement, running Kubuntu, is the first non-Mac I've ever had. I miss many things about OS X, but I'm enjoying the Linux Experience. Er, I mean using Linux.

Friday, 14 March 2014

In defense of the command line and the humble text editor

I recently embarked on a new university subject - an introduction to Java programming - as part of my CS/IT degree, and cringed slightly when I heard we were advised (though not forced, thankfully) to use Eclipse. I've had some experience with Eclipse already, using it for Android development, and my overall impression was that it has almost as many features as it does bugs/'features'.

I wasn't looking forward to battling with that again, but it occurred to me that a lot of the things I find annoying about Eclipse could probably be customised/turned off in the preferences, so I decided to give it a go. After all, it does have some useful features, and I thought manually compiling code from the command line would be impractically tedious for assignments later on in the course.

In spite of some ridiculous issues which I can't be bothered relating (long story short: the only version of Eclipse I can use crashes if I try to modify the preferences), for the first week of my new subject I did, in fact, use Eclipse. If I'd been able to do so without it crashing, I might have turned off a lot of its autocompletion features and other things that save typing and make for lazy coding. However, it's hard to resist being lazy, and the way Eclipse is designed also makes it genuinely difficult to write classes and main methods yourself. Eclipse wants to do everything for you. That's very nice - and very bad for a newbie like me. I'll explain why in a minute.

Today I finally decided I wasn't going to put up with Eclipse anymore, so I moved all my source code out of the Eclipse workspace into appropriate folders for each week of my uni subject, opened up a terminal and Sublime Text, and tried to write a main() method.

I realised I had no idea how to.

It's not like I haven't seen hundreds of them by now; it's just that I've never had to remember how to write them because I've never needed to write them. Eclipse auto-generates these things.

I spent all day manually compiling and running Java code, and I've learnt more than I did in a week of writing similar programs in Eclipse. I consider it a privilege to be able to learn by trial and error, see exceptions and syntax errors pop up and try to figure out what's going on all by myself without the little lightbulbs and red 'X's that Eclipse gives you. I relish the fact that I can leave a variable unused for as long as I like without being given a wiggly yellow line and a warning that says 'you haven't used this variable yet!' I enjoy being able to make mistakes without being admonished. But most of all, I love having to remember how to do things.*

It turns out that Sublime Text - an excellent programmer's text editor - has many of Eclipse's better features, including autocompletion, so perhaps it's not so humble after all. However, the fancier features are unobtrusive enough that I barely noticed they were there.
The only autocompletion I use is for variables and methods I've already typed out at least once; Sublime Text doesn't seem to offer autocompletion for things you haven't already typed. I discovered an exception to that rule (and there are probably others) - it does offer autocompletion for the main() method, which I only allowed myself to use once I was confident I knew how to write it by myself. There's also an option for auto-closure of brackets and strings, but that was one of the features I hated in Eclipse so I'm leaving it off. For now, closing brackets and quotes are too firmly embedded in my muscle memory.

It's slightly scary to realise that if I'd gone on letting Eclipse practically write code for me and someone had then sat me down in front of Vim and asked me to write a Java program, I probably wouldn't have been able to.** Until I know enough about Java to be able to work on the kind of big projects that Eclipse is essential for, I refuse to succumb to laziness.

*the whole 'not crashing' thing is pretty great, too.
**without googling anything!

Saturday, 22 February 2014

Lazy piano technique

While practising the piano recently, I observed something in my technique which I'd never really paid much attention to before, although I know it's always been there. It struck me as interesting enough to blog about, although it may bore you to death.

For the purposes of this post, I'm going to coin 2 terms to describe different kinds of piano technique: "controlled" technique and "free" technique. The former is very precise and conscious, while the latter is very automatic and probably best described by the word "lazy". This is a massive over-simplification of piano technique, but I think it's a good summary of the two main techniques I find myself utilising. Both have their place, with different passages and even entire pieces sounding better with different techniques.

Controlled technique is what I use whenever I'm learning a new piece or section of a piece, especially in the early stages of metronome practice at very slow speeds - the approach I use to learn particularly challenging passages. Controlled technique essentially means thinking about precision, about exactly where each finger needs to go, leaving nothing up to chance. It also means a small amount of tension in the fingers - ONLY in the fingers, not in the wrist or arm, which would be catastrophic and lead to RSI.

During the early stages of learning a passage, I won't yet have developed muscle memory for it, so playing it requires a certain amount of thinking (not something I normally do much of when playing the piano). Controlled technique would therefore be inevitable even if I weren't making an effort to hit exactly the right notes in order to learn the passage accurately.

Even once I've learnt a passage, I'll initially keep playing it with this 'controlled' technique - usually throughout the memorisation process, if I haven't memorised it automatically by then. This is probably mostly due to the need to maintain absolute accuracy while memorising, so I don't memorise mistakes.

While controlled technique greatly increases accuracy and clarity, it is also very risky. I believe many pianists, myself included, rely largely on muscle memory when performing. For me at least, this is something that happens without thinking - it's automatic, and being conscious of where to put my fingers or what's happening in the music actually gets in the way of letting the 'automation' take over. So once I've developed muscle memory for a piece, controlled technique feels uncomfortable, because even that tiny bit more consciousness that's required to be absolutely precise about where your fingers are going can result in a memory lapse. I can't speak for anyone else, but for me thinking really is my worst enemy when it comes to playing from memory.

Contrast this with what I'm going to call "free" technique (you might as well call it "lazy" technique). This is how I usually play something I know well from memory. It involves absolutely no conscious thought, allowing muscle memory to take over completely, and this is aided by complete relaxation of the hand and fingers, which results in a very different (though not always desirable) tone.

The downside of free technique is that over time, accuracy tends to suffer. When I've just learnt a new passage using controlled technique, I've practised accuracy a lot; but once I start playing using free technique, mistakes creep in and become embedded in muscle memory, accumulating until eventually I need to revert to controlled technique for a bit - or even do some slow metronome practice - to get back to the standard of accuracy I had immediately after first learning the passage.

The advantage of free technique, however, is that it greatly reduces the risk of memory lapses, as long as my muscle memory is sufficiently well established. Under pressure, such as when performing, I'll tend to fall back on free technique to get me through, which results in lower accuracy but a much greater chance of reaching the end without forgetting anything. It's a sort of mindlessness which is very useful to have when you're nervous.

That was a very long preamble to what I've been meaning to write about all along. For over a year I've been learning Rachmaninov's 2nd piano concerto, specifically the 1st movement (the 2nd I've already performed, albeit without an orchestra). A year is a long time to have something memorised, and for the reasons mentioned above my accuracy has been deteriorating slowly over that period, in spite of improvements in my technique as a whole.

The other night I was practising this movement and suddenly became aware of something which I never really took note of before: while struggling with a passage whose accuracy needed improvement, I found myself playing with a very different technique in order to try to get the notes right. My fingers became tenser, I found myself thinking more about which keys I had to hit, and the tone I was getting out of the piano changed completely (in a good way). In other words, I was playing using controlled rather than free technique, and I much preferred the result.

Once I realised this, I tried to play everything using controlled technique, but felt on the verge of a memory lapse constantly because I was THINKING about the notes for once, rather than letting muscle memory carry me along.

Normally in the course of playing something from memory, I'll constantly switch between controlled and free technique, using free technique most of the time but changing to controlled technique for passages I know are particularly technically challenging. I can provide some specific examples of this in the concerto I'm learning: I always use controlled technique in the fast "Un poco più mosso" section, and always use free technique in this subsequent section. It was the accuracy of the 2nd example that was suffering, and from now on I'm going to try to play it with controlled technique (as I did initially, when I first learnt it).

Although it could be very difficult to strike the balance between passages of the concerto I know well enough to play safely with controlled technique and passages where controlled technique is likely to result in a memory lapse, this is what I would like to try to do before performing it in a piano competition later this year.

Saturday, 11 January 2014

Algorithmic atonality

WARNING: I have no idea what I'm doing and errors probably abound.

I recently came across this incredible graphical and musical representation of sorting algorithms.

It got me thinking about the musical possibilities of such algorithms, and I immediately set about experimenting, taking a pseudo-random sample of 8 notes (the 5 notes of the pentatonic scale plus 3 arbitrary ones) to create an 8-tone sequence or tone row (aided by Python's random module).

The sequence is split over 2 bars, rather in the manner that hexadecimal notation splits a byte into two 4-bit groups (nibbles). No note is ever repeated within the 2-bar sequence.

The random sequence is first copied and pasted into another 2 bars, before sorting starts.
Each 'nibble' is then processed separately. In the case of my particular tone row, it takes 3 repetitions to sort a bar.

We start by sorting just the 2nd bar of the sequence, meaning the 1st bar remains the same for 3 iterations. The final note in the 2nd bar is compared with the previous note; if the previous note is of a higher pitch, the notes are swapped. The sequence is then copied and pasted again, and the following algorithm is used (in this context, 'greater than' means 'at a higher pitch than'). For demonstrative purposes, let's label our 4 notes A, B, C, and D (this has no relation to actual note pitches!).

D > C > B > A

If this statement is false, the first note to break the rule is compared to each of the previously processed notes to determine its place in the sequence.

This is repeated for the final iteration, by which time all the notes in the 2nd bar of the sequence are sorted. The process is then repeated with the 1st bar of the sequence, while the 2nd bar remains the same for three iterations.
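The process described above amounts to an insertion sort run separately on each 4-note 'nibble'. Here's a minimal Python sketch of it - the pitch numbers are MIDI-style stand-ins of my own choosing, not the actual row I used:

```python
import random

random.seed(0)  # reproducible shuffle, for demonstration only

# Eight pitches: five black-key pentatonic notes plus three arbitrary ones
# (these particular MIDI numbers are illustrative).
pool = [66, 68, 70, 73, 75, 60, 64, 81]   # Gb4 Ab4 Bb4 Db5 Eb5 C4 E4 A5
row = random.sample(pool, len(pool))      # the pseudo-random tone row

# Split the row into two 4-note 'nibbles', one per bar.
bars = [row[:4], row[4:]]

def insertion_sort_steps(bar):
    """Sort one 4-note bar, yielding a snapshot after each iteration --
    each snapshot corresponds to one repetition of the bar in the score."""
    notes = bar[:]
    for i in range(1, len(notes)):
        current = notes[i]
        j = i - 1
        # Compare with the previously processed notes; any higher pitch
        # is shifted up to make room, placing 'current' where it belongs.
        while j >= 0 and notes[j] > current:
            notes[j + 1] = notes[j]
            j -= 1
        notes[j + 1] = current
        yield notes[:]

# The 2nd bar is sorted first (three iterations), then the 1st bar.
for bar in reversed(bars):
    for snapshot in insertion_sort_steps(bar):
        print(snapshot)
```

Note that a 4-note bar always takes exactly 3 iterations to sort this way, matching what happened with my tone row.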

The LH (left hand) starts off in the same direction as the RH (right hand), but after a few repetitions it retrogrades (is flipped horizontally, in visual terms).

3rds, 5ths and octaves are added to the left and/or right hands at various points. Sometimes 5ths are added below the pure tone row, and sometimes they are added above it. These patterns can also be flipped vertically so that 5ths that were added below the 'melody' are now on top of it, meaning their intervallic relationship changes.

There are rules for these 'filler' notes:

The note must be either a major or minor third, or a perfect fifth, apart from the lowest note in the chord. It must not be the same as any other uppermost or lowermost note in the bar. This means that sometimes (although very rarely), where the overall pattern is using thirds, it becomes necessary to substitute in a 5th to avoid doubling a note that already exists in the 'melody'.
The fact that this occurrence is so rare - only starting near the very end of the sorting process - fascinates me, although I don't know the reason behind it yet.

The result of my experiment - which I never ended up doing anything interesting with - can be heard here. I cannot stress enough that I don't know what I'm doing. This was just for fun.

Monday, 16 December 2013

notes vs. tones, digits vs. numbers

There is a little linguistic problem that I've been puzzling over quite a lot recently. It concerns what can probably be best described, in abstract terms, as differentiation between quantity and value in sets.

I first ran into this with number theory, working with the different numerical bases - hexadecimal, binary, decimal - that are relevant in computer science, as I needed to be able to refer to the length of, say, a bit pattern or hexadecimal number separately from the actual values that appeared in it.

In binary this is pretty easy, as there are only 2 possible values. However, if I said 'a byte has 8 digits', and you didn't have the assumed knowledge about binary, it could be misunderstood to mean a byte has 8 values, which would be incorrect, since a byte can only contain some combination of 2 values: 0 and 1. This confusion arises out of the ambiguity of the term digit: does 'digit' refer to the value of an item in the byte, or to the number of items in the byte regardless of their value? In this case, if you know anything about binary, the meaning is obvious, but there are other similar situations where it may not be.
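A trivial Python sketch of the distinction, using the byte 10101101 (the number 173 - an arbitrary choice) as an example:

```python
byte = format(173, '08b')  # the 8 binary digits of the number 173

print(byte)       # '10101101'
print(len(byte))  # 8 -- the quantity of digits in the byte
print(set(byte))  # only 2 possible values ever occur: '0' and '1'
```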

It seems that most often digit is understood to refer to a quantity of items and number is understood to refer to the value of any item. However, all the dictionaries I've checked seem reluctant to make such a clear distinction: the two words are listed as synonymous, and in reality they are often used interchangeably, making the use of either one subject to misinterpretation.

The problem crops up again in music. For example, when we say '4 notes', are we referring to any 4 instances of an item (for example, four B flats), or are we referring specifically to 4 unique values (for example, a set of 10 items in which only four values, say A, F, D and B flat, occur)?
Officially there are separate words to describe quantity and value in music: note is to tone* what digit is to number - the former describes a quantity, the latter a unique value. But again, the two terms get used interchangeably, making it difficult to ensure any description of musical patterns is absolutely unambiguous. What am I missing here?!
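To make the ambiguity concrete, here's the earlier example in Python (the pitches are of course arbitrary): a set of 10 items in which only 4 values occur.

```python
# Ten notes (items), drawn from only four distinct tones (values).
notes = ['A', 'F', 'D', 'Bb', 'A', 'F', 'A', 'D', 'Bb', 'F']

print(len(notes))       # 10 -- quantity: how many notes sound
print(len(set(notes)))  # 4  -- value: how many unique tones occur
```

Both answers are a perfectly correct response to 'how many notes are there?', which is exactly the problem.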

In writing this blog post, I made an unsettling discovery: I kept trying to use general words that describe quantity or value, only to discover they could potentially be misinterpreted. Range is one such word. I was initially going to use this term instead of quantity, until I realised it could be misinterpreted in much the same way as digit or number. Are we referring to the number of items in the set, or the number of values occurring in the set? Range is perfectly correct in either context.

Also note how it's almost impossible to discuss these issues without using that pesky term number. In this post I've replaced as many occurrences of the word as possible with quantity, but I've left the ones in the previous paragraph untouched to demonstrate how much we rely on potentially ambiguous language.

One final thought, venturing into even more abstract territory: consider that we say 'number of digits'. This requires some recursive thinking: the digits are a set in which various values, known as numbers, can be stored. Another set, called a number, contains the digits which contain the numbers...see the problem with this terminology?

*gotta love that tone is an anagram of note eh

Friday, 29 November 2013

The CHORD of resignation

Before you read any further, please be aware that none of this post will make sense unless you've read this first.

The minor 7th chord consists of a stack of alternating major and minor thirds: min 3rd, maj 3rd, min 3rd, maj 3rd. It's always been my favorite chord, and at the time of writing the post linked above I was even vaguely aware that there was some connection between this chord and the chord progression I was analysing. However, I was so new to harmonic analysis at that time that I was happy to leave the analysis at a series of chords.
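Counting in semitones (minor third = 3, major third = 4), the stack is easy to compute. A quick Python sketch, with MIDI-style pitch numbers purely for illustration:

```python
def stack_thirds(root, intervals=(3, 4, 3)):
    """Stack intervals (in semitones) on a root pitch. The default
    min 3rd, maj 3rd, min 3rd stack produces a minor 7th chord."""
    chord = [root]
    for step in intervals:
        chord.append(chord[-1] + step)
    return chord

# E flat minor 7th built on Eb4 (MIDI 63): Eb, Gb, Bb, Db
print(stack_thirds(63))  # [63, 66, 70, 73]
```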

It's only very recently that I've been able to piece together and understand some of the connections I've always sensed existed between a handful of musical patterns and elements. This post is an attempt to explain these connections, with the aid of some examples. I still have much to learn about this topic, and I'm sure there will be many more blog posts to write as I make new discoveries.

Let's say we're in the key of B flat minor. In this key, the notes which correspond to the degrees of the scale that form the Chord Progression of Resignation are B flat, D flat, E flat and G flat, from the bottom up:

Let's now invert this chord to the 2nd inversion, so it starts on E flat (note that the dominant of E flat is B flat...) The resulting chord is E flat minor 7th:

The notes used in the examples above are 4 of the 5 notes of the pentatonic scale. If we were still in B flat minor (which we're not, since E flat is our new tonic), the missing note (an A flat) would form the 7th degree of the scale - a fairly common addition to the 'pure' chord progression of resignation.
The pentatonic scale is the mode you get from playing only the black keys of the piano (although it can be transposed into any key). It's an interesting scale in that it's very pure sounding - you can combine any of its 5 notes, or play them all at once, and as long as you stay within that mode nothing will ever sound jarringly dissonant.
It's also quite tonally ambiguous - shifting from major to minor and between keys is effortless. I've yet to figure out what gives the pentatonic scale this ambiguous quality, as - depending on which note you begin it on - the degrees of the scale that are 'missing' vary.
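The subset relationship is easy to check with pitch-class arithmetic (taking C = 0, so Gb = 6, Ab = 8, Bb = 10, Db = 1, Eb = 3):

```python
# Pitch classes of the black-key pentatonic scale and of the
# Bb-Db-Eb-Gb chord from the B flat minor example above.
pentatonic = {6, 8, 10, 1, 3}   # Gb, Ab, Bb, Db, Eb
chord = {10, 1, 3, 6}           # Bb, Db, Eb, Gb

print(chord <= pentatonic)  # True: the chord uses 4 of the scale's 5 notes
print(pentatonic - chord)   # {8}: the missing note, Ab
```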

The pentatonic scale is prevalent in the traditional music of many vastly different and geographically separated cultures. This can hardly be attributed to coincidence, and the discovery of its relationship to the 'chord of resignation', and by association the chord progression of resignation, has only reinforced my impression that there's something fundamentally significant about the pentatonic scale.

Below are some examples of the 'chord of resignation'. As with the chord prog list, I'll add to this over time, so check back! (Quite a few examples I could easily include here would double with ones already in the chord prog list, so I'm leaving some - though not all - of them out.)

To make collecting examples easier, I'm attempting to group them by harmonic structure a little.
The following examples simply use the minor 7th chord in its purest form:

1. Leo Ornstein - Piano Sonata No. 4, 2nd mvt (see 0:07, and probably most prominently 0:14) Aside from the timecodes noted, the opening of this appears to make use of the minor 7th constantly in other ways too complex for me to try to analyse yet. This piece already appears on the chord prog examples list, but I had to repeat it here because it has such a wealth of interesting harmony.
2. Ravel - Le Gibet (see 13:42 and several times again until the end)
3. Gershwin - Summertime (see 9:11)
4. Leo Ornstein - Piano Sonata No. 4, 4th mvt The opening of this is practically built out of minor 7th chords (or stacks of 3rds, in any case).
5. Steve Reich - Six Marimbas (see 6:15 onwards) The uppermost note of this chord shifts constantly between the 7th and 8th degrees of the scale, while the underlying 3rds remain 'fixed'. Interestingly, the tonic doesn't appear in the bass until 6:32, and it disappears again at 13:42. The inversion of the minor 7th created by this is the one shown in the first image above - the degrees of the chord prog of resignation stacked on top of each other in order: I, III, IV, VI.
6. Ravel - Une barque sur l'océan (see 4:47)
7. Ravel - Noël des Jouets (see 2:26)
8. Ravel - Ondine (see 5:20) This example is really in a class of its own, as the chord is not only broken into a myriad of semiquavers and tuplets with a scattering of arbitrary notes in between, but it also isn't even a minor 7th. Nevertheless, a Chord of Resignation it undoubtedly is. A far more conventional example can be found at 5:49.

A very common way in which the minor 7th chord manifests - especially in minimalism - is where a particular interval or combination of intervals is maintained over the top of a changing bass chord progression, resulting in the 'Chord of Resignation' seeming to grow naturally out of a pure tonic triad. The following examples demonstrate this.

9. Stellardrone - In Time This is a fairly simple example - harmonically and texturally - so well suited for explanatory purposes. The bass is progressing as follows: I, [V], VI, VII. To start with the...er, constellation of notes being repeated over the top is simply part of the tonic triad, but as it remains the same while the bass changes, it forms a minor 7th over the VI chord.
10. Porcupine Tree - Trains As with the previous example, the upper notes in the harmony here remain constant over a changing bassline, resulting in a minor 7th forming over the VI chord. However, there are other relevant complexities to the harmony which you can read about in the chord prog examples list.
11. Steve Reich - Electric Counterpoint III This is an interesting example because the underlying chord progression consists only of the degrees of the scale that form the minor 7th - IV, VI, Im. As a result, in this instance it's impossible to say that a minor 7th is only formed over, say, the VI chord, as was the case with the previous 2 examples. The harmony just morphs organically, an effect intensified by the gradual introduction of each degree of the scale in the bass at the start of the piece (you have to listen to the whole thing to get what I'm talking about). PS in case you're curious what happens when it briefly modulates, the progression is III, IV, V, but effectively that III chord becomes the new tonic.
12. Steve Reich - Music for large ensemble Initially, the uppermost note of the chord is simply the top note of the tonic triad (i.e. the dominant), but each time the bass plunges down a third it becomes a minor 7th.

Yes, there is a lot of Ravel on this list. :P