Yesterday, for the second day of the 4-in-4, I made a semi-randomly generated 8-bit song.
Way back during orientation week, a few of us were talking about music, as you do when you’re just getting to know a new group of people. Specifically, Marko Manriquez and I shared our enthusiasm for Aphex Twin. We talked about the incredible variation and detail that shows up in the drum programming in pieces such as Girl/Boy Song and wondered whether such intricately constructed music could possibly have just been made by hand or whether some kind of algorithm helped out.
I speculated that you could accomplish something like that style of non-repeating linear invention by using probability. You would declare a set of allowed pitches and metric values for each instrument and then let the computer choose randomly among them, over and over, to compose the piece. That would let you shape the aesthetics of the output without having to go in and make all the tiny micro-decisions required to through-compose something with as much mind-boggling detail as the drums in Girl/Boy Song. (For the record, I don’t believe that this is actually how Aphex Twin works; I think he actually writes all of that stuff by hand.)
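The core idea can be sketched in a few lines of Ruby. This is an illustration of the weighted-random approach, not the actual Whoops API; all the names here are hypothetical, and repeating an entry in an array is a cheap way to raise its probability of being chosen.

```ruby
# Hypothetical sketch of composing by probability: declare the allowed
# pitches and durations, then sample them over and over to build a score.

# Duplicated entries are proportionally more likely to be picked.
PITCHES   = %w[C C E G A]    # C is twice as likely as any other pitch
DURATIONS = %w[4 4 4 8 16]   # mostly quarter notes, some 8ths and 16ths

# One note event in the "4C" (duration then pitch) style described below.
def random_event
  DURATIONS.sample + PITCHES.sample
end

# A whole line of score: n random events joined with spaces.
def generate_score(events = 16)
  Array.new(events) { random_event }.join(" ")
end
```

Every call to `generate_score` yields a different string of note events, which is exactly what makes each run of the script a new piece.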
Having had this idea, I sat down during the Tisch Convocation and wrote Whoops, a Ruby library that uses probability to generate scores for bloopsaphone, _why the lucky stiff’s 8-bit music generator. Bloopsaphone uses a very simple text-based score system where, for example, “4C” would mean “play a quarter note on C”, etc., which made it very easy to implement this idea in an environment where I could get instant feedback in the form of listenable music.
This was all back in late August of last year. I haven’t touched Whoops since.
So, yesterday, for 4-in-4 I decided to actually use Whoops to create a piece of music. I started by defining a bunch of bloopsaphone sounds: hi-hat, snare, bass drum, lead melody, and bass. Next, I started using Whoops to define what I wanted the drums to do.
If you look at lines 53-57 of that Ruby script, you can see the Whoops commands that generated the drums. I always have them play C, since they’re percussion instruments and their pitch doesn’t matter. For the bass drum and hi-hat, I mostly want quarter notes (this is Aphex-inspired IDM, after all), so I give “4” as the most common value in the duration array. I want the snare to feel like it’s largely on the 2 and 4, so I mostly give it half notes in its duration array. And then I added one more sequence for the hi-hat, “hat_detail”, that plays spastically on small duration increments (16, 32, and even 18 and 9 for 16th- and 8th-note triplets). I gave that sequence mostly rests (the empty string) as its pitches so that it would only play occasionally; I wanted it to be decorative, not to take over completely.
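The weighting scheme for the drums might look something like this. Again, these are illustrative names and data, not the real Whoops calls: each instrument gets its own weighted duration array, and the hat_detail line gets mostly empty-string “pitches” so it rests most of the time.

```ruby
# Hypothetical sketch of the per-instrument weighting described above.
# An empty-string pitch stands in for a rest, as in the real script.
def sequence(pitches, durations, events = 16)
  Array.new(events) { durations.sample + pitches.sample }
end

bass_drum  = sequence(["C"], %w[4 4 4 8])                 # mostly quarter notes
snare      = sequence(["C"], %w[2 2 2 4])                 # mostly halves, feels like 2 and 4
hat_detail = sequence(["", "", "", "C"], %w[16 32 18 9])  # mostly rests; spastic when it fires
```

With three rests for every sounding C, the hat_detail line stays decorative rather than dominant, and the odd durations (18 and 9) are what give it its triplet feel.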
Once I had the drums starting to sound how I wanted, I figured out a chord progression for the melody and bass to follow and wrote down sets of notes that they should be playing for each chord. Then, I followed the bloopsaphone API to play the resulting music and also made sure that my script would spit out the actual notes generated for each instrument. That way, each time I ran the script, I’d get a different musical result and if I liked one, I could copy and paste the score for it so I could reproduce it and even modify and improve it if I wanted to.
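The generate-then-freeze workflow above can be sketched like this. The function name and structure are my invention for illustration, not the actual script, but the pattern is the one described: print the generated notes on every run, and paste a favorite take back in verbatim to reproduce and edit it.

```ruby
# Hypothetical helper: either generate a fresh random score, or replay
# a fixed one that was copy-pasted from an earlier run's output.
def score_for(pitches, durations, events, fixed = nil)
  fixed || Array.new(events) { durations.sample + pitches.sample }.join(" ")
end

# First runs: generate and print, so good takes can be captured.
take = score_for(%w[C E G], %w[4 8], 8)
puts take

# Later: paste a take you liked back in to reproduce (and hand-edit) it.
frozen = score_for(nil, nil, nil, "4C 8E 4G 8C")
```

Because the fixed string short-circuits the random generation, a frozen take plays back identically every time, which is what makes hand-editing the melodies possible later.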
After lots of runs, I had a few versions of things that I liked. The melody was the weakest part. Some runs would have bits of compelling melody in the patterns that happened to come out, but runs rarely avoided bits of weird dissonance or plain melodic incoherence. So I went in and edited the melodies I liked best, tweaking them into a more compelling shape with classic melodic moves such as repeating patterns that were already there or adding sequence and series. The results sounded like this, for example: whoops_demo_2.mp3.
Here’s the score for that fragment:
Once I had a couple of bits that I liked, I exported the instruments one at a time to AIFFs using Soundflower and GarageBand and then brought the resulting files into Logic to mix. I was surprised at how easy and fun it was to mix these 8-bit sounds. I wasn’t sure how well they’d take reverb, compression, and the other normal tools of music mixing, but I ended up pretty happy with the sounds I got.
I didn’t have time to put together a full-scale composition, but I did finish a sketch for a song. I’m calling it “And He Built A Crooked House”. Listen to it here: And He Built A Crooked House.