Tagged in: Tiction

Generative Music Part III – Elysium

Continuing my look at generative music tools, here is Elysium, another freeware program that, like Tiction and Nodal, generates MIDI data and sends it via whatever MIDI device you have to an external synth, or via Apple’s IAC bus to a software synthesizer or your DAW (Logic Studio 9 in my case).

This is relatively new software (all three of these programs are still toddlers, really), and thus has both compelling possibilities capable of rich reward and (like my own toddler) bouts of misbehavior and instances where it doesn’t do what you think it will do. Elysium (screenshot above) was visually inspired (according to Matt Mower, the software’s principal author) by Mark Burton’s multi-touch instrument (made in 2007) called the ReacTogon, which in turn has much in common conceptually with the ReacTable (first presented in 2005). Videos of both are embedded below.

While Tiction and Nodal both offer the option of sending individual nodes or groups of them to a particular MIDI channel, Elysium does this via Layers. In the screenshot above, you can see three layers. Here is an MP3 of an excerpt I created sending Elysium through Logic.

Before I get to the specific pros and cons of Elysium, I want to talk in general terms about a couple of frustrations that have arisen, now that I’ve been playing around with this kind of software for the better part of a month. I offer these observations as grateful feedback to the software developers, and recognize that these are generous folks, really, and smarter than I.

My first gripe is actually somewhat specific, and may have as much to do with Apple’s Logic as with anything else. From everything I can tell by scouring the user boards, Logic 7 used to play much better with programs of this nature in terms of syncing. The software program Noatikl, which I’ll talk about in a later post, has addressed its own problems with Logic using a complicated workaround, but Tiction and Elysium both (oops, see comments) use Java (with which Apple has a complicated relationship), and no matter what I try, I can’t get the programs to sync together.

If my musical needs could be met exclusively within the standalone program and its sounding channel (i.e., every musical element provided by wiring Elysium into Logic, recording the MIDI there, and we’re done), there would be no problem. But if I want to add elements from outside the generative program that require precise timing, like an Apple Loop (or a Dr. Rex loop in Reason 4, which I also tried), then I’m in trouble. For the short MP3 excerpt above, I recorded the MIDI into Logic and then lined up the music’s beat 1 with an actual Logic bar’s beat 1. The music was set for 300 ticks per minute in Elysium, which should be 150 BPM, but because of latency and clock drift it wound up at 148.5 BPM once recorded and realigned in Logic. I added some drum loops as a fourth layer, and the alignment was okay. But over a longer stretch of music, that clock drift becomes unworkable, an experience I had recently when trying to record approximately 7 minutes of material with Tiction and Logic. I haven’t tried it with Pro Tools (and can’t now, until Digidesign gets their Snow Leopard act together – I understand they have a Beta version that’s 10.6.1 compatible, but I’ll wait), and my sense from the user forums is that Ableton Live has fewer MIDI sync problems, but I don’t have Live.
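The back-of-the-envelope arithmetic behind that drift is worth spelling out. This little sketch uses the numbers from my session above and assumes two Elysium ticks per beat (300 ticks/min = 150 BPM); the function name is mine, not anything from either program:

```python
# Clock-drift estimate from the session described above.
# Assumes two Elysium ticks per beat: 300 ticks/min -> 150 BPM.
TICKS_PER_MIN = 300
TICKS_PER_BEAT = 2

nominal_bpm = TICKS_PER_MIN / TICKS_PER_BEAT  # 150.0, what Elysium intends
measured_bpm = 148.5                          # what Logic showed after recording

def beats_of_drift(minutes: float) -> float:
    """How many beats apart the two clocks end up after `minutes` of music."""
    return (nominal_bpm - measured_bpm) * minutes

print(beats_of_drift(1))  # 1.5 beats off after one minute
print(beats_of_drift(7))  # 10.5 beats off after seven minutes
```

A beat and a half of misalignment per minute is tolerable for a short excerpt you nudge into place by hand; ten beats over seven minutes is why the longer Tiction session fell apart.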

The other general thing about Tiction, Nodal, and Elysium is their Perpetuum Mobile quality, which can get tiring. It can be worked around with varying degrees of success in each program, but it also points to the utility of good syncing for post-recording editing. With Tiction you can simply stop or start any group of nodes at any time without affecting the playback of the other groups. With Nodal and Elysium the workarounds need to be more elaborate: in both, you can set up timing schemes that affect the timing and probability of a nodal trigger. Setting up probability in Elysium is quite simple, actually; it’s just one of the dials on offer in the edit menu of the appropriate node type.
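That probability dial amounts to a weighted coin flip each time a trigger arrives. Here is a minimal sketch of the idea (the function name and numbers are mine, not Elysium’s internals):

```python
import random

def maybe_trigger(probability: float) -> bool:
    """Fire a node's note with the given probability (0.0 to 1.0)."""
    return random.random() < probability

# A node dialed to 0.25 fires on roughly a quarter of its triggers,
# which thins out a texture without stopping its underlying pulse.
fires = sum(maybe_trigger(0.25) for _ in range(10_000))
print(fires)  # somewhere in the neighborhood of 2,500
```

The musical effect is that the clockwork keeps running but individual notes drop out unpredictably, which is one of the few ways to dent the Perpetuum Mobile from inside the program.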

Elysium could benefit from adopting a little of Tiction’s simplicity in one case here, though. Once you’ve established a pitch network that you like, saved it, and reopened it (or even stopped it once), restarting the piece triggers all the layers at once, offering no way to recreate the fun of building the piece’s density over time. In fact, I edited 2 of the 3 layers that Elysium created after the fact. In the case of Layer 3, which was sparse to begin with, I changed it so much it was hardly like the original at all. And 3 layers was all I dared create: at 300 ticks per minute, the CPU load on Logic was in the red much of the time, and the timing between the layers regularly became unstable.

The pitch scheme of Elysium follows a pattern called a Harmonic Table, in which every three adjacent pitches form a triad. There’s nothing particularly restricting about this, though it does mean that 3rds, 6ths, and perfect 4ths and 5ths are the only available adjacent intervals, and in this program proximity has rhythmic implications that cannot be worked around the way they can in Nodal. One possibility I haven’t explored much yet is the option to play a triad instead of a single pitch; you can choose which combination of proximate neighbors will sound the triad.
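The Harmonic Table’s triad-everywhere property falls out of simple arithmetic, which can be sketched like this (my own coordinate scheme for illustration, not Elysium’s internals): one hex axis steps by a major third (+4 semitones), the other by a minor third (+3), so moving along both makes a perfect fifth (+7), and any three mutually adjacent cells form a triad.

```python
BASE = 60  # middle C as an assumed origin cell

def note(maj3_steps: int, min3_steps: int) -> int:
    """MIDI note at a Harmonic Table cell: +4 per step on one axis, +3 on the other."""
    return BASE + 4 * maj3_steps + 3 * min3_steps

# Two ways of picking three mutually adjacent cells:
major_triad = [note(0, 0), note(1, 0), note(1, 1)]  # 60, 64, 67 -> C E G
minor_triad = [note(0, 0), note(0, 1), note(1, 1)]  # 60, 63, 67 -> C Eb G
print(major_triad, minor_triad)
```

Upward-pointing triangles of cells give major triads and downward-pointing ones give minor, which is why the layout feels so instantly tonal.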

There are several interesting features unique to Elysium. One is the Ghost Tone, here meaning an adjustable number of rhythmic repeats of the triggered pitch, from 1 to 16. A truly intriguing feature is the possibility of applying LFOs to many parameters: ghost tones, tempo, transposition, velocity, and more. Alas, I could not get the LFOs to work, and the documentation says nothing about them. The probability feature works very well, though.
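Since I couldn’t get the feature itself running, here is only the standard LFO formula the UI seems to promise, as a hedged sketch rather than a description of Elysium’s actual behavior (names and numbers are mine):

```python
import math

def lfo(base: float, depth: float, freq_hz: float, t: float) -> float:
    """Sine LFO: sweep `base` by +/- `depth` at `freq_hz`, with t in seconds."""
    return base + depth * math.sin(2 * math.pi * freq_hz * t)

# A MIDI velocity of 90 swept +/- 20 at 0.5 Hz (one full cycle every 2 s):
for t in (0.0, 0.5, 1.0, 1.5):
    print(round(lfo(90, 20, 0.5, t)))  # 90, 110, 90, 70
```

Applied to velocity this gives a slow swell and ebb; applied to tempo or transposition the same curve would make the whole texture breathe, which is presumably the intent of the feature.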

I’m guilty of misapplying a term: what is a node in Tiction and Nodal is a Player in Elysium. And while the Perpetuum Mobile aspect is a strong part of Elysium’s character, the variety of players makes it possible, to an ear willing to tune in to these kinds of shifting landscapes and undulations, to inject a fair amount of surprise and change, however much one has to swim upstream to make that happen.

Once the program is more stable, if I could put on a wish-list a small vanity item, it would be to make the program visually more compelling. How? By, for example, allowing the colors of the table to be customizable; allowing a certain amount of transparency/translucency to exist so that layers can be stacked and all the activity viewable from the top layer (like looking into a pool of dark water from above or something). I’m imagining a dark, translucent field, with 3D lights flashing in patterns in depth layers. But far more important than the cover of the book is its contents. I’ll be looking forward to seeing how this app develops over time.

Generative Music – Part I – Tiction

One of the reasons you’ve been seeing posts from me lately about the graphic arts programs Inkscape and Processing is that I’m in the planning stages of a multi-movement, electroacoustic, multimedia work that I will write for a flute quartet based in Rīga (and possibly a second group in Göteborg). As my inspirational starting point I chose the subject of Emergence, the study of how complexity arises in various kinds of systems.

I’ve gotten hold of various books on subtopics of the subject, such as Steven Johnson’s Mind Wide Open: Your Brain and the Neuroscience of Everyday Life, Scientific American’s collection of articles Understanding Artificial Intelligence, and James Surowiecki’s The Wisdom of Crowds, which I first heard about while listening to the podcast of one of my favorite radio programs, WNYC’s Radiolab.

One of the movements I’m planning will involve projection of an animated, graphic ‘score’ that will be realized/performed by the audience in real-time, accompanied by electronics and the flute quartet. I’ve put myself on the learning curves of both Inkscape and Processing in order to prepare those scores. I’ll talk about my plans for that in another post.

Along the lines of artificial intelligence, I thought I’d survey what’s currently happening with computer-assisted (or computer-generated) composition, whether algorithmic or not. I’d break the current activity down into two categories, each with subcategories: tools that require knowing or learning code (such as LISP; see, for example, Peter Siebel’s Practical Common Lisp, also available at Amazon) and those that are driven principally through a GUI (Graphical User Interface). The subcategories for each are FLOSS or FOSS (Free/Libre Open Source Software) vs. commercial.

I want to talk about my experiences, early impressions, difficulties, or whatever else comes up:
1.) because it will help me process my own thoughts;
2.) if I overcome some technical hurdle (and boy, do they have a way of persistently appearing), I might as well share my solution and save the next poor soul some time; and
3.) to the extent that it’s offered, receive the wisdom and/or expertise of anyone who comes upon what I’m writing and wants to share.

So that brings me to Tiction, a quite beautiful freeware “nodal music sequencer” created by Hans Kuder with Processing. I downloaded the program and followed the brief instructions at the website. Tiction doesn’t generate sound on its own, so it needs to be connected to an external MIDI keyboard or an internal software synth.

There are basically three menus in Tiction:
1.) The Help menu, which is basically a list of keyboard shortcuts for setting up the nodal network, N to create a node, C to connect it to the next one, etc. It’s very straightforward.
2.) The Options menu, which allows you to choose 16 specific pitches by their corresponding MIDI note numbers (with a default setting of a C major scale/diatonic collection), the MIDI In/Out connections, sync parameters, the ‘bar brightness’, and ‘do physical actions on trigger’.
3.) The Edit menu (reached by selecting a node and typing E), which allows you to set specific parameters for the highlighted node, including MIDI channel, physical actions (such as jiggle, attract, repel), and velocity, among other things.

I first connected it to my external MIDI keyboard via my usual Core Audio MIDI Setup in Mac OS X, selecting it from the Options menu. I created several nodes, connected them, and fired it up. Right away, Tiction made some interesting music, with compelling visuals to go with it. By default, the network of nodes you’ve created drifts around the screen, and where the network sits along the X/Y axes affects both the register that is sounded and the velocity. That means the default mode is really rather musical. Set certain nodes to attract or repel, and the activity on the screen and the music generated become more agitated. Change the pitch collection and its potential broadens again.
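A position-to-music mapping of that general kind is easy to sketch. This is my guess at the shape of the idea, not Kuder’s actual code: vertical position picks the register, horizontal position the velocity.

```python
def map_position(x: float, y: float,
                 low_note: int = 36, high_note: int = 96) -> tuple[int, int]:
    """x, y normalized to 0.0-1.0 screen coordinates; returns (midi_note, velocity).

    Higher on screen (smaller y) -> higher register; farther right -> louder.
    """
    note = low_note + round((1.0 - y) * (high_note - low_note))
    velocity = 1 + round(x * 126)  # MIDI velocity 1-127
    return note, velocity

print(map_position(0.5, 0.5))  # mid-screen: middle register, middle velocity
print(map_position(0.9, 0.1))  # upper right: high and loud
```

Because the whole network drifts continuously, even a fixed loop of nodes keeps shifting register and dynamics, which is much of why the default behavior already sounds musical.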

I was so excited, I began thinking it would be great to look into screencasting software so I could make a video of Tiction doing its thing and project it for the audience. I would record MIDI into, say, three or four MIDI channels in Logic; add, edit, or modify material as I saw fit; and voilà! One movement done! Since there will be a choreographer and some dancers as part of the project, I thought this would make a perfect accompaniment.

I then wanted to try running Tiction through Apple’s Logic, and here I hit several hurdles, some solvable and some that I haven’t been able to solve yet. First, running Tiction into Logic requires using the IAC (Inter-Application Communication) bus that comes by default with Audio MIDI Setup in OS X. At first it didn’t work. I tried it with MidiPipe. Still no. Since Tiction was made with Processing, Processing compiles to Java, and Apple’s Java support is evidently lacking, I thought the problem might reside in the Java Extensions folder. Looking through the (not particularly current) message board at the Tiction website, I decided to buy Mandolane MIDI SPI, thinking it was a long shot, but since it was cheap, well, okay. And it was. A long shot, that is. Still no, but on the right track. It turns out the only extension necessary is mmj (since OS X 10.4.8, Apple no longer supports some Java MIDI packages). Download mmj and copy both mmj.jar and libmmj.jnilib into /Library/Java/Extensions.

Finally! I get Logic and Tiction talking to each other. But another head (or 3) grew on the hydra:

1.) I can’t set the nodes to play on different MIDI channels. Whenever I hit “E” and edit the MIDI channel number, no matter what number I enter, it always resets itself to channel 1 as soon as I hit “E” again to exit the editor.
2.) I’m having the same “note off” issues that others reported in earlier versions of the software.
3.) I can record MIDI data into Logic from Tiction, but I can’t get their metronomes to sync up. If I select anything other than “Use Internal Clock” in Tiction, it refuses to play for me.

So it’s not yet necessarily at the deal-breaker stage for me. Though it would be some work, I could still realign the MIDI data to proper bars and beats to deal with the sync issue (I don’t know whether there’s clock drift over time that might make that more complicated than I think). I could re-orchestrate the MIDI data to whatever channels I want after the fact, though that would be time-consuming and probably less organic than doing it directly from the original. I suppose I could treat the MIDI ‘note off’ problem as a feature rather than a problem, especially if I choose to involve the flute quartet in some interesting, crunchy way against the held tones. (I could also manually shorten other groups of notes that didn’t turn off.)

I posted this issue on the Tiction website. If I get an answer that solves it, I’ll report back. Otherwise, anybody out there already run into and solve this problem?
September 21, update: Problem #1 is solved, with help from Hans Kuder. When changing the MIDI channel in the individual node’s Edit menu, you must use the ENTER key for the change to take effect. The other half of the issue, on the Logic side, is that it is necessary to go to File>Project Settings>Recording and check “Auto Demix by Channel if Multitrack Recording.”
Note Off and Sync issues remain.
February 15, 2010, update: I’ve been meaning to mention that Tiction’s website was down for a while, but it’s back up now. However, Tiction broke under Snow Leopard. Hans Kuder is aware of the problem and is looking into it; check his website periodically for an update. I noticed that another, simpler program called MIDI Game of Life, which is also Java-based, broke under Snow Leopard too.