Continuing my look at generative music tools, here is Elysium, another freeware program that, like Tiction and Nodal, sends the MIDI data it generates either to an external synth via whatever MIDI device you have, or via Apple’s IAC bus to a software synthesizer or to your DAW (Logic Studio 9 in my case).
This is relatively new software (all three of these programs are still toddlers, really), and thus has both compelling possibilities capable of rich reward and (like my own toddler) bouts of misbehavior and instances where it doesn’t do what you think it will do. Elysium (screenshot above) was visually inspired (according to Matt Mower, the software’s principal author) by Mark Burton’s multi-touch instrument (made in 2007) called the ReacTogon, which in turn has much in common conceptually with the ReacTable (first presented in 2005). Videos of both are embedded below.
While Tiction and Nodal both offer the option of sending individual nodes or groups of them to a particular MIDI channel, Elysium does this via Layers. In the screenshot above, you can see three layers. Here is an MP3 of an excerpt I created sending Elysium through Logic.
My first gripe is maybe a little specific, actually, and may have as much to do with Apple’s Logic as with anything else. From everything I can tell by scouring the user boards, Logic 7 used to play a lot better with programs of this nature, in terms of syncing. The software program Noatikl, which I’ll talk about in a later post, has addressed its own problems with Logic using a complicated workaround, but Tiction, Nodal and Elysium offer no such fix.
The other general thing about Tiction, Nodal and Elysium is the sort of Perpetuum Mobile quality they share, which can get tiring. It can be worked around with varying degrees of success in each program, but it also points to the usefulness of good syncing for post-recording editing. With Tiction you can simply stop or start any group of nodes at any time without affecting the playback of the other groups. With Nodal and Elysium the workarounds need to be more elaborate: in both you can set up timing schemes that affect the timing and probability of a nodal trigger. Setting up probability in Elysium is quite simple, actually; it’s just one of the dials on offer in the edit menu of the appropriate node type.
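To give a sense of what that dial amounts to, here’s a rough sketch in plain JavaScript of a probability-gated trigger (my own illustration, not Elysium’s actual code):

    // Rough sketch of a probability-gated trigger (not Elysium's code).
    // Each time a playhead reaches a node, the node fires only if a random
    // draw falls under its probability setting (0.0 to 1.0).
    function maybeTrigger(node, playNote) {
      if (Math.random() < node.probability) {
        playNote(node.pitch, node.velocity); // fires on this pass
        return true;
      }
      return false; // silent this pass; the playhead moves on regardless
    }

    // A node set to 0.25 will sound on roughly one pass in four.
    maybeTrigger({ pitch: 60, velocity: 96, probability: 0.25 },
                 (pitch, vel) => console.log("note on:", pitch, vel));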
Elysium could benefit from a little of Tiction’s simplicity in one case here, though: once you’ve established a pitch network that you like, saved it and reopened it (or even just stopped it once), restarting the piece triggers all the layers at once, leaving no way to recreate the fun of building up the piece’s density over time. In fact, I edited 2 of the 3 layers that Elysium created after the fact. In the case of Layer 3, which was sparse to begin with, I changed it so much it was hardly like the original at all. And three layers were all I dared create: at 300 ticks per minute, the CPU load on Logic was in the red for much of the time, and the timing between the layers regularly became unstable.
The pitch scheme of Elysium follows a pattern called a Harmonic Table, in which any three adjacent pitches form a triad. There’s nothing particularly restricting about this, though it does mean that 3rds, 6ths, P4ths and P5ths are the only intervals available between adjacent cells, and in this program proximity has rhythmic implications that cannot be worked around the way they can be in Nodal. One possibility that I haven’t explored much yet is the option to play a triad instead of a single pitch; you can choose which combination of proximate neighbors will sound the triad.
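As an aside, the Harmonic Table layout boils down to one small formula: a step along one diagonal adds a major third (4 semitones), a step along the other adds a minor third (3 semitones), and a step straight up (one of each) adds a perfect fifth. A quick sketch of that arithmetic (mine, not anything from Elysium’s source):

    // Sketch of the Harmonic Table pitch layout (not Elysium's code).
    // a = steps along the major-third diagonal (+4 semitones each)
    // b = steps along the minor-third diagonal (+3 semitones each)
    function harmonicTableNote(base, a, b) {
      return base + 4 * a + 3 * b;
    }

    const c = 48; // an arbitrary C to start from
    // A cell plus two of its upper neighbors forms a triad:
    console.log(harmonicTableNote(c, 0, 0)); // 48 = C (root)
    console.log(harmonicTableNote(c, 1, 0)); // 52 = E (major third)
    console.log(harmonicTableNote(c, 1, 1)); // 55 = G (fifth), i.e. a C major triad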
There are several interesting features unique to Elysium. One is the Ghost Tone, which here means an adjustable number of rhythmic repeats of the triggered pitch, from 1 to 16. A truly intriguing feature of this program is the possibility of applying LFOs to many parameters: ghost tones, tempo, transposition, velocity, and more. Alas, I could not get the LFOs to work, and the documentation says nothing about them. The probability feature works very well, though.
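To make the idea concrete: an LFO in this context is just a slow oscillator whose output gets mapped onto a parameter range. Something along these lines (a generic sketch, not Elysium’s internals):

    // Generic LFO sketch (not Elysium's internals): a sine wave at rateHz
    // sweeps a parameter back and forth between min and max.
    function lfoValue(timeSeconds, rateHz, min, max) {
      const phase = Math.sin(2 * Math.PI * rateHz * timeSeconds); // -1 .. 1
      return min + ((phase + 1) / 2) * (max - min);
    }

    // e.g. sweep note velocity between 40 and 110 over an 8-second cycle
    for (let t = 0; t <= 8; t += 2) {
      console.log("t=" + t + "s velocity=" + Math.round(lfoValue(t, 1 / 8, 40, 110)));
    }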
I’m guilty of misapplying a term: what is a node in Tiction and Nodal is a PLAYER in Elysium. And while the Perpetuum Mobile aspect is a strong characteristic of Elysium, the variety of players makes it possible, for an ear willing to tune in to these kinds of shifting landscapes and undulations, to inject a fair amount of surprise and change, however much one has to swim upstream to make that happen.
Once the program is more stable, if I could put a small vanity item on a wish-list, it would be to make the program visually more compelling. How? By, for example, allowing the colors of the table to be customized, and allowing a degree of transparency/translucency so that layers can be stacked and all the activity viewed from the top layer (like looking into a pool of dark water from above). I’m imagining a dark, translucent field with 3D lights flashing in patterns at different depths. But far more important than the cover of the book is its contents. I’ll be looking forward to seeing how this app develops over time.
Hi Charles.
Thanks for the review, you raise some interesting points. It’s actually very late (gone 1am) so I’ll be brief but maybe we could talk further another time?
The first thing is a quick note that Elysium is written in Objective-C, not Java, and uses CoreAudio & CoreMIDI directly. I’m a little surprised to hear you are having timing issues.
Which brings me to my second point. Are you routing MIDI into Logic via IAC? If so that might explain some of the timing issues and is, in any case, unnecessary. Logic will happily accept MIDI direct from Elysium.
Where you might find ongoing timing issues is between layers since there is, at present, no synchronization between the layers of a composition – even at the same tempo.
The last thing I wanted to mention was the LFOs. At various times in Elysium’s history I have done something that breaks the LFO code. However, in the 1.0 release they are, to the best of my knowledge, working (although Elysium itself is having a problem with Mac OS X 10.5.8 right now). If they’re not working for you and you are using 1.0, please let me know.
Thanks again for the review. I will be interested to hear how you get on with the app over time. And feel free to get in touch with me about any issues you’re having.
Regards,
Matt
Hi Matt,
Thanks for the corrections. I tried to retrace why I said Elysium was written in Java and can’t now, for the life of me, recall why I made that declaration. Sorry.
I downloaded an update this morning from 1.0 to 1.0.0 and the LFOs are working now. (And FYI, I’m using OS X 10.6.1.)
I could have sworn that prior to this update the tempo dial affected all layers when changed in one. If not, what’s the tempo sync checkbox for?
As for the sync issues, I am still having them. Yes, I was using the IAC. Of course you’re right, I can get the MIDI into Logic directly through CoreMIDI (I wasn’t doing it that way yesterday). But today, as I look at the MIDI data through MIDI Pipe, all I see coming out of Elysium is note on/off messages and channel info. There are no start/stop messages.
So, doesn’t that mean Elysium needs to slave to Logic? And if so, Logic has to send the start/stop and clock info to Elysium, right? So doesn’t that mean that some conduit (like IAC) is necessary to get that data into Elysium? Or am I missing something?
Many thanks,
~Charlie
Hi Charles.
More good points. Okay the ‘tempo sync’ checkbox should really be grayed out (or, for now, removed) since it relates to a forthcoming feature and does nothing.
You’re right that, at the moment, there is no sync between Logic (or any host) and Elysium. If you want to record the MIDI in, you have to start the Logic transport, start Elysium, and then shunt the resulting MIDI region to fit where it would have started if there had been sync.
However this is an area of active interest for me and, just this weekend, I have figured out how to at least have the Elysium player start & stop with the Logic transport. That will be implemented in a forthcoming release.
Note that this should not require any use of IAC as Logic can be configured to broadcast MIDI clock.
Elysium has a hierarchical structure. If you set the tempo dial in the player inspector this will set the tempo for all layers. However, for any particular layer, you can override the tempo dial and set a tempo specific to that layer. Are you sure you’re setting it in the right place? And that you haven’t overridden it? If not then maybe there is a new bug.
Slaving tempo is more complicated for a number of reasons. First, the MIDI clock method employs a stream of ‘clock’ messages, which is not especially compatible with how Elysium was designed. Second, Elysium’s layers have a certain independence, so some kind of ratio/scaling system would need to be introduced.
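For anyone following along: MIDI clock is just a stream of timing bytes, 24 ticks (0xF8) per quarter note, plus start (0xFA), continue (0xFB) and stop (0xFC), and a slave has to infer the tempo from the spacing of the ticks. Roughly like this (an illustration of the protocol, not code from Elysium):

    // Illustration of deriving tempo from incoming MIDI clock (not Elysium code).
    // MIDI clock sends 24 ticks per quarter note: 0xF8 = tick,
    // 0xFA = start, 0xFB = continue, 0xFC = stop.
    const TICKS_PER_QUARTER = 24;
    let lastTickTime = null;

    function onMidiRealtime(status, nowSeconds) {
      if (status === 0xFA) console.log("transport start");
      if (status === 0xFC) console.log("transport stop");
      if (status !== 0xF8) return;            // only clock ticks from here on
      if (lastTickTime !== null) {
        const secondsPerTick = nowSeconds - lastTickTime;
        const bpm = 60 / (secondsPerTick * TICKS_PER_QUARTER);
        console.log("estimated tempo: " + bpm.toFixed(1) + " BPM");
      }
      lastTickTime = nowSeconds;
    }

    // At 120 BPM a tick arrives every 60 / (120 * 24) seconds, about 0.0208s:
    onMidiRealtime(0xFA, 0.0);
    onMidiRealtime(0xF8, 0.0);
    onMidiRealtime(0xF8, 0.0208); // prints an estimate of roughly 120 BPM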
In terms of layer use, I introduced a feature before 1.0 called MIDI override. If you want to create multiple independent patterns then separate layers are the way to go. However, if your goal is to trigger different instruments, at different points, from one layer then look into MIDI override (it’s in the Note inspector).
This allows you, for any note, to specify the probability that it will trigger for any of MIDI channels #1 through #8.
I use this to introduce the probability of hits from specific instruments at certain points, without needing another layer.
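In loose pseudo-JavaScript the idea is something like this (a sketch of the concept, not the actual implementation):

    // Conceptual sketch of per-channel trigger probabilities (not the
    // actual implementation). Each of channels 1-8 gets its own chance
    // of receiving the note when it fires.
    function fireWithOverride(note, channelProbs, send) {
      channelProbs.forEach((p, i) => {
        if (Math.random() < p) send(i + 1, note); // channels are 1-based
      });
    }

    // e.g. always hit channel 1, and double it on channel 2 about a third of the time
    fireWithOverride(60, [1.0, 0.3, 0, 0, 0, 0, 0, 0],
                     (ch, note) => console.log("note " + note + " on channel " + ch));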
My last comment is to say that Elysium is not a simple application although I do strive not to make it any more complicated than necessary. One of the 1.0 goals was to update the documentation and to create some tutorial videos. But I just wasn’t getting around to it and didn’t want to keep holding up the release.
At some point, when I have more time, I’ll do both.
Regards,
Matt
Oh, I remember something I wanted to ask. When you talk about “Perpetuum Mobile” are you referring to how the patterns can become ‘dull by repetition’?
This is a key reason why I pursued both control over probability and LFOs so early in Elysium’s development: to respond to this issue. MIDI triggers (to allow some combination of human performance and generative work) are another feature I added recently to assist with this, and Mutex groups (the ability to set up OR relationships among tokens) are a feature I am working on right now.
I wish I could work on Elysium full-time but, alas, I must pay the bills 🙂
Matt
Oh another point before I get back to work.
You’re right in that the harmonic table does place certain constraints on how you work. However look also into the skip token which I added for another user. This allows you to jump the playhead over a number of cells in one move and can allow you to create irregular movements.
Something I plan to work on quite soon is expanding the triad playing into something more akin to a chord sequencer. Right now you can have Elysium play a single, fixed triad instead of a note. This turns out to be quite boring.
I want to allow users to create patterns of chords that notes can sequence through, although clearly this feature needs some thought to be properly useful, and my knowledge of harmonic theory is very limited. I’ll be looking for feedback from users to drive this kind of development.
Okay, I’m done now 🙂
Matt
Something else I thought I’d mention is that Elysium embeds a full Javascript environment. You will notice, scattered about the inspector, a number of buttons for creating callbacks which are implemented as Javascript functions.
This functionality is, so far, little developed but it’s something I am working on at present: exposing an API that will allow the composer to have more influence over their composition.
In a more advanced GUI app you might have timelines and visual editing but I don’t have the time or (presently) the skill for that so I’m going to expose it as code.
For example, something I need is to be able to have a generate token pulse on every beat, then go to sleep for some number of beats. Right now there isn’t really any way to do that from the UI.
However a simple layer script would be able to check the beat and decide whether to enable or disable any particular generator on a cell.
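As a flavour of what I mean, something like the following, where every name (onBeat, cellAt, generator.enabled) is hypothetical and will very likely differ from whatever API actually ships:

    // Hypothetical layer-script sketch. The names used here (onBeat,
    // layer.cellAt, generator.enabled) are placeholders, not a real API.
    function onBeat(layer, beat) {
      // pulse on one beat, then sleep for the next three
      const active = (beat % 4) === 0;
      const cell = layer.cellAt(3, 4);   // some cell holding a generate token
      if (cell && cell.generator) {
        cell.generator.enabled = active;
      }
    }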
You could also imagine how such scripts might *add or remove tokens or layers*. For example you could probably write a script to play a perverse form of Life, adding & removing generators on cells depending on the number of active playheads.
The possibilities, if you know a little Javascript (the API will be quite simple), should be quite interesting.
Ok, time for lunch.
Matt
Wow, Matt, lots to respond to. First, thanks a million for your thoughtful replies and commentaries.
Again, of course you’re right about Logic: I just need to set the clock to transmit on all ports. But when I think more about this, I recognize that you’ve got a conundrum with syncing, don’t you… given that it’s possible to control the tempo in Elysium with an LFO, if Logic is the master clock, wouldn’t that override the LFO? Alternatively, you’d need to send that LFO to Logic’s transport. How you’d do either one, I don’t know.
As for the Perpetuum Mobile comment, yes, I was referring to the potential for numbing repetition. But today, I played with applying several LFOs to a number of parameters (tempo, transposition, ghost tones, etc) and found the results becoming quite elegant (until it crashed – I sent a crash log to your website).
I’ll need to explore the ‘skip’ option. From your description, I can’t envision precisely how that will work (I understand it in Nodal — in Nodal it’s called a “wormhole.” Nice, right?), but it might become clearer in practice.
I meant to make mention in the main post about the possibility of scripting in Elysium, which is something I have no experience with but might be up for experimenting with. In part I meant to bring it up because it provides a segue to a discussion/description of Noatikl, which really seems to require scripting to get the program to produce something other than ambient music.
One thought I’d had about your triads (could be good or bad, but in any case I imagine it’d be easy given what’s already in the program) is to add an LFO option for them.
I hope you enjoyed your lunch!
~Charlie
With respect to tempo sync, my first plan is to have Elysium slave to the Logic tempo (rather than vice versa). But you’re right to identify that this is not so simple. This is where the tempo sync control will be involved.
A tempo sync’d layer will have a tempo relative to the incoming tempo. That might be 1:1 or it might be expressed in other ratios. An LFO in this case would probably affect the ratio somehow.
Other layers could remain independent and run at their own tempo.
I have in mind to add the concept of a ‘master clock’ to Elysium to drive the layers (at the moment they are too independent) that should make it easier to slave Elysium to external clock sources as well as improving consistency in timing between layers.
Skip is a token that you apply to a cell, like generate or note. It’s quite simple to use: add one in the path of a playhead, play with the skip value, and you will see what it does. You can add an LFO to the skip value.
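In rough terms the behaviour is this (a sketch, not the actual code):

    // Sketch of the skip behaviour (not the actual code): a playhead
    // normally advances one cell per step; a skip token jumps it over
    // extra cells in a single move.
    function advance(position, cell) {
      const jump = (cell && cell.skip) ? 1 + cell.skip.value : 1;
      return position + jump; // position counted in cells along the playhead's path
    }

    console.log(advance(5, null));                   // 6, a normal step
    console.log(advance(5, { skip: { value: 3 } })); // 9, jumping over 3 cells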
I’ve considered adding wormholes in the past primarily as a means to transfer tokens between layers rather than between cells of a single layer (although I see no reason why that shouldn’t be possible also) but it’s not a priority.
Other people have asked in the past for control over the colour scheme and that’s probably something I will do. Transparency also, assuming it’s easy to make the underlying window transparent as well.
Beyond that, though, since you are a working composer, I am interested to hear your feedback about how it could develop as a ~musical~ tool. If you have more ideas along those lines I’d be keen to listen.
Matt.
For what it’s worth, wormholes between layers would potentially help a piece evolve more slowly over time and thus be more musical. A user could effectively add new instruments in some order, adding one or two at a time rather than all at once as it is now. I guess you could build considerable offset between the generators on different layers, but…
Also, since you’re open to composerly feedback: having the various layers be *completely* independent isn’t musically useful, because (and all but the most avant-garde of composers would agree with this) even the most complicated (but playable) rhythmic relationships are always going to be expressed as ratios of simple integers: 1:1, 2:1, 3:1, 4:1, 3:2, 4:3, 5:4, etc. Therefore, if you offer the possibility of locking additional layers to one of those ratios, the rhythmic possibilities become exponentially more interesting. I imagine that also makes the tempo control issues simpler, no? Because no matter how the tempo changes, the ratio relationship between the layers wouldn’t change.
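To put numbers on it, the arithmetic is trivial, which is part of the appeal:

    // Back-of-the-envelope arithmetic for ratio-locked layer tempos.
    // Whatever the master tempo does, the ratios keep the layers related.
    function layerTempos(masterBpm, ratios) {
      return ratios.map(([a, b]) => masterBpm * a / b);
    }

    console.log(layerTempos(120, [[1, 1], [3, 2], [4, 3]])); // [120, 180, 160]
    // Change the master tempo and the relationships survive:
    console.log(layerTempos(90,  [[1, 1], [3, 2], [4, 3]])); // [90, 135, 120]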
I can see how wormholes could be useful. There are also timers, which could be used to enable layers over time, and scripts. But I guess the advantage of a wormhole is that it’s easy for the user to use. Okay, consider that one bumped up the list a little.
Your feedback about tempo is *exactly* the kind of thing I want to hear. I’m never quite sure how obvious this is but, when I started Elysium, I had no musical background or knowledge whatsoever (never played an instrument) so I made decisions largely as they seemed to make sense to me at the time and, of course, some of those decisions have worked out poorly.
Given that I am interested in tempo syncing with the host (and that this has implications for how layer tempo works) I shall certainly give some thought to how the tempo relationships between layers could be expressed.
I wouldn’t want to remove the possibility of free-running layers, but I can see how it would be useful to be able to vary the tempo of one layer and have the rest vary their tempos, in different ratios, in lockstep with it.
I’m slightly bogged down with a crash problem on 10.5 (but I have seen your bug report, thanks) but hope to work on these issues soon. In the meantime, since it was very simple to do, I have implemented one of your requests. Here’s a teaser for a forthcoming version:
http://screencast.com/t/2ONdDdMWOkrj
Matt.
Wow, Matt, I’m honored that you made a change at my suggestion. I’ll look forward to the next release. In the meantime, if I could apply the Socratic method of questioning… why is it that you want to keep the various layers’ timing completely independent?
As for you not having musical training, don’t sweat it. Sometimes that’s exactly where innovation comes from. And in the larger sense as it relates to technology, it’s an issue that we’re all too close to, chronologically speaking, to have any real perspective on it. But I do know that the relationship between musical *training* and music *making* is in flux right now. History will make up its mind later.