I have started numbering this series from zero out of habit, because that's how SuperCollider counts (as do, I think, most programming languages except Lua?!). I have changed the way I number things in my daily life so that I don't experience so much cognitive dissonance. Therefore this is part 2, i.e. the latest in a series of three. Hm. (I guess this is how some parts of the world number their floors: 0 is the ground floor, 1 is the next one up, etc., which seems logical enough.)
Speaking of three, there are (I think) three things I want to accomplish in this part.
Maybe you can see where I'm taking inspiration.... Anyway I actually think I will tackle these things in reverse order.
Assuming you have followed along so far, or installed my final classes from the previous installment, this should work:
( // 0. synth and pattern definitions
s.waitForBoot {
  SynthDef(\sin, { |out, freq = 100, gate = 1, amp = 0.1, preamp = 1.5, attack = 0.001, release = 0.01, pan, verbbus, verbamt, vibrato = 0.2|
    var env, sig;
    var lfo = XLine.ar(0.01, vibrato, ExpRand(0.5, 2.0)) * SinOsc.ar(5.4 + (LFDNoise3.kr(0.1) * 0.5));
    gate = gate + Impulse.kr(0);
    env = Env.adsr(attack, 0.1, 0.4, release).ar(2, gate);
    sig = SinOsc.ar(freq * lfo.midiratio) * env;
    sig = (sig * preamp).tanh;
    sig = Pan2.ar(sig, pan, amp);
    Out.ar(out, sig);
    Out.ar(verbbus, sig * verbamt);
  }).add;

  ~pattern = Pbind(
    \instrument, \sin,
    \freq, (Pfunc({ { (100, 150 .. 650).choose + rrand(0.0, 3.0) }.dup(rrand(1, 2)) }) * Pdup(Pwhite(1, 4), Prand((0.5, 1 .. 3.5), inf))),
    \dur, Pbrown(0.0, 1.0).linexp(0.0, 1.0, 0.01, 1.0) * Pwhite(0.8, 1.0),
    \pan, Pbrown(-0.5, 0.5, 0.1),
    \dummy, Prand([1, Rest()], inf),
    \legato, Pwhite(0.0, 1.0).linexp(0.0, 1.0, 0.01, 1.0) + Pwrand([0, 2, 10], [0.9, 0.09, 0.01], inf),
    \attack, Pbrown().linexp(0.0, 1.0, 0.0001, 1.0),
    \release, Pbrown(0.0, 1.0, 0.25).linexp(0.0, 1.0, 0.05, 0.5),
    \vibrato, Pwhite(0.1, 0.3),
    \preamp, Pbrown().linexp(0.0, 1.0, 1.0, 1.5),
    \db, Pbrown(-32, -18, 1.0)
  );
};
)

( // 1. timeline definition
~timeline = ESTimeline([
  ESTrack([
    ESPatternClip(0, 5, ~pattern, 1),
    ESPatternClip(6, 2.2, ~pattern, 2),
    ESPatternClip(11, 10, ~pattern, 35),
  ])
])
)

( // 2. visualization
Window.closeAll;
~window = Window("Timeline", Rect(0, Window.availableBounds.height - 500, Window.availableBounds.width, 500)).front;
~view = ESTimelineView(~window, ~window.bounds.copy.origin_(0@0), ~timeline);
)

// 3. play and stop the timeline
~timeline.play
~timeline.stop
We can see that broadly our timeline plays the contents of these clips at the appropriate times, but it would be great to see some visualization of the actual notes it is playing.
So the first thing we need from each clip is some simplified representation of the pitch and volume data it contains. Since this will be somewhat CPU-intensive to call frequently, we will make a variable to cache the result of this, and when we want to re-calculate the data, we will set this variable to nil.
/* ESPatternClip.sc */
var drawData;
Now we will make a drawData method that will return the cached data if it exists, and otherwise calculate it.
• if the clip is not seeded, we will need to figure out a way not to just spit out random data. For now, I will ignore this possibility.
/* ESPatternClip.sc */

// data to be drawn on the clip
drawData {
  // variables we will use later
  var stream, t;

  // return cached data if it exists
  if (drawData.notNil) { ^drawData; };

  // otherwise calculate it
  stream = this.patternToPlay.asStream;
  drawData = [];
  t = 0.0;

  // keep adding events until we're past the duration (or the stream has run out)
  while { t < duration } {
    // use default Event as the proto event (we need this to be sure of calculating the keys later)
    var event = stream.next(Event.default);
    if (event.notNil) {
      // if the stream has not run out:
      var simpleEvent = event.use { (
        freq: event.freq,
        amp: event.amp,
        sustain: event.sustain,
        dur: event.dur,
        // because it doesn't work to specify the isRest parameter directly,
        // as mentioned in part 0
        restdummy: if (event.isRest) { Rest() } { 1 }
      ) };
      drawData = drawData.add(simpleEvent);
      t = t + simpleEvent.dur;
    } {
      // if the stream has run out:
      t = inf;
    };
  };

  ^drawData;
}
This basically filters out the relevant information we will need to draw the notes on our timeline.
Recompile, then:
~timeline.tracks[0].clips[0].drawData
to see the raw data we will be visualizing. Each event is a note, with freq, amp, sustain (how long the note sounds), and dur (how long until the next note starts). The event will return true to .isRest if any of its values, including restdummy, is a Rest object.
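As a quick sanity check, you can post one line per event (an illustrative snippet, assuming the ~timeline from the "where we left off" code above):

```supercollider
// post freq, sustain, dur, and rest status for each event in the first clip
~timeline.tracks[0].clips[0].drawData.do { |event|
  [event.freq, event.sustain, event.dur, event.isRest].postln;
};
```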
The first thing we will do is add an endTime method to the timeline view:
/* ESTimelineView.sc */
endTime { ^startTime + duration; }
Now, we revise the init method of our track view:
/* ESTrackView.sc */
init { |argtrack|
  var timelineView = this.parent;
  track = argtrack;
  this.drawFunc_({ |view|
    Pen.use {
      // reverse the clips so earlier ones can draw on top of later ones.
      track.clips.reverse.do { |clip, j|
        // only draw clips in the timeline view bounds, for efficiency
        if ((clip.startTime < timelineView.endTime) and: (clip.endTime > timelineView.startTime)) {
          var left = timelineView.absoluteTimeToPixels(clip.startTime);
          var width = timelineView.relativeTimeToPixels(clip.duration);
          // call the clip's draw method
          clip.draw(left, 2, width, this.bounds.height - 4);
        };
      };
    }
  });
  this.acceptsMouse_(false);
}
This just hands each clip responsibility for its own drawing.
Moving the previous drawing code into the clip class:
/* ESClip.sc */

// draw this clip on a UserView using Pen
draw { |left, top, width, height|
  Pen.color = Color.hsv(0.5, 0.5, 0.5);
  Pen.addRect(Rect(left, top, width, height));
  Pen.fill;
  // if it's more than 5 pixels wide, call the prDraw function
  if (width > 5) {
    this.prDraw(left, top, width, height);
  };
}

// and you need to make a dummy prDraw method:
prDraw { }
And now a prDraw method for the pattern clip class:
// draw the pattern data onto a UserView using Pen
prDraw { |left, top, width, height|
  var t = 0.0;
  this.drawData.do { |event|
    if (event.isRest.not) {
      var x = left + (t * width / duration);
      var eventWidth = event.sustain * width / duration;
      var eventHeight = 2;
      event.freq.asArray.do { |freq|
        var y = freq.explin(20, 20000, height, top);
        Pen.color = Color.gray(1, event.amp.ampdb.linlin(-60.0, 0.0, 0.0, 1.0));
        Pen.addRect(Rect(x, y, eventWidth, eventHeight));
        Pen.fill;
      };
    };
    t = t + event.dur;
  };
}
Now, when you recompile and execute the "where we left off" code above, you should see this:
Zoom in to see the finer detail:
I am hoping that this framework using prDraw in each clip subclass to define the way that particular type of clip should be drawn will serve us well down the line.
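To illustrate how a future clip type could hook into this, here is a hedged sketch. ESExampleClip is invented for this example and does not exist in the library; all it has to do is subclass ESClip and override prDraw:

```supercollider
// hypothetical example: a clip subclass that draws a zigzag across its rectangle.
// the background rectangle and the width > 5 check are inherited from ESClip.draw.
ESExampleClip : ESClip {
  prDraw { |left, top, width, height|
    Pen.color = Color.white;
    Pen.moveTo(left@(top + (height / 2)));
    10.do { |i|
      // alternate between the top and bottom edge of the clip
      Pen.lineTo((left + (width * (i + 1) / 10))@(top + (height * (i % 2))));
    };
    Pen.stroke;
  }
}
```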
First we will add a convenience method togglePlay to our timeline class:
/* ESTimeline.sc */
togglePlay {
  if (this.isPlaying) { this.stop } { this.play }
}
And now add to the timeline view init method:
/* ESTimelineView.sc */
init { |argtimeline|
  ...
  this.keyDownAction = { |view, char, mods, unicode, keycode, key|
    if (char == $ ) { timeline.togglePlay };
  };
}
Now, when you recompile again and execute the "where we left off" code above, you can start and stop the playback by pressing the space bar.
First I will add a transparent UserView on top of our whole timeline view, onto which I will draw a 2-pixel wide rectangle representing our timeline's now value.
(
~playheadView = UserView(~view, ~view.bounds.copy.origin_(0@0))
.acceptsMouse_(false)
.drawFunc_({
  Pen.use {
    var left = ~view.absoluteTimeToPixels(~timeline.now);
    var height = ~view.bounds.height;
    Pen.addRect(Rect(left, 0, 2, height));
    Pen.color = Color.black;
    Pen.fill;
  };
});
)
and now a Routine that updates this value 30 times per second:
(
var waitTime = 30.reciprocal; // 30 fps
~playheadRout.stop;
~playheadRout = {
  inf.do {
    ~playheadView.refresh;
    waitTime.wait;
  };
}.fork(AppClock) // lower priority clock for GUI updates
)
So now when you press the spacebar, you will see the playhead moving along in time with the playback. Well, almost. If you are using the default server settings and look carefully, you will notice that the sound you hear occurs 1/5 of a second after the playhead reaches each note. This is because of server latency, a small intentional delay applied automatically when a pattern is played.
Server.default.latency // -> 0.2
The reason for this, and its implementation, goes pretty deep. Patterns automatically apply this latency, and when you wrap a server command in an s.bind { ... } (which we will get into later), this latency is applied. In general, it is a good thing to apply some latency (the default 0.2 is a pretty high value) with server commands, since the timing of sclang -> OSC -> server is prone to some jitter. The latency provides a time buffer for messages to arrive so that the relative timing of the sounding events stays in sync.
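To make the mechanism concrete, here is a small sketch using the \sin SynthDef defined above. Both lines make a sound, but only the first is scheduled with latency:

```supercollider
// sent inside an OSC bundle timestamped s.latency seconds in the future,
// so the server plays it at a precisely scheduled time:
s.bind { Synth(\sin, [\freq, 220]) };

// sent as an immediate message, so it sounds as soon as it arrives,
// subject to language-side timing jitter:
Synth(\sin, [\freq, 330]);
```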
• For more information about server latency, see this thread on the user forum started by Nathan Ho, or his SuperCollider Tips blog post.
So what I'm thinking is, I will add a method to the timeline class that represents the "sounding" now: when the timeline is playing, this will be now minus the default server's latency, converted from seconds to beats by multiplying by the play clock's tempo. (For simplicity I will assume for now that everything is played on the default server.)
/* ESTimeline.sc */
soundingNow {
  if (isPlaying) {
    ^max(playbar, this.now - (playClock.tempo * Server.default.latency));
  } {
    ^this.now;
  }
}
Now we will visualize this playhead in black, and when the timeline is playing, we will also see the "scheduling playhead" in light gray:
(
Window.closeAll;
~window = Window("Timeline", Rect(0, Window.availableBounds.height - 500, Window.availableBounds.width, 500)).front;
~view = ESTimelineView(~window, ~window.bounds.copy.origin_(0@0), ~timeline);
~playheadView = UserView(~view, ~view.bounds.copy.origin_(0@0))
.acceptsMouse_(false)
.drawFunc_({
  var left, height = ~view.bounds.height;
  Pen.use {
    // sounding playhead in black
    left = ~view.absoluteTimeToPixels(~timeline.soundingNow);
    Pen.addRect(Rect(left, 0, 2, height));
    Pen.color = Color.black;
    Pen.fill;
    if (~timeline.isPlaying) {
      // "scheduling playhead" in gray
      Pen.color = Color.gray(0.5, 0.5);
      left = ~view.absoluteTimeToPixels(~timeline.now);
      Pen.addRect(Rect(left, 0, 2, height));
      Pen.fill;
    };
  };
});
)
The only reason I want to keep the "scheduling" gray playhead at all is that we will be looking at sequencing functions soon and I anticipate it will be a little complicated getting them to line up properly with the sounding events. (If you have a function that calls s.bind { ... }, then you want it to execute at the scheduled time. But if you have a function that e.g. changes the color of a light or sends a MIDI note, it should happen in sync with the sounding events.) But I will get to this later.
Now I would like to only have our playhead refresh routine running while the timeline is playing (otherwise it updates 30 times per second when nothing is changing). In order to do this, we need to know when the timeline plays, and when it stops. We will modify our timeline play and stop methods to notify dependants when the value of isPlaying is changed:
/* ESTimeline.sc */
stop {
  tracks.do(_.stop);
  //playbar = this.now;
  isPlaying = false;
  this.changed(\isPlaying, false);
}

play { |startTime, clock|
  // stop if playing
  if (isPlaying) { this.stop };
  isPlaying = true;
  this.changed(\isPlaying, true);
  ...
Now we recompile, and here is what we can do with this:
(
var stop = {
  // stops the playhead routine
  ~playheadRout.stop;
  ~playheadView.refresh;
};
var play = {
  // our playhead routine code from before
  var waitTime = 30.reciprocal; // 30 fps
  ~playheadRout.stop; // just to make sure
  ~playheadRout = {
    inf.do {
      ~playheadView.refresh;
      waitTime.wait;
    };
  }.fork(AppClock) // lower priority clock for GUI updates
};

// this will be called whenever ~timeline is changed
~updateFunc = { |timeline, what, value|
  if (what == \isPlaying) {
    if (value) { play.() } { stop.() };
  };
};
~timeline.addDependant(~updateFunc);
)
And to remove the dependant:
~timeline.removeDependant(~updateFunc);
and now the playhead doesn't update when you press play.
While doing this I added a bit of code to move the playhead to the click point if there was no drag during the click and the timeline isn't playing. Otherwise it's all repurposed code from before.
/* ESTimelineView.sc */
ESTimelineView : UserView {
  var <timeline;
  var <trackViews, <playheadView, playheadRout;
  var <startTime, <duration;
  var <trackHeight;
  var clickPoint, clickTime, scrolling = false, originalDuration;

  *new { |parent, bounds, timeline, startTime = -2.0, duration = 50.0|
    ^super.new(parent, bounds).init(timeline, startTime, duration);
  }

  init { |argtimeline, argstartTime, argduration|
    var width = this.bounds.width;
    var height = this.bounds.height;

    startTime = argstartTime;
    duration = argduration;
    timeline = argtimeline;

    trackHeight = (height - 20) / timeline.tracks.size;
    trackViews = timeline.tracks.collect { |track, i|
      var top = i * trackHeight + 20;
      ESTrackView(this, Rect(0, top, width, trackHeight), track)
    };

    // playhead view from before
    playheadView = UserView(this, this.bounds.copy.origin_(0@0))
    .acceptsMouse_(false)
    .drawFunc_({
      var left;
      Pen.use {
        // sounding playhead in black
        left = this.absoluteTimeToPixels(timeline.soundingNow);
        Pen.addRect(Rect(left, 0, 2, height));
        Pen.color = Color.black;
        Pen.fill;
        if (timeline.isPlaying) {
          // "scheduling playhead" in gray
          Pen.color = Color.gray(0.5, 0.5);
          left = this.absoluteTimeToPixels(timeline.now);
          Pen.addRect(Rect(left, 0, 2, height));
          Pen.fill;
        };
      };
    });

    this.drawFunc_({
      var division = (60 / (this.bounds.width / this.duration)).ceil;
      Pen.use {
        Pen.color = Color.black;
        (this.startTime + this.duration + 1).asInteger.do { |i|
          if (i % division == 0) {
            var left = this.absoluteTimeToPixels(i);
            Pen.addRect(Rect(left, 0, 1, 20));
            Pen.fill;
            Pen.stringAtPoint(i.asString, (left + 3)@0, Font("Courier New", 16));
          }
        };
      };
    }).mouseWheelAction_({ |view, x, y, modifiers, xDelta, yDelta|
      var xTime = view.pixelsToAbsoluteTime(x);
      view.duration = view.duration * yDelta.linexp(-100, 100, 0.5, 2, nil);
      view.startTime = xTime - view.pixelsToRelativeTime(x);
      view.startTime = view.startTime + (xDelta * view.duration * -0.002);
    }).mouseDownAction_({ |view, x, y, mods|
      clickPoint = x@y;
      clickTime = this.pixelsToAbsoluteTime(x);
      originalDuration = duration;
      if (y < 20) { scrolling = true } { scrolling = false };
    }).mouseUpAction_({ |view, x, y, mods|
      // if the mouse didn't move during the click, move the playhead to the click point:
      if (clickPoint == (x@y)) {
        if (timeline.isPlaying.not) {
          timeline.now = clickTime;
        };
      };
      clickPoint = nil;
      clickTime = nil;
      scrolling = false;
      originalDuration = nil;
      this.refresh;
    }).mouseMoveAction_({ |view, x, y, mods|
      if (scrolling) {
        var yDelta = y - clickPoint.y;
        var xDelta = x - clickPoint.x;
        if (mods.isAlt) {
          // hold option to zoom in opposite direction
          yDelta = yDelta.neg;
        };
        duration = (originalDuration * yDelta.linexp(-100, 100, 0.5, 2, nil));
        startTime = (clickTime - this.pixelsToRelativeTime(clickPoint.x));
        startTime = (xDelta.linlin(0, this.bounds.width, startTime, startTime - duration, nil));
        this.refresh;
      };
    }).keyDownAction_({ |view, char, mods, unicode, keycode, key|
      if (char == $ ) { timeline.togglePlay };
    });

    // call update method on changed
    timeline.addDependant(this);
    this.onClose = { timeline.removeDependant(this) };
  }

  // called when the timeline is changed
  update { |argtimeline, what, value|
    // our play/stop logic from before
    if (what == \isPlaying) {
      if (value) {
        var waitTime = 30.reciprocal; // 30 fps
        playheadRout.stop; // just to make sure
        playheadRout = {
          inf.do {
            playheadView.refresh;
            waitTime.wait;
          };
        }.fork(AppClock) // lower priority clock for GUI updates
      } {
        playheadRout.stop;
        playheadView.refresh;
      };
    };
  }

  // helper methods:
  relativeTimeToPixels { |time| ^(time / duration) * this.bounds.width }
  absoluteTimeToPixels { |clipStartTime| ^this.relativeTimeToPixels(clipStartTime - startTime) }
  pixelsToRelativeTime { |pixels| ^(pixels / this.bounds.width) * duration }
  pixelsToAbsoluteTime { |pixels| ^this.pixelsToRelativeTime(pixels) + startTime }

  startTime_ { |val| startTime = val; this.refresh; }
  duration_ { |val| duration = val; this.refresh; }
  endTime { ^startTime + duration; }
}
Note that when you register an object as a dependant of another object, it will call the update method on that object. Otherwise you will see that not much has changed from our interactive code snippets.
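In miniature, the whole dependant mechanism looks like this (a standalone sketch, independent of the timeline classes; a Function used as a dependant has the update message forwarded to the function itself, which is why both plain functions and objects with an update method work):

```supercollider
(
var model = Object.new;
var observer = { |who, what, value|
  "% changed to %".format(what, value).postln;
};
model.addDependant(observer);
model.changed(\isPlaying, true);  // posts: isPlaying changed to true
model.removeDependant(observer);
model.changed(\isPlaying, false); // posts nothing
)
```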
Now, recompile and run the "where we left off" code from before: you have an accurate real-time playhead when you press the space bar, and you can adjust the starting point by clicking. It still zooms and scrolls when you click and drag the tick marks (or use the mouse wheel).
This is, for me, the beginning of a usable timeline: zooming in and out, visualizing notes, moving the playhead, and playing and stopping.
The problem of editing is one we will leave for later.
Download my classes from this installment here.
See you in the next one where we add new types of clip that play Synths and Routines.