Breakpoint demolog, day 10: deltaFrames and me

Ahhh, so it wasn't going to be that easy peasy! The first time I pressed play (with the Amp envelope turned on and applied to the output) I just got a crazy distorted sound. The Amp envelope was outputting values in the range of thousands. Why? This code used to work!

I checked everything line by line, method by method... all seemed fine, until I found that I was getting bad values in the reported time. Interpolate linearly with a time value that isn't quite in the range you expect and you get wrong results (read: values in the 7000s instead of [0..1]).
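
Just to illustrate the failure mode (this isn't my envelope code, just a minimal sketch of plain linear interpolation): feed it a normalized t in [0..1] and the result stays between the endpoints, feed it a raw frame count and it happily extrapolates into the thousands.

```cpp
#include <cstdio>

// Plain linear interpolation between two envelope points.
static float lerp(float a, float b, float t)
{
    return a + (b - a) * t;
}

int main()
{
    // With t in [0..1] the result stays between the endpoints...
    printf("%f\n", lerp(0.0f, 1.0f, 0.5f));    // 0.5
    // ...but pass a time value that is way out of range (e.g. a frame
    // count instead of a normalized position) and the "envelope"
    // extrapolates into the thousands.
    printf("%f\n", lerp(0.0f, 1.0f, 7000.0f)); // 7000.0
    return 0;
}
```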

In a way it was to be expected, since my approach was very naive and I wasn't properly using the deltaFrames value I get each time I receive a new MIDI event (i.e. note on, note off).

It seems that the way VST hosts and MIDI events work is this: the host always processes blocks of N frames (where N is constant). MIDI events can happen at any point in time, so it is possible that a MIDI event shows up in the middle of one of those blocks. That's what the deltaFrames parameter is for: it specifies the offset, in frames, within the block at which the received event happens. When playing the synth live, i.e. pressing keys on the keyboard, deltaFrames seems to always be 0. Which makes sense, because the host can't anticipate when you are going to press or release a key.
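
Here's a tiny sketch of that mapping (the block size and the numbers are made up, and these aren't the real VST SDK structs): an event sitting at some absolute frame position in the song becomes a block index plus a deltaFrames offset into that block.

```cpp
#include <cstdio>

constexpr int kBlockSize = 512; // N: the host's constant block length

int main()
{
    long eventFrame  = 70000;                               // absolute position of a note-on
    long block       = eventFrame / kBlockSize;             // which block it falls into
    int  deltaFrames = static_cast<int>(eventFrame % kBlockSize); // offset inside that block

    // The host delivers this event together with block #136, tagged with
    // deltaFrames = 368, i.e. "this happens 368 frames into the block".
    printf("block %ld, deltaFrames %d\n", block, deltaFrames);

    // A key pressed live has no future position the host could know about,
    // so it just shows up at the start of the next block: deltaFrames = 0.
    return 0;
}
```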

Back to my problem: I was sending the note events to the synth as soon as I got them, in the processEvents function, instead of just storing each event together with its deltaFrames and getting back to them the next time a chunk of audio data was requested. The result of this wrong approach was that the synth was starting and/or stopping notes before the right moment.
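
Roughly, the fix looks like this. It's just a sketch with made-up names (PendingEvent, Synth, etc.), not the actual SDK types: processEvents() only queues the incoming events, and the next audio callback renders up to each event's deltaFrames offset before applying it, so notes start and stop at the right sample inside the block.

```cpp
#include <vector>

struct PendingEvent {
    int deltaFrames;           // frame offset inside the next audio block
    unsigned char status;      // 0x90 = note on, 0x80 = note off
    unsigned char note, velocity;
};

static std::vector<PendingEvent> gPending;

// Called by the host before the audio block: just store the events.
void processEvents(const PendingEvent* events, int numEvents)
{
    gPending.assign(events, events + numEvents);
}

struct Synth {
    void noteOn(int /*note*/, int /*vel*/) { /* start a voice here */ }
    void noteOff(int /*note*/)             { /* release the voice here */ }
    void render(float* out, int frames)    { for (int i = 0; i < frames; ++i) out[i] = 0.0f; }
};

// Called by the host to render one block of sampleFrames frames.
void processReplacing(Synth& synth, float* out, int sampleFrames)
{
    int pos = 0;
    for (const PendingEvent& ev : gPending) {
        // Render audio up to the point where the event happens...
        if (ev.deltaFrames > pos) {
            synth.render(out + pos, ev.deltaFrames - pos);
            pos = ev.deltaFrames;
        }
        // ...then apply the event exactly at that offset.
        if (ev.status == 0x90)      synth.noteOn(ev.note, ev.velocity);
        else if (ev.status == 0x80) synth.noteOff(ev.note);
    }
    // Render the remainder of the block after the last event.
    if (pos < sampleFrames)
        synth.render(out + pos, sampleFrames - pos);
    gPending.clear();
}

int main()
{
    Synth synth;
    PendingEvent ev[] = { { 368, 0x90, 69, 100 } }; // note on, 368 frames into the block
    processEvents(ev, 1);

    float out[512] = {};
    processReplacing(synth, out, 512); // the note starts exactly at frame 368
    return 0;
}
```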

It looks very straightforward now, but I spent quite a lot of time tracking this silly little thing down!

In any case, this has given me some ideas for packing the song data later (when I export it from Renoise). In other words, instead of using the classic tracker layout, maybe it's easier to export in a more MIDI-style way, placing one event after another for all the channels in the song. The nice thing about this approach is that I'd know exactly when certain things happen (for example, instrument 2 playing note A-5), since the events would already be ordered by ascending time, ready to be sent from the player to the synth voices in order during playback. Hmmm...
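
Something along these lines, maybe (the SongEvent struct and its fields are just a hypothetical layout, not an actual Renoise export): one flat, time-sorted list of events for every channel, so playback becomes a single forward walk through the list.

```cpp
#include <algorithm>
#include <vector>
#include <cstdint>

struct SongEvent {
    uint32_t frame;       // absolute position (in frames or ticks) from the start of the song
    uint8_t  instrument;  // which instrument/voice the event is for
    uint8_t  note;        // e.g. A-5
    bool     noteOn;      // true = note on, false = note off
};

int main()
{
    std::vector<SongEvent> song = {
        { 48000, 2, 81, true  },  // instrument 2 plays A-5
        {     0, 1, 60, true  },
        { 96000, 2, 81, false },
    };

    // Events from all channels interleaved, then sorted by ascending time:
    // "when does instrument 2 play A-5?" becomes a simple lookup, and the
    // player just feeds the list to the synth voices in order.
    std::sort(song.begin(), song.end(),
              [](const SongEvent& a, const SongEvent& b) { return a.frame < b.frame; });
    return 0;
}
```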

No screenshot or audio sample today, everything is horribly broken! Boooh!