Tag Archives: midi

Speaking at OneShotLondon NodeConf

“Just turn it into a node module,” and other mantras Edna taught me

The story of leaving behind a random mix of Python + PHP + Bash + Makefile + SCons scripts to totally embrace Node, modules, standard callbacks, browserify, and friends to build toys that bleep and bloop with MIDI, WebGL and Web Audio.

As you can maybe deduce, this might not be your average super-expert node.js talk, but a story of learning with a non-appointed mentor and a spontaneous community, of improving and making the most out of node.js, and of how it informed and shaped the rest of my coding philosophy, both inside and outside of Mozilla.

I must confess that I’m really humbled and flattered to be amongst this amazing line-up of true node experts.

UUUUUUUHHH THE EXPECTATIONS!—feeling a bit like an impostor now.

Next next Saturday, the 19th of July. See you there? :-)

EdgeConf London, Audio Tags, and Web MIDI

EdgeConf

I am going to be on the Web Components panel at EdgeConf London tomorrow (21st of March).

Being the perfectionist, nit-picky person I am, and never having been on a panel of this type, I’m obsessed with getting ready and reading as much as I can on the subject. I still have a bunch of proposed questions to go through, but I hope I’ll do well. Wish me luck!

The conference is supposedly going to be streamed live and talks recorded and professionally subtitled, which is awesome because speech recognition and my command of English don’t seem to mix.

Audio Tags

Also, I forgot to post about my latest Mozilla Hacks article: Audio Tags: Web Components + Web Audio = ♥. It is a write-up of my CascadiaJS 2013 talk of the same name, only with better-looking audio tags, less jetlag, and a lot of editing (thank Angelina for that!).

Web MIDI

Good things came out of the article: somehow some people got to learn about my interest in Web Audio, so I showed them my existing experiments and begged (okay, cried) for Web MIDI in Firefox so I didn’t have to use node.js as an intermediary. Then Kyle Machulis (also known as qDot) decided to take the “Web MIDI API in Firefox” bug. Which means that sometime soon we’ll have support for Web MIDI in Firefox. Which means that a little bit later we could have Web MIDI in Firefox OS and Firefox for Android. Which means… well, do you really need me to explain how cool that can be?

Imagine controlling synthesisers with your really portable device, or using MIDI input devices to control things on your phone… all in JavaScript. There is a huge range of portable audio devices specifically targeted at iPads: there is a market for portable audio creation. Why not make it easier?
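To give you a taste, this is roughly what listening to a MIDI keyboard looks like with the Web MIDI API as currently specified (so, for now, only in browsers that already implement it; the note-handling logic here is just an illustration):

navigator.requestMIDIAccess().then(function(midiAccess) {
  // Listen to every connected MIDI input (e.g. a keyboard)
  midiAccess.inputs.forEach(function(input) {
    input.onmidimessage = function(event) {
      var command = event.data[0] & 0xf0; // status byte, channel bits stripped
      var note = event.data[1];
      var velocity = event.data[2];
      if (command === 0x90 && velocity > 0) {
        console.log('note on:', note, 'velocity:', velocity);
      } else if (command === 0x80 || (command === 0x90 && velocity === 0)) {
        console.log('note off:', note);
      }
    };
  });
}, function() {
  console.error('No MIDI for you!');
});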

But we need help. Kyle is also working on Gaia, and he can’t implement it all himself. I’m glad the ever-helpful Chris Wilson (one of my personal developer heroes) is always keen to give advice (example), but we need more than that. Building Web MIDI support implies building a Hardware Access Layer (HAL for friends) between the browser ‘core’ and the MIDI interfaces of each operating system. We’re talking about at least three different HALs: Linux, Mac OS and Windows. And this is where you can step in and become another Firefox hero! If interested, ping me or Kyle, or probably just ask in the bug.

Breakpoint demolog, day 10: deltaFrames and me

Ahhh, so it wasn’t going to be that easy peasy! The first time I pressed play (with the Amp envelope turned on and applied to the output) I just got a crazy distorted sound. The Amp envelope was outputting values in the range of thousands. Why? This code used to work!

I checked everything line by line, method by method… everything seemed fine, until I found that I was getting bad values in the reported time. Interpolating linearly with a time value that isn’t quite in the range you expect gets you wrong results (read: values in the 7000s instead of [0..1]).
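A trivial illustration with made-up numbers (not my actual envelope code):

function lerp(a, b, t) {
  return a + (b - a) * t;
}

lerp(0, 1, 0.5);  // 0.5: t is in [0..1], all good
lerp(0, 1, 7000); // 7000: t never was in [0..1], garbage out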

In a way, it was to be expected, since my approach was very naive and I was not properly using the deltaFrames value that I get each time I receive a new MIDI event (i.e. note on, note off).

It seems that the way VST hosts and MIDI events work is this: the host always processes blocks of N frames (where N is constant). MIDI events can happen at any point in time, so it is possible that a MIDI event shows up in the middle of one of those blocks. That’s what the deltaFrames parameter is for: it specifies the offset in frames, within the next block, at which the received event is supposed to happen. When playing the synth live, i.e. pressing keys on the keyboard, deltaFrames seems to always be 0. Which makes sense, because the host can’t anticipate when you are going to press or release a key.

Back to my problem: I was sending the note events to the synth as soon as I got them, in the processEvents function, instead of just storing each event and its deltaFrames somewhere and getting back to them the next time a chunk of audio data was requested. The result of this wrong approach was that the synth was starting and/or stopping notes before the right moment.
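In JavaScript-flavoured pseudocode (the real synth is C++, and all these names are made up for illustration), the corrected flow looks roughly like this:

// Host pushes MIDI events in: just queue them with their frame offset
var pendingEvents = [];

function processEvents(events) {
  events.forEach(function(ev) {
    pendingEvents.push(ev); // keeps ev.deltaFrames and the MIDI data
  });
}

// Host asks for the next block of N frames: only trigger each queued
// event once we reach its deltaFrames offset within this block
function processBlock(output, numFrames) {
  for (var frame = 0; frame < numFrames; frame++) {
    pendingEvents.forEach(function(ev) {
      if (ev.deltaFrames === frame) {
        handleMIDIEvent(ev); // note on/off finally reaches the voices here
      }
    });
    output[frame] = renderNextSample();
  }
  pendingEvents = []; // all of them have been consumed by now
}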

It looks very straightforward now, but I spent quite a lot of time tracking this silly little thing down!

In any case, this has given me some ideas for packing the song data later (when I export it from Renoise). In other words, instead of using the classic tracker style, maybe it’s easier to export in a more MIDI-style way, placing one event after the other, for all the channels in the song, as sketched below. The good thing about this approach is that I could know exactly when certain things happen (for example, instrument 2 playing note A-5), since the events would already be ordered by ascending time, and they could be sent from the player to the synth voices in order when playing. Hmmm…
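Something like this, with a completely hypothetical format:

// Hypothetical export: one flat list of events for all channels,
// already sorted by ascending time, instead of tracker-style patterns
var songEvents = [
  { time: 0,    type: 'noteOn',  instrument: 1, note: 'C-4' },
  { time: 0,    type: 'noteOn',  instrument: 2, note: 'A-5' },
  { time: 1024, type: 'noteOff', instrument: 2, note: 'A-5' },
  // … and so on, for the whole song
];

// The player just walks the list, dispatching events whose time has come
function dispatchDueEvents(now, nextIndex) {
  while (nextIndex < songEvents.length && songEvents[nextIndex].time <= now) {
    sendToVoice(songEvents[nextIndex]); // illustrative helper
    nextIndex++;
  }
  return nextIndex;
}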

No screenshot or audio sample today, everything is horribly broken! Boooh!

get mobile

And two songs of mine converted into MIDI ringtones for your cellphone delight! Drumu and sardu asked me if I could convert them for their mobile phones, and months later, here they are. Enjoy pikapikapolka and hyperaktive dancing (from the scene of the girls demo by ppg). Pocafeina rulez… By the way, I don’t have any polyphonic phone to test these on, so they might sound like shit. You know… deaf editing ;)