Notes on Live.JS at JSConf.Asia 2016

I had sketched some ideas for the Web Audio Hackday before flying to Singapore, but I really had to finish them before the actual day came. So I spent Friday doing intense labour: testing, testing and more testing with the latest versions of the browsers that implement MediaRecorder. Since I was in the business of talking about and promoting this API, I was determined to get more people to try it out, but that meant I needed to be prepared for the “surprises” that working with new and upcoming APIs entails. Namely, that what worked yesterday might not work anymore, and what didn’t might have started to work without prior warning.
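For the curious, the kind of code I was testing over and over looks roughly like this: a minimal, defensive sketch of MediaRecorder usage (the five-second duration and the logging are just mine, for illustration):

```js
// Minimal, defensive MediaRecorder usage: feature-detect everything,
// because what works can change from one browser version to the next.
function recordFiveSeconds() {
  if (typeof MediaRecorder === 'undefined') {
    console.warn('MediaRecorder is not supported in this browser');
    return;
  }

  navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
    var recorder = new MediaRecorder(stream);
    var chunks = [];

    recorder.ondataavailable = function (event) {
      chunks.push(event.data);
    };

    recorder.onstop = function () {
      // The recorded chunks can be assembled into a blob and played back
      var blob = new Blob(chunks, { type: recorder.mimeType });
      console.log('Recorded', blob.size, 'bytes of', blob.type);
    };

    recorder.start();
    setTimeout(function () {
      recorder.stop();
    }, 5000);
  });
}
```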

That said, my infinitely curious self could not fly to a city as beautiful as Singapore without exploring at least a little bit of it: I squeezed some time out of the morning to visit the Marina Bay gardens, which again I found baffling (botanical gardens which are cooled down instead of heated!), then did a lot of work, and then later in the afternoon I went to the CSS Conf/JS Conf venue, where Live.JS was going to happen.

And “What is Live.JS?”, you might be wondering. Basically, it’s a collective dedicated to making audiovisual shows using JavaScript. JavaScript!!! It’s a bunch of people, but that doesn’t mean you will always get the same people at each “show”. In this case, it was

  • Matt McKegg, who played music using his own Loop Drop instrument for live performances. He built it as a Chrome App initially, then migrated it to Electron when that platform was starting to stagnate. He uses two Novation Launchpad controllers, and the sound generation itself is via Web Audio. And it is really impressive, if you ask me!
  • Ruth John, who was VJing. I’m not sure what her software is called, but she makes heavy use of CSS variables (to animate stuff), SVG to draw elements on screen, and of course Web Audio’s analyser nodes to modify the values of those variables in response to the music being played (there’s a rough sketch of this trick right after this list). Also Web MIDI, so she can use external MIDI controllers to do things such as changing layer opacities and switching between effects at the turn of a knob, etc.
  • Martin Schuhfuss, who was controlling the lights in the venue using the DMX512 protocol (which happens to be a close MIDI relative) and a Monumental Hack that abuses CSS property interpolation to run light animations: colour, rotation, focus, etc. This was quite impressive to hear about!
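I obviously don’t know how Ruth’s software is actually built, but the analyser-to-CSS-variables trick can be sketched in a few lines; everything below (including the `--level` variable name) is my own guess at the technique, not her code:

```js
// Guesswork sketch: drive a CSS custom property from Web Audio analysis.
var audioContext = new AudioContext();
var analyser = audioContext.createAnalyser();
analyser.fftSize = 256;

// A source (microphone, <audio> element, etc.) would be connected here:
// source.connect(analyser);

var data = new Uint8Array(analyser.frequencyBinCount);

function update() {
  analyser.getByteFrequencyData(data);
  var sum = 0;
  for (var i = 0; i < data.length; i++) {
    sum += data[i];
  }
  // Normalise the average level to 0..1 and expose it to CSS,
  // where it can drive opacities, transforms, SVG styling...
  var level = (sum / data.length / 255).toFixed(3);
  document.documentElement.style.setProperty('--level', level);
  requestAnimationFrame(update);
}
update();
```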

I’ve personally been a big fan of Matt’s music since I saw him live at CampJS, so I was excited about this event. And of course, I was quite intrigued about the visual side of things. Hearing about the techniques underneath made me giggle and be in awe at the same time: “can’t this be done in a better, proper way, or are these hacks just used for the sheer pleasure of subverting the original purpose of the technology?”, I wondered.

In the meantime, “special cocktails” were served—although they were actually called “codetails” in the leaflet:

  • ES4: Old Fashioned
  • jQuery: Rum Punch
  • Outsourced: Curry and tonic
  • Unicode: Unicorn tears (non-alcoholic)

I had “a jQuery” as it seemed the sweetest of the lot, and chatted with a bunch of people, then went and admired Matt’s total Loop Drop skills, and wondered again if I’d ever understand how it works for reals.

Then I focused my attention on what the lights were doing: many things. I guess my issue was exactly that: they were doing many things, and I was kind of expecting some sort of carefully built progression, which was never going to happen, because the Live.JS people had only met a day or two before, and Martin had only had access to the light equipment that same day; obviously you don’t travel from Germany with a rack of professional light equipment. There’s only so much you can prepare, and given that Matt’s music is so improvisational, I should probably not have expected a heavily synchronised audiovisual show. Still, I couldn’t help but admire two facts:

  • a browser was interpolating between CSS values, and ultimately this was generating MIDI commands that caused spotlights to move and change colours (I’ve attempted a sketch of the idea below);
  • the DMX512 standard worked well enough that he could just show up and control the venue’s DMX512 lights with his software, without further changes.

That was super cool!
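I can only guess at the details of Martin’s setup, but the gist, as I understood it, is that you let the browser do the easing for you: animate a CSS property on a dummy element, sample its computed value every frame, and convert that into light-control messages. A rough sketch, where `sendDMX` is an entirely hypothetical stand-in for whatever actually talks to the MIDI/DMX interface:

```js
// Guesswork sketch: abuse CSS transitions as an interpolation engine for lights.
var proxy = document.createElement('div');
proxy.style.opacity = '0';
proxy.style.transition = 'opacity 3s ease-in-out';
document.body.appendChild(proxy);

// Hypothetical: in the real setup this would send data to a DMX512 interface.
function sendDMX(channel, value) {
  console.log('DMX channel', channel, '->', value);
}

function sample() {
  var opacity = parseFloat(getComputedStyle(proxy).opacity);
  sendDMX(1, Math.round(opacity * 255)); // e.g. a dimmer channel
  requestAnimationFrame(sample);
}
requestAnimationFrame(sample);

// Changing the property kicks off the transition: the browser interpolates,
// and the sampling loop turns the intermediate values into light commands.
requestAnimationFrame(function () {
  proxy.style.opacity = '1';
});
```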

People often fixate on the business value of JavaScript, but it’s important to also consider JavaScript as a creative medium, one with a lot of expressive potential and also reach, by virtue of being online. I’m glad that Live.JS exists to inspire and support this, and I’d love to see even more exploration in this field. Bring it on! 🤘🏼

Here’s the video of this event:

Berlin Web Audio Hack Day 2014

As with the Extensible Web Summit, we wrote some notes collaboratively. Here are the notes for the Web Audio Hack Day!

We started the day with me being late because I made a series of badly timed bad decisions that ended with me taking the wrong U-Bahn lines. In short: I don’t know how to metro in Berlin in the mornings, and I’m still so sorry.

I finally arrived at SoundCloud’s offices, and it was cool that Jan was still doing the presentations; Tiffany gave me a giant glass of water and I almost drank it all while they finished. Then I set up my computer and proceeded to give my talk/workshop!

It was an improved and revised version of the beta talk I gave at Mozilla London the week before last:


Note to self: maybe remove red banners behind me if wearing a red shirt, so as not to blend with them
Continue reading “Berlin Web Audio Hack Day 2014”

Audio for the masses

The video above is from LXJS, the Lisbon JavaScript conference, which happened more than a month ago. I gave this talk again last week at VanJS, so I decided it was time for the belated write-up.

If you want to follow along, or play with the examples, the slides are online and you can also check out the code for the slides.

As I’ve given this talk several times, I keep changing bits of the content each time depending on what the audience seems most interested in, and I also sometimes improvise things that I don’t remember when writing the final write-up. So if you were at any of the talks and see that something’s missing or different, now you know why! I’ve also added a section at the end with frequent questions I’ve been asked; hope that’s useful for you too.

Continue reading “Audio for the masses”

EdgeConf London, Audio Tags, and Web MIDI

EdgeConf

I am going to be on the Web Components panel at EdgeConf London tomorrow (21st of March).

Being the perfectionist, nit-picky person I am, and having never been on a panel of this type, I’m obsessed with getting ready and reading as much as I can on the subject. I still have a bunch of proposed questions to go through, but I hope I’ll do well. Wish me luck!

The conference is supposedly going to be streamed live and talks recorded and professionally subtitled, which is awesome because speech recognition and my command of English don’t seem to mix.

Audio Tags

Also, I forgot to post about my latest Mozilla Hacks article: Audio Tags: Web Components + Web Audio = ♥. It is a write-up of my CascadiaJS 2013 talk of the same name, only with better-looking audio tags, less jetlag, and a lot of editing; thanks to Angelina for that!

Web MIDI

Good things came out of the article: somehow some people learned about my interest in Web Audio, then I showed my existing experiments and begged for Web MIDI in Firefox so I didn’t have to use node.js as an intermediary. Then Kyle Machulis (also known as qDot) decided to take the “implement the Web MIDI API in Firefox” bug. Which means that sometime soon we’ll have support for Web MIDI in Firefox. Which means that a little bit later we could have Web MIDI in Firefox OS and Firefox for Android. Which means… well, do you really need me to explain how cool that can be?

Imagine controlling synthesisers from your really portable device, or using MIDI input devices to control things on your phone… all in JavaScript. There is a huge range of portable audio devices specifically targeted at iPads: there is clearly a market for portable audio creation. Why not make it easier?
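To give you an idea, the amount of JavaScript needed to talk to a hardware synth through Web MIDI is tiny. A minimal sketch (the note number and timing are arbitrary choices of mine):

```js
// Minimal Web MIDI sketch: find an output and play a note on it.
navigator.requestMIDIAccess().then(function (midiAccess) {
  var output = midiAccess.outputs.values().next().value;
  if (!output) {
    console.log('No MIDI outputs found');
    return;
  }
  output.send([0x90, 60, 0x7f]); // note on: middle C, full velocity
  output.send([0x80, 60, 0x40], performance.now() + 500); // note off 500 ms later
});
```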

But we need help. Kyle is also working on Gaia, and he can’t implement it all himself. I’m glad the ever-helpful Chris Wilson (one of my personal developer heroes) is always keen to give advice (example), but we need more than that. Building Web MIDI support implies building a Hardware Abstraction Layer (HAL to its friends) between the browser ‘core’ and the MIDI interfaces of each operating system. We’re talking about at least three different HALs: Linux, Mac OS, and Windows. And this is where you can step in and become another Firefox hero! If you’re interested, ping me or Kyle, or probably just ask in the bug.