Wednesday, September 05, 2018

The Possibilities of Web MIDI With TypeScript

If you’ve ever had any experience with music technology, or more specifically sequencers, keyboards or synthesisers, you will have come across MIDI (Musical Instrument Digital Interface). It’s used to send note and controller messages between devices that play and record music, such as keyboards or sequencers, and devices that produce sounds, such as samplers or synthesisers. It’s pure control information, for example “play a C# in the 3rd octave with a velocity of 85”; there’s no actual audio involved. It dates back to the early 1980s, when a group of musical instrument manufacturers, including Roland, Sequential Circuits, Oberheim, Yamaha and Korg, got together to define the standard. It soon led to a huge boom in low cost music production and the genesis of new musical styles. It’s no accident that rap and electronic dance music date from the mid to late ’80s.

Web MIDI is a new W3C specification for an API that allows browser applications to access MIDI input and output devices on the host machine. You can enumerate the devices, then choose to listen for MIDI messages, or format and send your own. It’s designed to allow applications to consume and emit MIDI information at the protocol level, so you receive and send the actual raw message bytes, rather than the API providing a means to play MIDI files using General MIDI, for example. Don’t let this put you off though; the protocol is very simple to interpret, as I’ll demonstrate later.
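Just to give a feel for what working at the protocol level means, here’s a minimal sketch of the sending side, assuming the same @types/webmidi type definitions I use later in this post. It plays middle C for one second on every available output device; the exact bytes are explained in the message format discussion below:

// Send a raw note on (0x90: note on, channel 0) for middle C (note 60)
// at full velocity (127) to every output device, then schedule the
// matching note off one second later.
window.navigator.requestMIDIAccess()
    .then((midiAccess) => {
        midiAccess.outputs.forEach((output) => {
            output.send([0x90, 60, 127]);
            output.send([0x80, 60, 0], window.performance.now() + 1000);
        });
    });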

The potential for a large new class of browser based musical applications is huge. The obvious examples are things like browser based sequencers and drum machines emitting MIDI messages, with synthesisers and samplers on the consuming side using Web Audio, another interesting new standard. But it goes much wider than that: the MIDI protocol is ideally suited to any real-time parameter control. It’s already widely used for lighting rigs and special effects in theatrical productions, for example. Also, because it’s such an established standard, there are all kinds of cheaply available hardware controller interfaces full of knobs and buttons. If you’ve got any application that requires physical control outside the range of keyboard/mouse/trackpad, MIDI might be a solution. Imagine a browser based application that lets you turn knobs on a cheap MIDI controller to tweak the parameters of a mathematical visualisation, or control some network based industrial system, or even act as a new input for browser based games. The possibilities are endless.


I’m going to show a simple TypeScript example. I’m currently working on a TypeScript application that consumes MIDI, and I couldn’t find much good example code, so I’m hoping this might help. I’m using the type definitions from here: https://www.npmjs.com/package/@types/webmidi.

The entry point into the new API is a new method on navigator, requestMIDIAccess. This returns a Promise<MIDIAccess>; the resolved MIDIAccess object lets you enumerate the input and output devices on the system. Here I’m just looking for input devices:

window.navigator.requestMIDIAccess()
    .then((midiAccess) => {
        console.log("MIDI Ready!");
        for (const [id, input] of midiAccess.inputs) {
            console.log("MIDI input device: " + id);
            input.onmidimessage = onMidiMessage;
        }
    })
    .catch((error) => {
        console.log("Error accessing MIDI devices: " + error);
    });

Here I’ve bound my onMidiMessage function to the onmidimessage event on every input device, so MIDI events are processed as they arrive from any device. This is the simplest possible scenario; in a real application it would be better to let your user choose the device they want to use, as in the sketch below.
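Here’s a minimal sketch of what that might look like, listening to a single, user-chosen device instead. The chosenId parameter is hypothetical, a stand-in for whatever your device picker UI returns; onMidiMessage is the handler defined below:

// Attach the handler to one chosen input device rather than all of them.
// `chosenId` would come from a device picker in your UI.
function listenToDevice(midiAccess: WebMidi.MIDIAccess, chosenId: string): void {
    const input = midiAccess.inputs.get(chosenId);
    if (input) {
        console.log(`Listening to MIDI input device: ${input.name} (${input.id})`);
        input.onmidimessage = onMidiMessage;
    } else {
        console.log(`No MIDI input device found with id: ${chosenId}`);
    }
}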

MIDI events arrive as byte arrays with a length of 1 to 3 bytes. The first byte is always the ‘status’ byte. The four most significant bits are the status type; here we’re only concerned with note on (9) and note off (8) messages. The four least significant bits tell us the MIDI channel. This allows up to 16 different devices, or voices, to be controlled by a single controller device. If you ignore the channel, as we’re doing here, it’s known as OMNI mode.

For note on/off messages, the second byte is the note number and the third is the velocity, or how loud we want the note to sound. The note number describes the frequency of the note using the classical western chromatic scale; good luck if you want to make Gamelan dance music! The notes go from C0 (around 8 Hz) to G10 (around 12,543 Hz) in the octave numbering used by the code below. That’s much wider than a grand piano keyboard and sufficient for the vast majority of applications. See the code for how to convert the note number to a name and octave, and the Wikipedia page on MIDI for more details.
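As an aside, the relationship between note number and frequency is the standard equal temperament formula, with A4 (note 69) tuned to 440 Hz. This helper isn’t part of the example, just a quick illustration:

// Convert a MIDI note number (0-127) to its frequency in Hz using
// twelve-tone equal temperament, with A4 (note 69) at 440 Hz.
function noteToFrequency(note: number): number {
    return 440 * Math.pow(2, (note - 69) / 12);
}

console.log(noteToFrequency(0));   // ~8.18 Hz (C0 in this numbering)
console.log(noteToFrequency(69));  // 440 Hz
console.log(noteToFrequency(127)); // ~12543.85 Hz (G10)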

In this example we filter for note on/off messages, then write the command type, channel, note name, octave and velocity to the console:

let noteNames: string[] = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"];

function onMidiMessage(midiEvent: WebMidi.MIDIMessageEvent): void {
    let data: Uint8Array = midiEvent.data;
    if(data.length === 3) {
        // status is the first byte.
        let status = data[0];
        // command is the four most significant bits of the status byte.
        let command = status >>> 4;
        // channel 0-15 is the lower four bits.
        let channel = status & 0xF;

        console.log(`Command: ${command.toString(16)}, Channel: ${channel.toString(16)}`);

        // just look at note on and note off messages.
        if(command === 0x9 || command === 0x8) {
            // note number is the second byte.
            let note = data[1];
            // velocity is the third byte.
            let velocity = data[2];

            let commandName = command === 0x9 ? "Note On " : "Note Off";

            // calculate octave and note name.
            let octave = Math.trunc(note / 12);
            let noteName = noteNames[note % 12];

            console.log(`${commandName} ${noteName}${octave} ${velocity}`);
        }
    }
}
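One wrinkle worth knowing about: many devices send a note on with a velocity of zero instead of a dedicated note off message. A more robust handler would treat the two as equivalent, with something like this hypothetical helper in place of the direct comparison above:

// Many devices signal note off as a note on with velocity 0, so treat
// command 0x9 with zero velocity as a note off too.
function isNoteOff(command: number, velocity: number): boolean {
    return command === 0x8 || (command === 0x9 && velocity === 0);
}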

Here’s the output from the example. I’m using VMPK (Virtual MIDI Piano Keyboard) to play the notes. You’ll also need a MIDI loopback device such as loopMIDI if you want to connect software devices, but it should be plug and play with a hardware controller:

[screenshot: console output showing the logged note on/off messages]

So there we have it: MIDI is now very easy to integrate into a browser based application, as I’ve demonstrated with just a few lines of code. It opens up possibilities for a new class of software, and not just for musical applications. It’s going to be very interesting to see what people do with it.

2 comments:

mmmveggies said...

I am writing a TS application as well and I really like https://github.com/djipco/webmidi for its simplicity - no type declarations in the module itself, but the API is well documented, e.g. for the `noteon` event: http://djipco.github.io/webmidi/latest/classes/Input.html#event_noteon

Unknown said...

"It’s already widely used for lighting rigs and special effects in theatrical productions for example"

That's actually not quite right. While lighting control does sometimes use MIDI for connecting control surfaces to the lighting system, the protocol that's generally used by lighting is DMX512 over a 3-pin XLR cable (though these days things are moving toward Art-Net).