Imagine a world where you can plug a digital controller into a computer, open a browser, and start controlling a sophisticated, algorithmically generated music environment at the touch of a button. With the Web MIDI API landing in Chrome 43, this kind of experience will soon be available to anyone running a modern browser. This session will explore building a MIDI-controllable generative synthesizer LIVE, using the Web MIDI API to handle controller input and the Web Audio API for synthesis.
The talk will trace the history of the MIDI protocol and follow its path to finally being supported in browsers. It will introduce some basic music theory (what a key is, what an octave is) along with some basic composition theory. It will then introduce digital signal processing, the concept of a unit generator, how the Web Audio API works, and how you can use it to build a basic synthesizer. This will all be presented as a live coding session, building a working instrument from first principles (and a couple of modules).
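As a small taste of the kind of code the session will build, here is a minimal sketch (function names are my own, not from the talk) of the two core steps: reading a raw Web MIDI note-on message and mapping its note number to a frequency you could feed to a Web Audio oscillator:

```javascript
// MIDI note numbers map to frequencies via equal temperament:
// f = 440 * 2^((note - 69) / 12), where note 69 is A4 (440 Hz).
function midiNoteToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// A Web MIDI message arrives as a byte array [status, note, velocity].
// A status byte of 0x90-0x9F means "note on" (the low nibble is the channel).
function parseNoteOn(data) {
  const [status, note, velocity] = data;
  if ((status & 0xf0) === 0x90 && velocity > 0) {
    return { note, frequency: midiNoteToFrequency(note), velocity };
  }
  return null; // not a note-on (note-off is often sent as note-on with velocity 0)
}
```

In the browser, the returned frequency would typically be assigned to an `OscillatorNode`'s `frequency` value inside a `midimessage` event handler.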
By the end of the session, we will have a working instrument that can make some music we can all groove to!