How it should have gone: SEAMUS 2011 Presentation

November 28, 2010

Alright, so I just presented my latest and greatest research at a paper session at SEAMUS 2011 in Miami. Things went very well, but I presented all of the nitty-gritty details and technical points about what I was doing. At NIME or even ICMC, the audience would have taken that and run with the possibilities. Unfortunately, I think I may have given the wrong presentation here at SEAMUS, and many may have missed the point. Here's what I probably should have added.

Distributed Performance Systems using HTML5 and Ruby on Rails.

  • What is this? A way to split up an instrument across most internet-enabled devices...
  • Why would we do this? To let many, many people collaborate on a piece of music in real time.
  • Can't this already be done? Sure, if you want to learn how to code and adapt your interface for every platform you wish to support - Mac, Windows, Linux, iOS, Android, BlackBerry, Symbian, PalmOS, Windows Mobile. Sound easy? Not to mention the hassle of porting graphical user interfaces - sliders, buttons, waveform selection - to a different graphics framework on each device. Hah! Then support it!
  • So why do it your way? Three reasons: the browser is one of the very few cross-platform standards projects that can support this kind of interaction; HTML5 introduces new tags (such as canvas and audio) that make custom graphical user interfaces possible; and Ruby on Rails is an excellent and very quick way to build sustainable web applications that can handle the distribution and coordination of the interface.
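To make "distribution of the interface" concrete, here is a minimal plain-Ruby sketch of one way a server could hand each connecting device a unique slice of the instrument's controls. Every class, method, and widget name here is a hypothetical illustration, not code from the actual piece.

```ruby
# Sketch: partitioning an instrument's controls among connecting devices.
# All names (Instrument, register, the widget list) are hypothetical.
class Instrument
  WIDGETS = %w[slider_a slider_b button_x button_y waveform_picker xy_pad].freeze

  def initialize
    @assigned = Hash.new { |hash, key| hash[key] = [] }
    @next_widget = 0
  end

  # Each new device gets the next widget in round-robin order, wrapping
  # around so every control stays covered no matter how many devices join.
  def register(device_id)
    widget = WIDGETS[@next_widget % WIDGETS.size]
    @next_widget += 1
    @assigned[device_id] << widget
    widget
  end
end

instrument = Instrument.new
instrument.register("iphone-1")   # => "slider_a"
instrument.register("android-2")  # => "slider_b"
```

In a real Rails app this assignment would live server-side and the chosen widget would be rendered to the device as HTML5, so the same logic serves every platform at once.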

    Envision this... As you look across the stadium at the next Super Bowl or Olympics, everyone reaches into their pockets and purses. They pull out their mobile devices and open up a web page. At this point, each person is given a unique set of graphical controls - they just appear on the screen regardless of device or platform. Yes, the iPhones are friends with the Android phones, BlackBerrys, PalmOS devices, iPads, and laptops. All is right with the world...

    Timidly at first, but with greater and greater zeal, people touch their screens. Sound comes out over the loudspeakers. Each person initiates and has an effect on the audio for the entire stadium. As people start to get what is happening and the performance builds to a deafening roar of individual inputs, the lights and sound go out. In the dim colorful glow of each device, the centralized sound gives way to a chirping coming from each individual device. Chirping around the entire stadium becomes evident, like a wash of cicadas or tree frogs. You look up across the stadium to your fellow performers and start to see colorful patterns emerge. The individual screens are used like pixels, lighting up the faces of their holders. The patterns become slow-motion video, coordinated by a server in the cloud. Each performer/audience member is overcome by the collaborative experience of such an event and vows to support open source projects of experimental music and digital media for the remainder of their collectively enriched lives.

    This experience is possible. I'm working on a piece as you read this. HTML5 and Rails can solve the issues of cross-platform GUIs, coordination and distribution of instruments, and communication to and from many instrument nodes all over the world. All kinds of net art can become inclusive and performative.
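The coordination half of that claim can be sketched as a toy publish/subscribe relay in plain Ruby - a stand-in for what a Rails app pushing over a socket or long-poll channel would do. All names here are hypothetical, not the actual application's code.

```ruby
# Sketch: a coordinator that relays one performer's control change to
# every other instrument node. Names are hypothetical illustrations.
class Coordinator
  def initialize
    @nodes = {} # device_id => list of messages delivered to that node
  end

  def join(device_id)
    @nodes[device_id] = []
  end

  # Broadcast a parameter change from one node to all the others,
  # but never echo it back to the sender.
  def publish(from_id, param, value)
    @nodes.each do |id, inbox|
      next if id == from_id
      inbox << { param: param, value: value, from: from_id }
    end
  end

  def inbox(device_id)
    @nodes.fetch(device_id)
  end
end

c = Coordinator.new
c.join("phone-1")
c.join("phone-2")
c.publish("phone-1", :cutoff, 0.7)
c.inbox("phone-2") # phone-2 receives the change; phone-1's inbox stays empty
```

The same shape scales outward: the server only routes small parameter messages, while each browser renders its own HTML5 interface and makes its own sound.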

    Are you excited? I am.

    Super Bowl coordinators, Olympic Committee... I'm waiting!
