This is a video of my mobile phone + laptop live improvisation at NYCEMF. My main focus this time around was to convey the liveness of using the mobile phone. Cause and perceived effect are important in experimental electronics performance. I feel that too often the performative quality of live electronic music gets lost in the process of designing one’s own patch, instrument, setup, and composition. I wanted a robust set of options for controlling sound, but not in a way that required my eyes to be fixed on the phone screen the whole time. In other words, I wanted to emulate what traditional instrumental performers such as clarinetists do, but with the mobile phone. How’d I do?
This is my latest attempt to make the mobile phone a robust instrument for manipulating sound live. Here I am playing a single sound file, but with multiple positions in that file available for playback depending on how I rotate the phone about the x axis and then push on the screen.
In considering other instruments and the way sound is produced, when I play the guitar for example, my left hand must locate a given fret or collection of frets before I am able to make sound with my right hand. This particular mobile phone instrument behaves similarly.
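The actual instrument is a Max patch, but the tilt-to-position mapping at its core can be sketched in Python. This is a minimal illustration under my own assumptions (rotation reported in degrees from -90 to 90, eight playback positions); the function name and ranges are illustrative, not taken from the patch itself:

```python
def playback_position(tilt_x, num_positions=8, tilt_range=(-90.0, 90.0)):
    """Map rotation about the x axis to one of several playback start
    positions in the loaded sound file (illustrative sketch, not the
    actual Max patch). Tilting 'locates' a position, like a left hand
    finding a fret; a screen push would then trigger playback."""
    lo, hi = tilt_range
    t = max(lo, min(hi, tilt_x))          # clamp to the usable tilt range
    frac = (t - lo) / (hi - lo)           # normalize to 0..1
    return min(num_positions - 1, int(frac * num_positions))
```

Tilting fully back selects position 0, fully forward selects the last position, and a touch event would then start playback from the selected point — the tilt alone makes no sound, just as fretting a note makes none until the right hand strikes.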
This piece has been submitted and accepted for NYCEMF 2016, to be held June 13-19 in New York City.
This is a portion of a lecture I recently gave at Moorpark College. The premise is that the creation of music is fundamentally shaped by the technology of the instruments we use. Turntable artists would never have developed live remix techniques had there been an easier way to loop recordings. The full slide deck used in the lecture is provided below.
I occasionally write for the Fibonacci Fine Arts Digest, a print publication based in Salt Lake City. The digest is meant to promote the arts in the greater Mountain West. My first article came out Fall 2014 in Volume 1.2, and I’ve written for every issue since.
They decided to put one of my articles online, about the Moab Music Festival. It’s a decent read, but the festival is really worth knowing about – truly unique. Visit their site at www.moabmusicfest.org for more information.
Photo credit: Elizabeth Leslie
This is the slide deck for a lecture I gave at Moorpark College upon returning from ICMC in Athens. As my students are undergrads being introduced to OSC for the first time, this lecture was presented with very broad strokes to illustrate the motivations behind the poster/paper that was presented in Athens.
I really should write much more here, but for now I’ll display the poster that my brother kindly designed for my co-author, David Reeder, and myself. We got some solid feedback, we started a mailing list, and now our paper is posted on the github page of an interesting effort called OSC Query.
I also can’t resist posting a photo I took my last morning in Athens:
Dates: Mondays, June 9, 16, and 30, 7-10pm
Location: Arroyo Vista Recreation Center, Moorpark, CA
Cost: $110 per person if we can get 10 students. $90 if we get 15 (8 currently confirmed).
Instructor: Nathan Bowen (look around my site to get to know me)
Session 1. Using simple objects to build complex stuff.
We will use the metro object and the random object to create complex polyrhythms and ambient textures, with MIDI messages that can easily port to Logic, Live, Cubase, or your sequencer of choice.
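For a feel of what those two objects do together before the session, here is a rough Python analogue (the Max patch itself is graphical; the function name and the two-octave pitch band are my own illustrative choices):

```python
import random

def polyrhythm_events(intervals_ms, duration_ms, seed=0):
    """Simulate several 'metro' objects firing at different rates,
    each picking a random MIDI pitch on every bang, the way a
    [random] object would. Returns (time_ms, interval_ms, pitch)
    tuples sorted by time -- a simple polyrhythmic note stream."""
    rng = random.Random(seed)
    events = []
    for interval in intervals_ms:
        t = 0
        while t < duration_ms:
            pitch = 48 + rng.randrange(24)  # random pitch, C3..B4
            events.append((t, interval, pitch))
            t += interval
    return sorted(events)
```

Two metros at 300 ms and 500 ms already interlock into a 3-against-5 feel; recording the resulting MIDI into Logic, Live, or Cubase is then just a matter of routing.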
Session 2. Building a wobble synth and other FM synthesis topics
We will build a wobble bass synth from the ground up. We’ll also explore signal processing effects such as delay lines, flanging, and chorus, and build a pretty powerful push-button sampler. If there is strong interest, we can explore FFT as well.
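The synth itself will be built in Max, but the core idea — FM with a slow LFO sweeping the modulation depth — can be sketched per-sample in Python. The parameter values here are illustrative defaults, not the session patch:

```python
import math

def wobble_fm(t, carrier=55.0, ratio=2.0, index=5.0, wobble_rate=2.0):
    """One sample of an FM 'wobble' voice at time t (seconds).
    A slow sine LFO sweeps the modulation index between 0 and its
    maximum, producing the characteristic wub-wub timbre."""
    lfo = 0.5 * (1.0 + math.sin(2 * math.pi * wobble_rate * t))   # 0..1
    mod = math.sin(2 * math.pi * carrier * ratio * t) * index * lfo
    return math.sin(2 * math.pi * carrier * t + mod)
```

When the LFO is near 0 the output is a plain sine at the carrier frequency; as it rises, sidebands bloom and the tone gets buzzy — that periodic opening and closing is the wobble.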
Session 3. Mobile phone (or wiimote) as controller, augmenting acoustic instruments.
If you play the electric guitar, you’re probably familiar with effects pedals. Here we’ll treat Max as a customizable effects pedal: taking audio from the guitar, routing it through Max, and affecting the sound based on accelerometer data from the phone (or wiimote). The idea is to apply the concepts from Session 2 here. Not that we’ll go quite this far, but take a look at this for inspiration.
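As a sketch of the kind of mapping this session aims at — in Python rather than Max, with made-up names — phone tilt could set the feedback of a delay line the guitar signal passes through:

```python
def tilt_to_feedback(tilt, max_feedback=0.9):
    """Map a normalized tilt value (assumed -1..1) to delay feedback,
    capped below 1.0 so the loop stays stable."""
    t = max(-1.0, min(1.0, tilt))
    return (t + 1.0) / 2.0 * max_feedback

def delay_line(samples, delay_samples, feedback):
    """Feedback delay on a block of samples: each input is summed with
    a scaled copy of the output from delay_samples earlier, similar in
    spirit to a tapin~/tapout~ feedback loop in Max."""
    buf = [0.0] * delay_samples
    out = []
    for i, x in enumerate(samples):
        y = x + feedback * buf[i % delay_samples]
        buf[i % delay_samples] = y
        out.append(y)
    return out
```

Feeding an impulse through a 3-sample delay at 0.5 feedback yields the expected decaying echo train (1.0, 0.5, 0.25, …); in performance, rolling the phone forward would lengthen that tail in real time.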
Note: When you buy a spot via Paypal, please provide your email address. I will then send you the first week’s set of patches in advance. They’ll start with a simple patch, then another patch that adds a little more, moving step-by-step. You’ll be encouraged to tweak each patch as a means of understanding what’s going on.
As a result of my recent visit to UC Irvine, my dissertation is getting some use. Chapter 3, “Mobile Phones in Music,” has been assigned as reading in their Computer Audio and Music Programming class, taught by Christopher Dobrian and my colleague Kojiro Umezaki. I’m very pleased to have been invited, and am glad my work is beneficial to their coursework.
UC Irvine has recently begun offering a graduate degree in Integrated Composition, Improvisation, and Technology.
I’ve been thinking lately about how mobile phone interface design tends to be screen-centric. Here I attempted to use TouchOSC with no buttons or sliders, designing a Max patch that leverages only the phone’s accelerometer. I’m not so much interested in the sonic results here, but am definitely thinking about how different the experience is when I do not need to look at the phone’s screen to make noise. More to come…
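For the curious, the input stage of such a patch can be sketched in Python, assuming the phone’s accelerometer arrives as three floats (in TouchOSC’s case, via OSC messages). The mapping below — deviation from resting gravity drives amplitude — is my own illustration, not the actual patch:

```python
import math

def accel_to_amplitude(x, y, z, rest=1.0):
    """Turn raw accelerometer values into a 0..1 control signal.
    At rest the magnitude of (x, y, z) is ~1 g, so the output is 0;
    shaking or tilting the phone pushes it toward 1 -- sound is made
    by moving the phone, never by looking at its screen."""
    magnitude = math.sqrt(x * x + y * y + z * z)
    return max(0.0, min(1.0, abs(magnitude - rest)))
```

A phone lying flat reads roughly (0, 0, 1) and produces silence; a shake that spikes the magnitude to 2 g drives the amplitude to full — the gesture is the interface.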
If anyone out there is interested in the patch, I’d be happy to share.