I have been working with a team of undergraduate students to design an interface for music production and sound synthesis using HP's immersive computing platform, the Sprout. We have been using the depth camera to identify and segment physical objects in the workspace, and then using the downward-facing projector to map animations onto the surfaces of those objects. This project is part of the applied research grant Blended Reality, which is co-funded by HP and Yale ITS.
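The general idea behind the segmentation step looks something like the sketch below. This is not the project's actual code and does not use the Sprout SDK; it assumes a depth frame is available as a NumPy array (a synthetic one is used here) and uses OpenCV to find object regions that a projection-mapping step could then target. All names and threshold values are illustrative.

```python
import numpy as np
import cv2

# Synthetic depth frame (millimeters): a flat desk with one raised object.
depth = np.full((480, 640), 600, dtype=np.uint16)   # desk surface ~600 mm from camera
depth[200:280, 300:400] = 550                        # an object ~50 mm tall

# Anything meaningfully closer to the camera than the desk plane counts as an object.
desk_distance = np.median(depth)
object_mask = (depth < desk_distance - 20).astype(np.uint8) * 255

# Clean up the mask and extract object contours for projection mapping.
object_mask = cv2.morphologyEx(object_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
contours, _ = cv2.findContours(object_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    print(f"object region to project onto: x={x}, y={y}, w={w}, h={h}")
```

In practice the bounding regions (or the full contours) would be warped into the projector's coordinate space so that animations land on the physical objects themselves.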
Documentation and video of my piece for laptop ensemble, performed by Sideband.
Piano improvisation and real-time granular sound-file processing, performed in Princeton's Taplin Auditorium.
I had the opportunity to work with the dancer & choreographer Rebecca Stenn over a two-month period to create this new piece. It is for piano, live electronics, and solo dancer, and it incorporates improvisation in both mediums. It was premiered at the Joyce SoHo as part of the SONiC festival and subsequently performed at the 92nd Street Y.