I have been working with a team of undergraduate students to design an interface for music production and sound synthesis using HP's immersive computing platform, the Sprout. We have been using the depth camera to identify and segment physical objects in the workspace, and then using the downward-facing projector to map animations onto the surfaces of those objects. This project is part of the applied research grant Blended Reality, which is co-funded by HP and Yale ITS.
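As a rough illustration of the segmentation step, here is a minimal sketch of how objects on the work surface might be separated from a single depth frame using OpenCV. The function name, the 10 mm threshold, and the minimum-area filter are illustrative assumptions, not the Sprout SDK or our actual pipeline.

```python
import cv2
import numpy as np

def segment_objects(depth_frame, surface_depth_mm, min_area=500):
    """Segment objects resting on the work surface from one depth frame.

    depth_frame: 2D array of per-pixel depth values in millimeters.
    surface_depth_mm: approximate depth of the empty work surface.
    Returns a list of contours, one per detected object.
    """
    # Pixels noticeably closer to the camera than the surface belong to objects
    # (10 mm margin is an assumed noise tolerance).
    object_mask = (depth_frame < surface_depth_mm - 10).astype(np.uint8) * 255

    # Clean up sensor noise with a morphological opening.
    kernel = np.ones((5, 5), np.uint8)
    object_mask = cv2.morphologyEx(object_mask, cv2.MORPH_OPEN, kernel)

    # Extract each connected region as a contour and drop tiny blobs.
    contours, _ = cv2.findContours(object_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) >= min_area]
```

Each returned contour can then be used to define a projection region, so the downward-facing projector can map an animation onto that object's footprint.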