In the 2019 fall semester, AIR LAB hosted a workshop about creating interactive spaces with projections. For the workshop we made use of the Processing IDE and the lab's ceiling-mounted, floor-facing Microsoft Kinect body tracker and projector setup. After an introduction to the hardware, participants were provided with example code that they could use as templates for building their own interactive experiences. At the end of the workshop, participants presented their interactive projections, each using their own tracked body position as a sensor input. For instance, one participant made a game where the user had to dodge a ball moving about in the frame. This sketch was then projected onto the floor, and users would have to run around to keep the ball from hitting their body. See a screenshot from this sketch below.
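The core of such a dodge-ball sketch is simple: advance a ball each frame, bounce it off the edges of the projected frame, and check whether it overlaps the tracked body position. The sketch below is a hypothetical illustration of that logic in plain Java, not the actual workshop code; in a real Processing sketch the update and collision check would live in `draw()`, with the body position coming from the Kinect tracker rather than being passed in.

```java
// Hypothetical sketch of the dodge-ball logic, not the workshop template.
// In Processing, update() and hits() would be called from draw(), and
// (bx, by) would come from the Kinect body tracker.
public class DodgeBall {
    float x, y;          // ball position
    float vx, vy;        // ball velocity per frame
    final float radius;  // ball radius
    final float w, h;    // size of the projected frame

    DodgeBall(float x, float y, float vx, float vy,
              float radius, float w, float h) {
        this.x = x; this.y = y; this.vx = vx; this.vy = vy;
        this.radius = radius; this.w = w; this.h = h;
    }

    // Advance the ball one frame, bouncing off the frame edges.
    void update() {
        x += vx;
        y += vy;
        if (x - radius < 0 || x + radius > w) vx = -vx;
        if (y - radius < 0 || y + radius > h) vy = -vy;
    }

    // True if the tracked body, approximated as a circle of
    // bodyRadius centered at (bx, by), touches the ball.
    boolean hits(float bx, float by, float bodyRadius) {
        float dx = x - bx, dy = y - by;
        float reach = radius + bodyRadius;
        return dx * dx + dy * dy < reach * reach;
    }

    public static void main(String[] args) {
        DodgeBall ball = new DodgeBall(100, 100, 5, 3, 20, 640, 480);
        ball.update();                                // ball moves to (105, 103)
        System.out.println(ball.hits(110, 100, 30));  // true: circles overlap
        System.out.println(ball.hits(500, 400, 30));  // false: far away
    }
}
```

Approximating the tracked body as a circle keeps the hit test to a single distance comparison per frame, which is plenty fast for a projection-sized playfield.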

If you missed the workshop, or want to revisit the code and documentation, you can find all the workshop material on our GitHub page.