Human Movement Installation

Using motion capture technology to create space-altering forms

Exploring the intersection of art and technology, I contributed to the design process overview and the initial installation concept rendering. To create visual representations of human movement, I learned to operate the Axis Neuron software and Perception Neuron motion capture technology, and provided input on how the modeled forms could be implemented in a physical space. The project was completed during a 24-hour intensive.

Award recipient at the 2017 Hacking Arts @ MIT event, held at the MIT Media Lab.

A project in collaboration with Eli Silver (RISD 2021), Jen Der (Northeastern 2020), Raymond Huang (Northeastern 2020), and Hal Triedman (Brown University 2020). Our group of students, with backgrounds in furniture design, engineering, computer science, and media arts, chose to focus on the action of a cartwheel in order to explore forms.

Example of a cartwheel performed while wearing the Perception Neuron motion capture suit.

The same cartwheel realized digitally in the Axis Neuron software.


Rendering done by Eli Silver of our bench design derived from the captured cartwheel data points.

We were inspired by the work of Vladislav Petrovskiy, Geoffrey Mann, and Shinichi Maruyama, who use various mediums to capture bodies in motion. Struck by the lack of tools available to generate such sculptures procedurally, we sought to create our own tools to generate art. We wanted to use a familiar form, the human body, to make unfamiliar forms that still feel organic, human, and active, elevating typically mundane activities to artistic value and physically altering a space to evoke intrigue. Our response to the 2017 MIT Hacking Arts event theme "Why Human" explores the collaboration of technology and the human form to alter the process of artistic creation, using the human body as our drawing utensil and our tangible world as our medium.

Our scripts were written in Python with Rhino's Python scripting API, processing .csv files of Perception Neuron capture data exported from Axis Neuron. Our process translated the motion capture data points into XYZ coordinates, which we could then manipulate in a 3-D modeling software.
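A minimal sketch of that translation step is below, written for Rhino's Python editor. The file name, the assumption of a header row, and the consecutive x, y, z column layout are illustrative guesses rather than Axis Neuron's documented export format, and the straight extrusion at the end stands in for the "simple extrusion tools" we used to refine the curves.

```python
# A minimal sketch of our CSV-to-Rhino pipeline (run inside Rhino's Python
# editor). The file name, header handling, and column layout are assumptions
# for illustration; Axis Neuron's actual export may differ.
import csv
import rhinoscriptsyntax as rs

CSV_PATH = "cartwheel_capture.csv"  # hypothetical Axis Neuron export

def load_frames(path):
    """Read each frame as a list of (x, y, z) joint positions."""
    frames = []
    with open(path) as f:
        reader = csv.reader(f)
        next(reader)  # skip a header row, if the export includes one
        for row in reader:
            values = [float(v) for v in row]
            # assume consecutive x, y, z columns for each joint
            joints = [tuple(values[i:i + 3]) for i in range(0, len(values), 3)]
            frames.append(joints)
    return frames

def draw_motion(frames):
    """Plot raw data points and trace each joint's path through the motion."""
    if not frames:
        return
    for joint_index in range(len(frames[0])):
        path = [frame[joint_index] for frame in frames]
        rs.AddPoints(path)  # the raw point cloud, as in the first import
        if len(path) > 1:
            curve = rs.AddInterpCurve(path)
            # a simple straight extrusion turns the path into a surface
            rs.ExtrudeCurveStraight(curve, (0, 0, 0), (0, 0, 10))

draw_motion(load_frames(CSV_PATH))
```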


Cartwheel data points from Axis Neuron motion capture software imported into 3-D modeling software Rhinoceros using a Python script.

Refined cartwheel data points using simple extrusion tools.

Quick concept sketches for future movement-based installations.

Partial cross section of the cartwheel data.