Membrane

Membrane mimics an intelligent, flexible, sensitive surface that changes its properties if stretched.

It is a thin, skin-like latex film backlit by a projector, and its appearance changes based on the tension it feels when pushed inwards.

PROCESS

Concept and Design

Membrane possesses a tactile quality akin to skin. It serves as a bridge between humans and machines, mimicking the behavior and sensation of an organism. When pushed, it responds with an interaction that feels alive and organic. My goal was to prompt people to recognize how natural future machines could feel.

The setup uses a Kinect sensor to obtain a depth map, which functions as an alpha mask over a visual, whether an image or a video. This makes the visual appear to emerge from nothingness. I built the prototype by positioning a projector behind the fabric and meticulously aligning it.
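The depth-to-alpha idea can be sketched outside TouchDesigner in a few lines of NumPy. The near/far distances, frame sizes, and synthetic frames below are illustrative assumptions, not the values or data used in the installation.

```python
import numpy as np

def depth_to_alpha(depth_m, near=0.9, far=1.2):
    """Map depth (meters) to a 0..1 alpha mask: pixels pushed closer than
    `far` start to fade in and become fully opaque at `near`."""
    alpha = (far - depth_m) / (far - near)
    return np.clip(alpha, 0.0, 1.0)

def composite(visual, background, alpha):
    """Blend the visual over the background using the depth-derived alpha."""
    a = alpha[..., None]                      # broadcast to the color channels
    return a * visual + (1.0 - a) * background

# Synthetic stand-ins for a Kinect depth frame and a designed visual.
depth = np.full((480, 640), 1.5)              # everything far away: mask stays empty
depth[200:280, 300:380] = 1.0                 # a hand pressing the membrane inwards
visual = np.random.rand(480, 640, 3)          # the image or video frame to reveal
frame = composite(visual, np.zeros_like(visual), depth_to_alpha(depth))
```

In TouchDesigner terms, this corresponds roughly to using the Kinect depth output as the alpha of a composite over black, so only the pushed-in region reveals the visual.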

Interactive Installation

Membrane 2, Boston Cyberarts Gallery, 2024

TouchDesigner

TouchDesigner patch for Membrane, 2024

Isadora Patch

Isadora patch for Membrane, 2024

Prototype 1

This project started as an elective class project at MIT. I took Interaction Design and Projection for Live Performance in Fall 2023 with Joshua Higgason, a technical instructor in the MIT Music and Theater Arts Department.

In the class, I learned to use Isadora, a flow-based interaction design software, which I used for the project. I also used TouchDesigner, a visual programming environment for creating real-time interactive multimedia content.

I took the depth map from the Kinect sensor and overlaid it with the visual I designed in TouchDesigner. I then calibrated the values in Isadora and defined a specific range in space, so the piece activated only when someone entered that range.
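The activation logic can be sketched as a simple range gate. The sketch below is plain Python rather than Isadora actors, and the threshold values are assumptions standing in for the calibrated numbers.

```python
def in_activation_range(nearest_depth_m, min_m=0.8, max_m=1.3):
    """True when the closest tracked point lies inside the interaction zone."""
    return min_m <= nearest_depth_m <= max_m

# Example readings in meters: far away, inside the zone, too close.
for reading in (2.5, 1.1, 0.4):
    print(reading, "->", "active" if in_activation_range(reading) else "idle")
```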

Install Process, Membrane, 2024

Prototype 1, Membrane, 2024

Site and Install Process

The Stata Center hosted the final installation of Membrane 1 as part of a group exhibition featuring other MIT students’ works.

I borrowed a Kinect sensor from the DMI resources and used its built-in depth sensor to get a depth map. The Kinect’s camera captures a real-time three-dimensional map of its surroundings, so it knows when something is closer to or farther from it in space. I used this capability to track people’s distance from the sensor and found that there needed to be around 4 feet between the subject and the sensor. After calibrating the sensor, I placed and calibrated the projector as well, so that the sensor and the projector captured and projected an image of the same size.
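One common way to express that sensor-to-projector alignment is a four-point homography that maps depth-image pixels into projector space. The sketch below, using OpenCV, is an illustrative approach rather than the calibration actually performed in Isadora, and the point correspondences and resolutions are hypothetical.

```python
import numpy as np
import cv2

# Hypothetical correspondences: where four known depth-image positions
# should land in the projector's 1920x1080 output.
depth_pts = np.float32([[60, 40], [580, 45], [590, 430], [55, 425]])
proj_pts  = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])

H = cv2.getPerspectiveTransform(depth_pts, proj_pts)

def mask_to_projector(mask):
    """Warp a 640x480 depth-derived mask into the projector frame."""
    return cv2.warpPerspective(mask, H, (1920, 1080))
```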

Once this was successful, I tested it using flexible fabric. After a few attempts and calibration, I was able to make it work.

Prototype 2

The second prototype needed better fabrication and a more refined visual response when the skin was touched. I planned to use a skin-like latex sheet for this purpose and sourced latex sheeting from Canal Rubber, a trusted rubber supplier in New York City.

I learned a great deal about cutting, gluing, and fabricating latex from YouTube. I installed the second prototype at the Fresh Media show in 2024. It resembled skin and worked precisely as intended. I received constructive feedback, which encouraged me to dig deeper into designing more interactive, engaging machines.

Setup, Prototype 2, Membrane, 2024

Preparing Latex, Prototype 2, Membrane, 2024

Project Documentation

Interactive Installation