Exploring how we influence and interact with our shared workspace. Through this exercise, we isolate and analyze the dynamic relationship between the design (our work), the tool (the computer), and the environment (CodeLab).
We share a common workspace: CodeLab. This dynamic environment fosters collaboration, creativity, and individuality, but it lacks strict boundaries. While this openness encourages freedom, it can also lead to noise and distraction, making it difficult to concentrate on individual tasks.
To address this shared experience, we conceptualized a gesture-controlled interactive design that embodies our behaviors and interactions within CodeLab. Using two simple hand gestures—an open palm and a closed fist—we aim to let users control the intensity of the ambient noise in their environment and to see that space, and its noise, visualized as a point cloud.
This design serves as a metaphor for the balance between engagement and isolation in collaborative spaces.
CodeLab and visible interactions in the free space
The hand as the first point of contact.
Any part of the body or physical gesture could be used to represent or provoke interaction, but we focused on the "hand" and its gestures. The hand is both an extension of our feelings and the primary tool in creative processes. This makes it an ideal medium for exploring hybrid design strategies that bridge the physical and virtual.
Technical Implementation: We used TouchDesigner to create an interactive demonstration. The process involved the following steps:
Scanning the Environment:
Using the LiDAR scanners on our smartphones, we captured a 3D point cloud of CodeLab to represent the physical space virtually.
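As a rough sketch of how such a scan might be prepared before it enters TouchDesigner (the library choice, file names, and cleanup values here are our own assumptions, not a record of the actual pipeline), the following loads the exported scan, thins it out, and saves a lighter copy:

```python
# Sketch: preparing a smartphone LiDAR scan for TouchDesigner.
# Assumes the scan was exported as a .ply file ("codelab_scan.ply" is a
# placeholder name) and that the open-source Open3D library is installed.
import open3d as o3d

# Load the raw point cloud exported from the phone's scanning app.
cloud = o3d.io.read_point_cloud("codelab_scan.ply")

# Downsample with a voxel grid so the cloud stays responsive in real time.
cloud = cloud.voxel_down_sample(voxel_size=0.02)  # roughly a 2 cm grid

# Drop isolated stray points picked up by the scanner.
cloud, _ = cloud.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

# Save a lighter version that a point-cloud reader inside TouchDesigner
# (for example a Point File In TOP) can load comfortably.
o3d.io.write_point_cloud("codelab_scan_light.ply", cloud)
print(f"Kept {len(cloud.points)} points")
```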
Recording Ambient Noise:
To represent a busy day at CodeLab—when the space is most active with diverse activities such as conversations, individual work, phone calls, and project assembly—we used royalty-free sounds that emulate that experience.
Visualizing the Space:
The point cloud serves as a visual abstraction of our interaction with the space, allowing users to perceive the environment both spatially and conceptually. Manipulating it with noise heightens the sense of randomness and overload, echoing how we might feel when overwhelmed or overworked.
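As a stand-in for the actual TouchDesigner network, the short NumPy sketch below illustrates the underlying idea: every point is pushed along a random offset, and a single noise_amount value (the quantity the gestures later drive) sets how far the cloud drifts from the calm scan.

```python
# Sketch of the noise-driven point cloud, using NumPy instead of the real
# TouchDesigner network: each point is displaced by a random offset whose
# strength is set by one control value.
import numpy as np

def displace_points(points: np.ndarray, noise_amount: float,
                    seed: int = 0) -> np.ndarray:
    """Return a jittered copy of an (N, 3) array of point positions.

    noise_amount is 0.0 for the calm, untouched scan and grows as the
    space (or the viewer) gets noisier.
    """
    rng = np.random.default_rng(seed)
    offsets = rng.normal(scale=1.0, size=points.shape)
    return points + noise_amount * offsets

# Example: a fully "noisy" state scatters points by a few centimetres.
cloud = np.random.rand(10_000, 3)          # placeholder for the real scan
noisy_cloud = displace_points(cloud, noise_amount=0.03)
```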
Integrating Audio:
The ambient noise was incorporated into the visualization, enabling simultaneous auditory and visual analysis of the space. This mirrors the natural way humans perceive their environments.
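Inside TouchDesigner, this mapping can be scripted in Python. The sketch below is one hedged possibility, assuming the ambient track passes through a Level CHOP named 'level1' and a gesture channel between 0 (closed fist) and 1 (open palm) feeds a CHOP Execute DAT; the operator names and the 0-to-1 convention are placeholders, not the exact network we built.

```python
# Sketch of a CHOP Execute DAT callback in TouchDesigner. Assumes the
# ambient recording runs through a Level CHOP named 'level1' and that the
# watched channel carries a gesture value (0 = closed fist, 1 = open palm).
# Both names and the value convention are our own placeholders.

MUFFLED_GAIN = 0.15   # how quiet the room gets behind a closed fist
OPEN_GAIN = 1.0       # unaltered loudness for an open palm

def onValueChange(channel, sampleIndex, val, prev):
    # Blend between the muffled and open loudness as the gesture value moves.
    gain = MUFFLED_GAIN + (OPEN_GAIN - MUFFLED_GAIN) * val
    op('level1').par.gain = gain
    return
```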
Designing Gesture Control:
The MediaPipe hand-tracking integration in TouchDesigner lets us read and capture the hand input, so the program can react to two distinct gestures (a simple classification sketch follows after the two gestures below).
Closed Fist:
Represents a desire for focus, muffling the ambient noise and visually modifying the point cloud to reflect this interaction.
Open Palm:
Represents openness to chaos, letting users experience the true, unaltered dynamics of the environment.
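For readers curious how the two gestures can be told apart, the stand-alone Python sketch below applies one simple heuristic to MediaPipe's 21 hand landmarks: the hand counts as a closed fist when every fingertip sits closer to the wrist than its knuckle does. This is our own illustration of the idea, not the internal logic of the MediaPipe integration we used.

```python
# Sketch: distinguishing an open palm from a closed fist using MediaPipe's
# 21-landmark hand model. A landmark here is an (x, y, z) tuple in the
# normalized coordinates MediaPipe reports; the heuristic is our own
# illustration, not the integration's internal logic.
from typing import Sequence, Tuple

Landmark = Tuple[float, float, float]

WRIST = 0
FINGERTIPS = (8, 12, 16, 20)   # index, middle, ring, pinky tips
KNUCKLES = (5, 9, 13, 17)      # matching MCP joints

def _dist(a: Landmark, b: Landmark) -> float:
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

def is_closed_fist(landmarks: Sequence[Landmark]) -> bool:
    """A finger counts as curled when its tip is nearer the wrist than its
    knuckle; the hand is a fist when all four fingers are curled."""
    wrist = landmarks[WRIST]
    curled = sum(
        _dist(landmarks[tip], wrist) < _dist(landmarks[knuckle], wrist)
        for tip, knuckle in zip(FINGERTIPS, KNUCKLES)
    )
    return curled == 4

def gesture_to_control(landmarks: Sequence[Landmark]) -> float:
    """Map the gesture to the control value used elsewhere:
    0.0 = closed fist (focus, muffled), 1.0 = open palm (full ambience)."""
    return 0.0 if is_closed_fist(landmarks) else 1.0
```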