This interactive installation is part of our ongoing studio research into creative collaboration between humans and AI. It invites participants to shape generative outputs live through drawing and movement.
Using an iPad for sketching or skeletal tracking via Kinect, users can influence image generation in real time. Inputs are processed in TouchDesigner using DotSimulate's StreamDiffusion component, combining hand-drawn lines with text prompts to guide the AI. Adjustable parameters let each participant control how closely the system follows their gestures, and how much creative variation the AI introduces.
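The adherence control can be pictured as a single blend parameter between the participant's input and the model's free generation. The sketch below is a toy illustration of that idea only; the function name and the linear blend are our own assumptions, not the actual StreamDiffusion or TouchDesigner implementation.

```python
import numpy as np

def blend(sketch: np.ndarray, generated: np.ndarray, adherence: float) -> np.ndarray:
    """Toy adherence control: 1.0 follows the sketch exactly,
    0.0 gives the generative model free rein."""
    return adherence * sketch + (1.0 - adherence) * generated

sketch = np.array([0.0, 1.0])      # stand-in for the participant's drawing
generated = np.array([1.0, 0.0])   # stand-in for the model's own output

print(blend(sketch, generated, 1.0))  # → [0. 1.] (pure sketch)
print(blend(sketch, generated, 0.5))  # → [0.5 0.5] (equal mix)
```

In the installation this trade-off is exposed as adjustable parameters rather than a single slider, but the underlying negotiation between user intention and model variation is the same.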
The results are displayed on VisualX's modular ultra-high-definition iPoster system, highlighting the subtle negotiation between user intention and algorithmic interpretation. We're interested in how these tools can enhance creative agency, and where the line between human and machine might be redrawn.
Client
Internal research project in collaboration with VisualX
Venue
VisualX
Team
Project Lead
Mariele Goeldner (Lightscape)
Producer
Brian Kenny (Lightscape)
AV Technicians
Maurice Veale
Jack Dempsey
Peter Dempsey