Audio Visual Experience

Blooming Symphonies - Natural Soundscape

Artist Statement
Blooming Symphonies is an interactive installation that explores the intersection between technology and nature, inviting participants to engage with the natural world through a digital medium. Through the integration of artificial intelligence and sensory technology, this work creates a responsive environment where touch and sound converge to form a dialogue between human, machine, and nature. In an era where our relationship with the natural world is increasingly mediated through screens and digital interfaces, Blooming Symphonies invites us to consider how technology might not merely simulate nature, but create new pathways for experiencing its patterns and rhythms.
Tools
TouchDesigner, StreamDiffusion by DotSimulate, Playtron

March 2023

Overview

This interactive installation explores the relationship between natural materials and digital environments through touch-based and proximity-triggered interactions. Initially designed as a walkable garden in which ultrasonic sensors activated soundscape prompts, the project evolved into a more intimate experience using real flowers and the Playtron MIDI device. By combining real-time visuals (via StreamDiffusion) with reactive soundscapes, the piece creates a multisensory dialogue between the user and nature, blurring the line between the organic and the algorithmic.

The Process

Connecting the physical and digital

My project began with the idea of using ultrasonic distance sensors to create an interactive garden experience. Each sensor was tied to a specific prompt, triggered by the proximity of a user walking near physical flower installations. As users navigated the space, different soundscape scenes were activated based on how close they were to each flower. This setup encouraged movement and exploration, using space as the primary mode of interaction.
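The proximity logic above can be sketched in a few lines of Python. The 120 cm activation threshold and the "closest flower wins" rule are illustrative assumptions, not the installation's actual tuning; in practice the distances would come from the Arduino's ultrasonic sensors over serial.

```python
def select_scene(distances_cm, threshold_cm=120):
    """Pick the scene of the closest flower within range.

    distances_cm: one ultrasonic reading per flower, in centimetres
    (index = flower/scene number). Returns the index of the nearest
    flower inside the threshold, or None if nobody is close enough.
    """
    in_range = [(d, i) for i, d in enumerate(distances_cm) if d <= threshold_cm]
    if not in_range:
        return None  # no visitor near any flower; keep the ambient state
    return min(in_range)[1]  # closest flower wins
```

Each loop of the control script would read fresh distances, call `select_scene`, and only switch soundscapes when the returned index changes, avoiding rapid flickering between scenes.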

Visual Prompting

To deepen the sensory experience, I integrated StreamDiffusion into the system. When the distance sensors triggered a prompt, it activated a soundscape and generated a live AI image tied to that environment. Additionally, the Noise CHOP in TouchDesigner was modulated according to the selected soundscape, which influenced the motion and aesthetic of the visuals, adding a layer of organic variation in real time.
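A minimal Python sketch of that scene switching, with hypothetical scene names, prompts, and Noise CHOP values (the installation's real settings aren't documented here). Inside TouchDesigner the noise values would be assigned to the CHOP's parameters, e.g. `op('noise1').par.amp`, rather than a plain dictionary.

```python
# Illustrative scene table: prompt text for StreamDiffusion plus
# Noise CHOP settings that shape the motion of the visuals.
SCENES = {
    "meadow": {"prompt": "sunlit wildflower meadow, soft focus",
               "noise": {"amp": 0.4, "period": 2.0}},
    "forest": {"prompt": "misty forest floor with ferns",
               "noise": {"amp": 0.8, "period": 0.5}},
}

def apply_scene(name, stream, noise_params):
    """Push a scene's prompt to StreamDiffusion and retune the noise.

    stream: any object with a set_prompt() method (a stand-in for the
    StreamDiffusion component). noise_params: a mutable mapping standing
    in for the Noise CHOP's parameters. Returns the applied prompt.
    """
    scene = SCENES[name]
    stream.set_prompt(scene["prompt"])
    for par, value in scene["noise"].items():
        noise_params[par] = value  # in TD: setattr(op('noise1').par, par, value)
    return scene["prompt"]
```

Keeping prompt and noise settings in one table means adding a new flower scene is a one-line change rather than edits scattered across operators.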

Rethinking the Interaction

As the project developed, I reconsidered the physical interaction design. I wanted the experience to feel more intimate and sensory, so I moved away from a walkable installation and instead made the flowers tactile interfaces. This shift allowed for a closer connection between users and the natural elements, turning the experience from one of distant observation to direct interaction.

Using Real Flowers for Interaction

To achieve this, I incorporated a Playtron MIDI device, which allowed conductive materials (like real flowers) to act as input triggers. Touching a flower would now activate both the soundscape and the visual prompt in real time. Each flower corresponded to a specific scene, and the act of touching became the new interface. This final interaction model created a more personal and embodied experience.
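The touch mapping can be sketched as a small MIDI note handler. The note numbers and flower names below are placeholders, since the actual assignments depend on how the Playtron's pads were configured; the handler only reacts to note-on messages, so releasing a flower doesn't retrigger a scene.

```python
# Hypothetical mapping from Playtron MIDI notes to flower scenes.
FLOWER_NOTES = {60: "rose", 62: "lily", 64: "sunflower"}

def on_midi_note(status, note, velocity):
    """Map an incoming Playtron message to a flower scene name.

    Returns the scene for a genuine note-on (velocity > 0), and None
    for note-offs, zero-velocity note-ons, or unmapped notes.
    """
    NOTE_ON = 0x90  # note-on status nibble on any channel
    if (status & 0xF0) == NOTE_ON and velocity > 0:
        return FLOWER_NOTES.get(note)
    return None
```

The returned scene name would then drive the same scene-switching code as the earlier sensor version, which is what made swapping proximity for touch a contained change.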

Visuals

Learnings

This project helped me discover my creative voice and taught me how to manage a project from concept to installation. I explored integrating multiple sensory inputs, including ultrasonic sensors with Arduino and the Playtron MIDI for touch-based interactions.

While I faced roadblocks, such as shifting from a walkable installation to a touch-based one, I learned to adapt without losing the core vision. The experience reinforced my interest in blending physical materials with real-time digital feedback, and deepened my confidence in working across hardware and software systems.