abstract
exhibition

The project MediaFlies implements an interactive multi-agent system that incorporates flocking and synchronization in order to generate a constantly changing visual and acoustic output. It relies on prerecorded or live video and audio material, which is fragmented and recombined through the agents’ activities. Video fragments stem from a continuously updated ring buffer of video frames. Audio material is likewise stored in a ring buffer and reorganized via granular synthesis. Agents engage in synchronization by adapting texture and grain properties in an attempt to recreate the original media material. The success of this synchronization depends on the flock’s coherence and velocity. Interaction is based on video tracking: users can influence the flock’s behavior by attracting or dispersing agents, thereby affecting the balance between disturbance and recognizability of the system’s audio and video feedback. The project draws its inspiration from the biological phenomena of flocking and synchronization; simulations of these phenomena form the basis for the generative behavior of MediaFlies.
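The abstract does not specify how flock coherence is mapped onto the fragmentation of the media material, so the following is only a minimal Python sketch of the general idea, assuming that a coherent flock reads recent, temporally ordered buffer material (output close to the original) while an incoherent flock scatters its reads across the buffer. All names (RingBuffer, Agent, coherence, recombine) and the particular coherence measure are illustrative assumptions, not taken from the project.

import random

class RingBuffer:
    """Fixed-size buffer that is continuously overwritten with new material."""
    def __init__(self, size):
        self.size = size
        self.data = [None] * size
        self.head = 0

    def push(self, item):
        self.data[self.head] = item
        self.head = (self.head + 1) % self.size

    def sample(self, offset):
        # Read an item 'offset' steps behind the most recently written one.
        return self.data[(self.head - 1 - offset) % self.size]

class Agent:
    def __init__(self, vel):
        self.vel = vel  # velocities would normally come from flocking rules

def coherence(agents):
    """Crude alignment measure: approaches 1.0 when all velocities agree."""
    mean = sum(a.vel for a in agents) / len(agents)
    spread = sum(abs(a.vel - mean) for a in agents) / len(agents)
    return 1.0 / (1.0 + spread)

def recombine(agents, buffer):
    """Each agent picks one fragment; the less coherent the flock,
    the further back in time its reads may scatter."""
    c = coherence(agents)
    max_offset = max(int((1.0 - c) * (buffer.size - 1)), 0)
    return [buffer.sample(random.randint(0, max_offset)) for _ in agents]

# Demo: fill the buffer with dummy "frames" and let a flock sample once.
buf = RingBuffer(64)
for i in range(100):
    buf.push(f"frame-{i}")
flock = [Agent(random.uniform(-1.0, 1.0)) for _ in range(10)]
print(recombine(flock, buf))

The same scheme would apply to the audio side, with the sampled items acting as grains for granular synthesis and grain properties (e.g. duration, playback position) tied to the coherence value in the same hypothetical way.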

This project has been realized as a collaboration:

Daniel Bisig: Swarm Simulation and Visuals
Tatsuo Unemi: Technical and Conceptual Support