This project aims to provide tools for developing interactions with a collection of prerecorded videos and reactive sounds. These tools support the detection of similar frames across a collection of videos, the detection of visitor behavior in a given setup, and the use of that behavioral information to trigger or govern video navigation and audio interaction. The project results from an ongoing collaboration with the media art project BorderLands by Christian Graupner.
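
As an illustration of the frame-similarity idea (not the project's actual pipeline, whose method is not described here), a minimal sketch could compare sampled frames from two videos via HSV color-histogram correlation with OpenCV. The file names, sampling step, and threshold below are hypothetical placeholders.

```python
# Illustrative sketch: finding visually similar frames between two videos
# by comparing HSV color histograms with OpenCV (an assumed approach).
import cv2


def frame_histogram(frame):
    """Compute a normalized 2D hue/saturation histogram for one frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()


def sample_histograms(video_path, step=25):
    """Return (frame_index, histogram) pairs, sampling every `step` frames."""
    cap = cv2.VideoCapture(video_path)
    samples, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            samples.append((index, frame_histogram(frame)))
        index += 1
    cap.release()
    return samples


def similar_frame_pairs(path_a, path_b, threshold=0.9):
    """List frame-index pairs whose histograms correlate above `threshold`."""
    hists_a = sample_histograms(path_a)
    hists_b = sample_histograms(path_b)
    pairs = []
    for i, ha in hists_a:
        for j, hb in hists_b:
            if cv2.compareHist(ha, hb, cv2.HISTCMP_CORREL) > threshold:
                pairs.append((i, j))
    return pairs


# Hypothetical usage; the video files are placeholders, not project assets.
# print(similar_frame_pairs("clip_a.mp4", "clip_b.mp4"))
```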