HALO is a large-scale immersive artwork which embodies Semiconductor's ongoing fascination with how we experience the materiality of nature through the lens of science and technology. Taking the form of a large cylinder (10m x 8m x 4.2m), the structure houses a 360-degree projection of the ATLAS detector's raw data, while an array of 384 vertical wires is played by the same data to produce the sound. The installation invites the viewer into its centre in order to inhabit the results of particle collisions produced by experiments taking place at CERN, in Geneva, Switzerland.
The physics performed at the ATLAS detector probes and enhances our current understanding of the building blocks of matter and their interactions, contributing to new theories that better describe our universe. As part of the artist-in-residence programme, Semiconductor were the first to receive permission to work directly with raw data generated by the experiment. By using this data, the artist duo seek to convey the signature of the technology, the mark of the architecture of the experiment, the presence of man's voice. They confront the viewer with the data before it has been processed for scientific consumption.
Each collision in ATLAS occurs at close to the speed of light. Semiconductor have re-animated 60 of these, slowing them down immeasurably to reveal time in the ordinarily static data. In doing so, the public are given space to analyse the mass of data: they naturally look for and find patterns in it, and gain a sense of the immense task facing the scientists in capturing, reading and processing the data.
HALO is a self-supporting elliptical steel structure measuring 10 x 8 metres in diameter and 4.2 metres high. The structure consists of two freestanding halves, which visitors can enter at either end. Each half is made up of 8 standalone modules, each containing 24 strings, 16 solenoids to trigger the strings, and magnetic pick-ups. The signal captured by each pick-up is resonated via a custom amplification circuit and played back through sound-box-mounted speakers. Each module contains 8 dual digital potentiometers, 4 preamps, 2 stereo power amps and 4 speakers, with custom circuit boards controlling each element of the circuit. The structure provides cable management for the networking and power requirements of all digital and analogue components. The 360-degree projection is streamed via 6x BrightSign HD223 networked players, synchronised via BrightSign's master/slave architecture over Ethernet, driving 6x HD video projectors at 1080p each.
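The module counts above imply the totals for the whole structure; as a quick sanity check, the arithmetic can be sketched like this (a Python illustration with our own variable names, not part of the installation's software):

```python
# Tallies implied by the module specification: 2 halves, 8 modules per half.
HALVES = 2
MODULES_PER_HALF = 8
STRINGS_PER_MODULE = 24
SOLENOIDS_PER_MODULE = 16
SPEAKERS_PER_MODULE = 4

modules = HALVES * MODULES_PER_HALF            # 16 modules in total
strings = modules * STRINGS_PER_MODULE         # 384 strings, matching the array of vertical wires
solenoids = modules * SOLENOIDS_PER_MODULE     # 256 solenoid triggers
speakers = modules * SPEAKERS_PER_MODULE       # 64 speakers

print(modules, strings, solenoids, speakers)
```

Note that the 384 strings recovered here agree with the 384 vertical wires described for the cylinder as a whole.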
Audio sequence timing and MIDI playback and transmission are handled in Ableton. The audio data comes from a MIDI note sequence, generated from the video and played back at 120 bpm to keep it in sync with the video. The sequence contains 16 separate MIDI channels, each carrying 16 MIDI notes, filtered by a custom Max for Live patch which removes MIDI note-off data for tightly clustered, short-length notes. The solenoids and the custom resonator circuitry and hardware are controlled via custom software running on the Arduino platform.
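The note-off filtering described here can be sketched in plain Python. This is an illustration of the idea only, assuming the patch works on time-ordered note events; the function name, event format and 50 ms threshold are our own assumptions, not Semiconductor's actual Max for Live code:

```python
def filter_short_note_offs(events, min_length=0.05):
    """Drop note-off events for notes shorter than min_length seconds,
    so tightly clustered short notes keep ringing instead of being cut off.
    events: time-ordered list of (time_sec, channel, note, is_on) tuples."""
    on_times = {}   # (channel, note) -> time the note was switched on
    kept = []
    for t, ch, note, is_on in events:
        if is_on:
            on_times[(ch, note)] = t
            kept.append((t, ch, note, is_on))
        else:
            started = on_times.pop((ch, note), None)
            if started is None or t - started >= min_length:
                kept.append((t, ch, note, is_on))
            # else: the note was too short -> discard its note-off
    return kept

events = [(0.00, 1, 60, True), (0.02, 1, 60, False),   # 20 ms note: off is filtered out
          (0.10, 1, 62, True), (0.30, 1, 62, False)]   # 200 ms note: off is kept
print(filter_short_note_offs(events))
```

Running this keeps both note-ons but only the second note-off, which is the behaviour attributed to the patch: short, densely packed notes lose their note-offs.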
The control software, created using Processing, handles initialisation of each channel and all corresponding modules while monitoring communication to, and feedback from, the installation. It also triggers and synchronises all audio and video via the solenoids, resonators and 6x BrightSign media players. Custom Arduino code on each module receives and actions the data for solenoids striking strings, and controls the resonators based on velocity data. The Arduino code also provides feedback to the central sketch, reporting that each module is online, that its strings, pick-ups and resonators are working, and any monitoring data, e.g. temperature.
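The module feedback loop might look something like the following, sketched in Python rather than Processing/Arduino. The status message format, field names and values are entirely our assumptions, included only to show the shape of per-module feedback the central sketch could display:

```python
def parse_status(line):
    """Parse a hypothetical module status line such as
    'module:3 strings:ok pickups:ok resonators:ok temp:41.5'
    into a dict a central monitoring sketch could render."""
    fields = dict(part.split(":", 1) for part in line.split())
    return {
        "module": int(fields["module"]),
        "strings": fields["strings"] == "ok",
        "pickups": fields["pickups"] == "ok",
        "resonators": fields["resonators"] == "ok",
        "temp_c": float(fields["temp"]),     # monitoring data, e.g. temperature
    }

status = parse_status("module:3 strings:ok pickups:ok resonators:ok temp:41.5")
print(status)
```

Each of the 16 modules would report a line like this, letting the control software flag any module whose strings, pick-ups or resonators stop responding.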