Created by Niklas Roy and located at the Tschumi Pavilion in Groningen, the Netherlands, the installation contains 1000 black sponge balls, which are sucked through 150 m of transparent pneumatic tubes at high speed. Visitors are invited to operate the machine via a touch sensor mounted on the pavilion's front glass: they can change the direction of the airflow and watch the balls speed up, slow down and reverse.
Inspired by particle accelerators such as CERN's well-known "Large Hadron Collider" (LHC), the installation attempts to uncover the incomprehensible nature of these gigantic machines. Some 27 km in length, the LHC is designed for observation, yet there is not much to see: the particles are, of course, far too tiny for the naked eye. When Niklas was approached to construct a machine inside the pavilion, he wanted to create something the general public could enjoy – a particle accelerator for the masses.
The movement of the particles (42 mm diameter) from one bubble to another is created by a difference in air pressure. When an ordinary vacuum cleaner sucks air out of one bubble, it lowers the air pressure inside that bubble, which is immediately equalised by incoming air from the other bubble. This creates an airflow between the bubbles, which entrains the particles. Passers-by switch the machine on using a touch sensor installed on the pavilion's front glass. Once the accelerator is running, the airflow can be reversed by touching the sensor again. When the balls race through the pipes, it is almost impossible to follow them. But when you reverse the direction of the airflow, hundreds of balls slow down all at the same time, only to speed up in the other direction a moment later.
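The touch-sensor behaviour described above – one touch starts the machine, every further touch reverses the airflow – can be sketched as a tiny state machine. This is a minimal illustration in plain C; the state names and function are assumptions, not the actual ATmega8 firmware.

```c
#include <assert.h>

/* Hypothetical accelerator states (assumptions, not the real firmware):
 * the machine is off, or blowing the balls in one of two directions. */
typedef enum { ACC_OFF, ACC_FORWARD, ACC_REVERSE } airflow_t;

/* One touch switches the machine on; each further touch reverses the airflow. */
airflow_t on_touch(airflow_t state) {
    if (state == ACC_OFF)     return ACC_FORWARD;  /* first touch: start */
    if (state == ACC_FORWARD) return ACC_REVERSE;  /* reverse the flow   */
    return ACC_FORWARD;                            /* and back again     */
}
```

On the real installation this toggle would ultimately switch the valve or fan direction rather than return an enum value.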
The installation runs on an Arduino Ethernet, while the sensor part runs separately on an ATmega8 programmed in plain C. The installation also includes internet access and data logging, so its operation can be checked from a remote location. Although quite cryptic and aimed at those who understand it, you can find it here, and the repository can be accessed on Bitbucket.
For more details about the installation, see the link to the project page below.
The installation will run until the end of September and is located here.
- Perpetual Energy Wasting Machine by Niklas Roy Created by Niklas Roy, "Perpetual Energy Wasting Machine" is a rope-and-pulley mechanism installed in the staircase of the WRO Art Center in Wroclaw, Poland. The mechanism connects the sliding doors of the elevator on one floor with the elevator call button on another floor. Operating in two directions on the first and second floors, the contraption automatically moves the elevator cabin in an infinite loop between those two levels. Inspired by what Niklas learned at school – that energy cannot be wasted, it can only change its location within a system – the project asks what happens to the energy in this elevator system. Where does it go?
The setup is powered by the sliding doors of the elevator. The pulley blocks reduce the sliding doors' travel of 50 cm to a travel of 12.5 cm for the push-button mechanism, while also connecting the first and second floors of the building. The "Perpetual Energy Wasting Machine" moves the elevator in recurring cycles: in the first half cycle the elevator is lifted up one floor, while the second half cycle brings it back down to its original position. As this is a hydraulic elevator, and as the cabin's mass is not balanced by a counterweight, only the upward movement consumes electricity. Estimating that the empty elevator cabin has a mass of 350 kilograms, the wasted energy is about 11.8 kilojoules per cycle (which equals the metabolic energy of ca. 1/3 gram of fat, according to Wolfram Alpha). A modified printing calculator inside the elevator cabin keeps track of the wasted energy, automatically adding 5.9 kilojoules for each half cycle. The results of this symbolic calculation – which accounts neither for energy lost to friction nor for a heavier cabin due to possible passengers – go straight into a waste bin located beneath the printing calculator. The hardware in the staircase includes only ropes and pulleys.
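The quoted 11.8 kJ per cycle is just the potential energy of lifting the cabin, E = m·g·h. A quick back-of-envelope check (the ~3.44 m floor height is an assumption back-calculated from the stated figure; only the 350 kg cabin mass comes from the article):

```c
#include <assert.h>

/* Energy (in joules) needed to lift a mass one floor: E = m * g * h. */
double lift_energy_j(double mass_kg, double height_m) {
    const double g = 9.81;   /* gravitational acceleration, m/s^2 */
    return mass_kg * g * height_m;
}
```

With `lift_energy_j(350.0, 3.44)` this comes out at roughly 11.8 kJ per cycle, i.e. 5.9 kJ per half cycle, matching the figures above.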
Inside the elevator there is a printing calculator, controlled by an ATmega8 via a relay board and programmed in AVR-GCC. The relay board basically 'pushes' the buttons of the calculator, performing the calculation "5.9+=" seven seconds after the door has closed. Project Page The installation was produced during a residency at WRO Art Center, funded by the Goethe Institute Cracow. It is now part of the collection of the WRO Art Center. Previously on CAN: Lumenoise [Objects], PING! Augmented Pixel [Tutorials, Games] and My Little Piece of Privacy […]
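The relay board's job can be modelled as literally typing the key sequence "5.9+=" on a pocket calculator. Here is a toy model of that interaction (the key handling is a simplification and an assumption, not the actual firmware): '+' commits the typed number to the running total, while '=' only triggers the printout on the real device.

```c
#include <assert.h>
#include <stdlib.h>

/* Minimal printing-calculator model: a running total and the digits
 * currently being typed. */
typedef struct { double total; char entry[16]; int len; } calc_t;

void press(calc_t *c, char key) {
    if ((key >= '0' && key <= '9') || key == '.') {
        if (c->len < 15) { c->entry[c->len++] = key; c->entry[c->len] = '\0'; }
    } else if (key == '+') {          /* '+' adds the typed number */
        c->total += atof(c->entry);
        c->len = 0;
        c->entry[0] = '\0';
    }                                  /* '=' just prints on the real device */
}

/* Seven seconds after the door closes, the ATmega8 keys in "5.9+=". */
void half_cycle(calc_t *c) {
    const char keys[] = "5.9+=";
    for (int i = 0; keys[i]; i++) press(c, keys[i]);
}
```

Two half cycles (one full elevator round trip) leave 11.8 on the tape, as in the article.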
- Ninety Six – Inflatable pixels by Nils Völker Ninety Six is a site-specific installation composed of 96 plastic bags that are selectively inflated and deflated in controlled rhythms, creating wave-like animations across the wall of the […]
- Shedding Light on Squidsoup – A Conversation with Anthony Rowe For more than a decade, the artist collective Squidsoup have been designing rich interactive experiences. From their early navigable sonic environments, through their playful experiments with computer vision and interest in 'volumetric visualizations', an email exchange between Squidsoup's Anthony Rowe and CAN produced a mammoth interview about light, sound and many of the collective's […]
- Lumenoise [Objects] Developed during a few days' residency at La Gaîté Lyrique, Lumenoise is a project by Niklas Roy that lets you turn your old CRT TV into an audiovisual synthesizer. Using a specially devised pen, you paint abstract geometric patterns and sounds directly onto the screen. Niklas calls it a playful and performative device, as anything you do is instantly reflected in the gadget's sonic and visual output. I was always fascinated by light pens as I think they have a somewhat magical touch. Today it is normal to interact with devices by touching their screen. But old CRT screens didn't have a touch-sensitive surface – and still, due to the particular way a raster scan tube draws an image, it is quite easy to find out the location of a little photo transistor on its surface. Engineers discovered this simplicity very early and implemented the first light pens as early as the middle of the last century. If you're interested in computer history, watch this fascinating 1956 IBM video about the SAGE project. At minute 4:30 you can see a light gun (the light pen's predecessor) in action. Furthermore, the film features beautiful background music, which you deserve after watching the Lumenoise screen capture. Niklas explains that unlike modern flat TVs, old-school CRTs draw the image line by line onto their phosphorescent screen. A photo transistor placed on a tube TV's surface can recognise when the part of the image underneath it is drawn. Since the transistor is connected to the microcontroller that generates the video signal, the controller can localise the exact position of the photo transistor on the screen. Photo transistors have a very high temporal resolution: they respond very quickly to minimal changes of brightness. Placed on the surface of a CRT TV, a photo transistor can recognise the exact moment the beam passes its location.
It changes its conductivity according to brightness changes, so with a little voltage divider circuit you can read the voltage peak that the electron beam causes on a digital input pin of a microcontroller. For making your own audiovisual light pen synthesizer, Niklas has made all the code and schematics available for download. The circuit requires only a handful of parts and is built around the ATmega8. Although Niklas succeeded in squeezing all the components inside the pen, he recommends not doing so and building the setup a bit larger on perfboard instead. The project is coded in AVR-GCC, which makes it easy to hack and develop further (the current code uses only half of the 8K memory). The zip file with the code also contains some stable earlier versions of the final program. They are leaner and easier to understand, which makes them a good foundation for coding your own synth with different visuals and sounds. For more information on the project, including code and schematics, see niklasroy.com/project/116/Lumenoise See also on CAN: PING! Augmented Pixel, an intelligent curtain and "My Little Piece of Privacy" Ver 2.0 named "Big Brother", all by Niklas […]
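The localization trick described above reduces to simple counter arithmetic: because the microcontroller generates the video signal itself, it knows at every clock cycle which line is being drawn and how far along that line the beam is. A minimal sketch of that arithmetic, assuming a 16 MHz clock and a 64 µs PAL line (the struct and function are illustrative, not Lumenoise's actual code):

```c
#include <assert.h>

/* 16 MHz * 64 us = 1024 clock cycles per PAL video line. */
enum { CYCLES_PER_LINE = 16 * 64 };

typedef struct { int x; int y; } pen_pos_t;

/* cycles = clock cycles elapsed since the top of the frame at the moment
 * the phototransistor registers the passing beam. */
pen_pos_t locate_pen(unsigned long cycles) {
    pen_pos_t p;
    p.y = (int)(cycles / CYCLES_PER_LINE);   /* which video line        */
    p.x = (int)(cycles % CYCLES_PER_LINE);   /* how far along that line */
    return p;
}
```

So the vertical resolution is one scan line and the horizontal resolution one clock cycle – which is why such a simple circuit can locate the pen so precisely.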
- PING! Augmented Pixel [Tutorials, Games] Augmented reality video game by Niklas Roy (2011). In the decade when video games were born, everything virtual looked like rectangular blocks. From today's perspective, the representation of a tennis court in the earliest video games is hard to distinguish from a soccer or basketball field. 'PING! – Augmented Pixel' is a seventies-style video game that adds a layer of digital information and old-school aesthetics to a video signal: a classic rectangular video game ball moves across a video image. Whenever the ball hits something dark, it bounces off. The game itself has no rules and no goal. Like GTA, it provides a free environment in which anything is possible. And like Sony's EyeToy, it uses a video camera as game controller. What I found interesting when I developed this game is that it could already have been made in the seventies. The technology I used for it is (in a way) similar to what Atari used for the first Pong. It becomes even more remarkable when you consider that the electronic components for capturing and evaluating a video signal are cheaper than the rotary game controllers that Atari used. Still, from an economic point of view it makes sense that EyeToys weren't the ultimate controllers of thirty-something years ago, as a video camera was probably very hard to afford back in the day. For those who want to know how it works: the game is programmed with AVR-GCC on an ATmega8 microcontroller that runs at 16 MHz. The controller gets basic video signal synchronisation information from an LM1881 sync separator that triggers two hardware interrupts: one for a new image, the other for a new line. The controller evaluates the brightness around the pixel (/ball) via its comparator input. Drawing the white image overlay is realised with a simple pull-up resistor in the signal line. More (hi-res) images can be found here. C source code can be found here.
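The "bounces off anything dark" rule can be sketched as follows. This is a guess at the logic under stated assumptions (the actual PING! code may differ): the 3×3 areas of the image around the ball are thresholded into dark/bright, and if the cell the ball is heading into is dark, the corresponding velocity component is reflected.

```c
#include <assert.h>

static int sgn(int v) { return (v > 0) - (v < 0); }

/* dark[row][col]: 1 if that area around the ball is darker than the
 * threshold; the ball sits at dark[1][1]. vx/vy are nonzero pixel steps
 * per frame. */
void bounce(int dark[3][3], int *vx, int *vy) {
    if (dark[1][1 + sgn(*vx)]) *vx = -*vx;   /* obstacle left or right  */
    if (dark[1 + sgn(*vy)][1]) *vy = -*vy;   /* obstacle above or below */
}
```

A ball flying right into a dark area would have its horizontal velocity flipped while the vertical one keeps going – which is exactly the Pong-style bounce the game shows.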
Since several people on the Interwebz told me that they want to rebuild the game, here are some more photos of it. I also uploaded the schematics, the perfboard layout and a parts list in this PDF. Case closed. And opened. Nothing special here: there's a 9V battery, a little circuit board and all the switches. A closer look at the circuit board. And the board seen from the bottom. Remember: you can find drawings of all this here. Here's a trick: the power supply lines of the ATmega8 are criss-crossed. I always route the +5V wire over the IC socket in order to connect the left and right power pins. That avoids a messy circuit design on the solder side of the board. When you want to build your own "PING!", you have to program the microcontroller. I use this cheap ISP programmer that I bought here. If you're just getting started and want to know in detail how to program an AVR – here's a great tutorial. Sorry folks, this link is German only. But Google will also find you a good English source. (Here's a hint: once you're the proud owner of an ISP programmer, you will likely never spend money on Arduinos anymore, as you can then make them yourself. And there's another thing: once you're familiar with programming your Atmel using straight AVR-GCC, you'll probably never want to use an Arduino again.) The fuse bit settings for the controller are simple: just tell the controller to use the external crystal as clock. (SUT_CKSEL) For the case, I used a U-shaped extruded aluminium profile. The different metal parts are all built out of the same profile. And a look from the other side. That's it so far about the hardware. Concerning the firmware: be aware that the program is for PAL video systems only. If you want to play this game on your NTSC system, you have to adapt the timing parts of the source code. A big congratulations goes out to the first one who posts a link to functioning code for NTSC video signals in the comments.
I’ll link it here with name and credits and all. To help you understand both the source code and the schematics, I’ve already explained it in the comments on Hackaday: … here’s how it works: Drawing onto the signal/image: one output of the AVR is connected via a 1K resistor to the video signal. Switching this output to HIGH raises the signal by a few mV -> the image becomes brighter there. Digitizing the video image: doesn’t happen. Instead, only the brightness of the area around the pixel is captured. Imagine a grid of 3×3 squares. The square in the middle is the pixel. When the signal is at this middle position, the output that draws on the video is switched to HIGH. If the signal is within one of the eight areas surrounding that pixel, the AVR compares the specific brightness (voltage level) of that area with a threshold. That’s how an obstacle and its position relative to the pixel are detected. Calculating the animation: when the beam (or signal) has finished drawing the lower white bar, there’s plenty of time to calculate the new position of the pixel before the next image has to be drawn. As it is all synced with the video signal, the animation is smooth. Couldn’t be smoother. An animation ‘pixel by pixel’ is also no problem, as it is all about counting video lines (y) or delaying within a specific video line (x). This also explains why the starting animation is rendered smoothly. And it also means that there is a delay until the pixel reacts: it reacts in the next image that is drawn. No magic here. Speed of the processor: the AVR is clocked with a 16 MHz quartz crystal. The duration of one PAL video line is 64 µs. => There are 1024 clock cycles per video line. That’s really sufficient for what the program has to do, which is mainly waiting, a bit of counting and sometimes reading the internal comparator bit or switching an output. OK, that’s enough information.
With all those tips, you should now be able to build your own “PING!”. Send me some photos of it when you’re done! -- This post first appeared on Niklas Roy's website - PING! Augmented Pixel Niklas Roy is a Berlin-based artist. He mainly works on mechatronic installations and devices. From time to time he carries out performances in which his inventions play a central role. Roy really doesn’t like to write about himself in the third person but he "is one of the most facetious characters of the 'new media art' […]
- Frieze magazine talks to Julius von Bismarck – CERN’s Artist in Residence We've heard a lot about the Higgs boson and CERN in the last few weeks, but very little about a very interesting programme currently running there with Julius von Bismarck as its resident. Launched earlier this year, the ‘Prix Ars Electronica Collide@CERN Digital Arts Prize’ was awarded to Julius in conjunction with a two-month residency at CERN. While everyone was busy talking to scientists trying to understand what they were on about, Frieze magazine's Barbara Preisig sat down with Julius in the cafeteria and asked "What colour is your Higgs particle?". von Bismarck has a small office on the campus, in one of the many shed-like storage buildings. When I visit, CERN turns out to have little in common with how I imagined it. The LHC lies 100 metres below ground and is currently out of bounds due to high radiation levels. So the artist spends his working hours not in futuristic computer labs but in the cafeteria, talking to theorists and experimental physicists. Their conversations revolve around concepts like dark matter, supersymmetry or hidden valleys, and around questions that can no longer be framed in the imagination, let alone verified in reality. Such issues pose new challenges not only to particle physics but also to art. How can such abstract ideas be linked to the material world in real terms? Should we imagine a Higgs particle as having a colour and a shape? Read more on frieze-magazin Julius von Bismarck previously on CAN: The Space Beyond Me and Perpetual Storytelling […]
- Tele-Present Water [MaxMSP, Arduino] Created by David Bowen, the Tele-Present Water installation draws information from the intensity and movement of the water in a remote location. Wave data is collected in real time from National Oceanic and Atmospheric Administration (NOAA) data buoy station 46075, Shumagin Islands, Alaska. The wave intensity and frequency are scaled and transferred to the mechanical grid structure, resulting in a simulation of the physical effects caused by the movement of water at this distant location. The installation uses MAX/MSP to drive an Arduino Mega running servo firmata, and 11 x 24 V DC motors with drivers for the movement. In May this year, Tele-Present Water received one of three ex aequo awards at Alternative Now: The 14th Media Art Biennale WRO 2011, Wroclaw, Poland. //thanks for the tip Joost Photo by Alicja Kołodziejczyk - source Photo by Ewa Wójtowicz […]
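The scaling step described above – wave intensity mapped onto mechanical movement – amounts to a clamped linear map. A hedged sketch, in which the 0–8 m full-scale range and the 0–180° output are assumptions for illustration (the actual piece does this mapping in MaxMSP before driving the motors):

```c
#include <assert.h>

/* Map a buoy's wave height (metres) onto an actuator angle (degrees),
 * clamped to a 0..8 m input range. Both ranges are assumed values. */
int wave_to_angle(double wave_height_m) {
    const double full_scale_m = 8.0;   /* assumed maximum wave height */
    if (wave_height_m < 0.0) wave_height_m = 0.0;
    if (wave_height_m > full_scale_m) wave_height_m = full_scale_m;
    return (int)(wave_height_m / full_scale_m * 180.0);  /* 0..180 deg */
}
```

Clamping matters here because real-time buoy feeds occasionally report out-of-range or missing values, which would otherwise slam the mechanism against its limits.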
- Particles [openFrameworks, Arduino, Events] Particles is the latest installation by Daito Manabe and Motoi Ishibashi, currently on exhibit at the Yamaguchi Center for Arts and Media [YCAM]. The installation centres around a spiral-shaped rail construction on which a number of balls with built-in LEDs and XBee transmitters roll while blinking at different time intervals, resulting in spatial drawings of light particles. The installation creates a beautiful pattern of innumerable blinking lights that appear to float in the air: the LED balls pass one after another along the figure-eight spiral rail, and because they light up with different timings, the phenomenon looks like light particles floating around. The openFrameworks application controls both the release of "particles" as well as their glow, based on the information read within the application. The image below shows Perlin noise being translated into particles, giving each one glow and position properties. The position of each ball is determined via a total of 17 control points on the rail. Every time a ball passes one of them, the respective ball's positional information is transmitted via a built-in infrared sensor. While the ball travels from one control point to the next, its position is calculated based on its average speed. The data regulating the balls' luminescence are divided by the control point segments and are switched every time a ball passes a control point. The audience can select a shape from several patterns floating in the aerial space using an interface on the display. The activation of the virtual balls on the screen is determined by the timing at which a ball moving on the rail passes a certain checkpoint, and by its speed, calculated from average speed values.
The sound is generated from the ball positions and the LED flash pattern information, and is played through 8-channel speakers. The board inside each ball is an Arduino-compatible board based on the original Arduino design. Exhibition page: particles.ycam.jp/en/ Date & Time: March 5 (Sat) – May 5 (Thu), 2011, 10:00–19:00 Venue: Yamaguchi Center for Arts and Media [YCAM] Studio B Admission free Images courtesy of Yamaguchi Center for Arts and Media [YCAM] Photos: Ryuichi Maruo […]
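The between-control-point position estimate described above is plain dead reckoning: crossing a control point updates the average speed for the segment just completed, and between crossings the position is extrapolated from that speed. A minimal sketch (names and units are assumptions, not YCAM's code):

```c
#include <assert.h>

/* Per-ball tracking state: last control point crossed and the average
 * speed measured on the previous rail segment. */
typedef struct { double pos_m; double time_s; double avg_speed; } ball_t;

/* Called when the infrared sensor reports a control point crossing. */
void control_point(ball_t *b, double pos_m, double time_s) {
    b->avg_speed = (pos_m - b->pos_m) / (time_s - b->time_s);
    b->pos_m = pos_m;
    b->time_s = time_s;
}

/* Position estimate at any moment between two control points. */
double estimate_pos(const ball_t *b, double time_s) {
    return b->pos_m + b->avg_speed * (time_s - b->time_s);
}
```

With 17 control points on the rail, the estimate is corrected 17 times per lap, so extrapolation errors never accumulate for long.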
Posted on: 03/07/2014
Posted in: Arduino