The latest in Memo Akten's series of experiments and explorations into neural networks is a pre-trained deep neural network that makes predictions on live camera input, trying to make sense of what it sees in the context of what it has seen before.
“Three Pieces with Titles” is the latest audiovisual performance by Montreal’s artificiel. In it, Alexandre Burton and Julien Roy manipulate an eclectic collection of objects within the field of view of a computer vision system to generate real-time video and an abstract sonic collage.
Guillaume Massol’s openFrameworks app, titled “All work and no play”, watches videos drawn from different training datasets and generates sentences loosely based on what is happening on screen, sometimes coincidentally producing pearls of wisdom.
A project by Design I/O for TIFF Kids International Film Festival’s interactive playground digiPlaySpace, Mimic brings a UR5 robotic arm to life and imbues it with personality. Playfully craning its neck to get a better look and arcing back when startled, it responds to each child that enters its field of view.
Created by Matthias Grund, Kadir Inan and Wookseob Jeong at the Köln International School of Design, >200 °C is imagined as a closed feedback system that combines computer vision with a poetic perspective on the physical phenomenon known as the Leidenfrost effect.
Created as a collaboration between Prokop Bartoníček and Benjamin Maus, Jller is part of their ongoing research in the fields of industrial automation and historical geology. The installation consists of an apparatus that sorts pebbles from a specific river by their geologic age.
Created by Brad Todd, Collimation takes the form of a basic artificial intelligence in which visual stimuli are translated, in a performative act of seeing, into data that takes the form of a neuron.
Created by Jamie Zigelbaum, Triangular Series is a site-specific lighting installation composed of numerous truncated tetrahedral forms. Each object has a unique form and senses both the others and the physiological rhythms of the visitors beneath them.
Created by Adam Ben-Dror, The Abovemarine is a vehicle that enables José, or any other fish, to roam freely on land.
Created by Kyle McDonald, “Sharing Faces” uses a megapixel surveillance camera and custom software to match the face position of the person looking at the screen. As the person moves, new images matching the new position are pulled from the database, creating a mirror-like image of yourself built from the images of others.
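The matching step can be sketched as a nearest-neighbour lookup. This is a hypothetical illustration, not McDonald's actual code: assume every stored frame is indexed by the (x, y) centre of the face it contains, and for the current viewer we pull the frame whose face position is closest.

```python
import numpy as np

def nearest_frame(face_positions, current_position):
    """Return the index of the stored frame whose face position is
    closest (by Euclidean distance) to the current viewer's face.

    face_positions: sequence of stored (x, y) face centres.
    current_position: (x, y) of the face currently in view.
    """
    positions = np.asarray(face_positions, dtype=float)
    query = np.asarray(current_position, dtype=float)
    distances = np.linalg.norm(positions - query, axis=1)
    return int(np.argmin(distances))

# Toy database of three stored face positions:
db = [(100, 120), (400, 300), (250, 200)]
print(nearest_frame(db, (260, 190)))  # index 2 is nearest
```

In the installation this lookup would run once per camera frame, so the displayed image changes continuously as the viewer moves.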
Cloud Piano plays the keys of a piano based on the movements and shapes of the clouds. A camera pointed at the sky captures video of the clouds, and a Max/MSP patch uses that video in real time to drive a robotic device that presses the corresponding keys on the piano.
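One plausible version of the cloud-to-keys mapping (the actual Max/MSP patch is not described, so the thresholding scheme below is an assumption) is to treat bright pixels as cloud and map the image columns that contain cloud onto the 88 keys of the piano:

```python
import numpy as np

NUM_KEYS = 88  # standard piano

def keys_for_frame(frame, threshold=200):
    """Return the sorted piano key indices (0-87) to press for one frame.

    frame: 2D numpy array of grayscale pixel values (0-255).
    Clouds are assumed to be the bright regions of the sky.
    """
    cloud_mask = frame >= threshold             # bright pixels = cloud
    cloud_columns = np.any(cloud_mask, axis=0)  # columns containing cloud
    width = frame.shape[1]
    # Scale each cloud-bearing column's position onto the keyboard.
    keys = {int(col * NUM_KEYS / width) for col in np.nonzero(cloud_columns)[0]}
    return sorted(keys)

# A toy 4x8 "sky" with one bright cloud on the right half:
frame = np.zeros((4, 8), dtype=np.uint8)
frame[1:3, 5:7] = 255
print(keys_for_frame(frame))  # [55, 66]
```

Running this per frame, keys are pressed and released as clouds drift across the camera's field of view.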
Google’s new consumer hardware initiative is a mobile phone with machine-vision eyes, ultra-fast inner ears, and a spatially aware brain. Built around that 5″ Android reference hardware, could this be all your AR Kickstarters come true?