Objectifier – Device to train domestic objects

Created by Bjørn Karmann at CIID, Objectifier empowers people to train objects in their daily environment to respond to their unique behaviours. Interacting with Objectifier is much like training a dog – you teach it only what you want it to care about. Just like a dog, it sees and understands its environment.

23/01/2017
Project Soli – World’s Tiniest Violin

Created by Design I/O, World’s Tiniest Violin is a ‘speed project’ that uses Google’s Project Soli Alpha Dev Kit together with the Wekinator machine learning tool and openFrameworks to detect small movements that look like someone playing a tiny violin and translate them into the volume and playback of a violin solo.
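The blurb describes the usual Wekinator pattern: the host application streams feature values to Wekinator over OSC and maps the returned continuous output onto a sound parameter. The openFrameworks (C++) sketch below is a minimal, hypothetical reconstruction of that loop, not Design I/O’s actual code; the feature values (mouse position standing in for Soli motion data), the file name violin_solo.mp3, and the OSC addresses/ports (Wekinator’s documented defaults, 6448 in / 12000 out) are assumptions for illustration.

```cpp
// Minimal sketch (not Design I/O's code): stream two placeholder feature
// values to Wekinator over OSC and map its continuous output to the volume
// of a looping violin sample.
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    ofxOscSender sender;      // features -> Wekinator (default input port 6448)
    ofxOscReceiver receiver;  // Wekinator's trained output -> app (default port 12000)
    ofSoundPlayer violin;
    float volume = 0.0f;

    void setup() override {
        sender.setup("127.0.0.1", 6448);
        receiver.setup(12000);
        violin.load("violin_solo.mp3");  // hypothetical asset in bin/data
        violin.setLoop(true);
        violin.setVolume(0.0f);
        violin.play();
    }

    void update() override {
        // Stand-ins for Soli-derived motion features (normalised mouse
        // position here, just to keep the sketch self-contained).
        ofxOscMessage feats;
        feats.setAddress("/wek/inputs");
        feats.addFloatArg(ofGetMouseX() / (float)ofGetWidth());
        feats.addFloatArg(ofGetMouseY() / (float)ofGetHeight());
        sender.sendMessage(feats, false);

        // Wekinator returns one continuous value trained on the
        // "tiny violin" gesture; use it directly as playback volume.
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(m);
            if (m.getAddress() == "/wek/outputs" && m.getNumArgs() > 0) {
                volume = ofClamp(m.getArgAsFloat(0), 0.0f, 1.0f);
            }
        }
        violin.setVolume(volume);
    }
};

int main() {
    ofSetupOpenGL(320, 240, OF_WINDOW);
    ofRunApp(new ofApp());
}
```

In Wekinator itself this would correspond to a single continuous output trained on “playing” versus “not playing” gestures, which the app then scales the violin sample’s volume by.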

10/06/2016

Created by Michael Sedbon, Alt-C is an installation that uses electricity produced by plants to power a single-board computer mining a cryptocurrency. The project questions our relationship to ecosystems with regard to networked technologies and the problem of abstraction.

Created by Tore Knudsen, ‘Pour Reception’ is a playful radio that uses machine learning and tangible computing to challenge our cultural understanding of what an interface is and can be. Two glasses of water are turned into a digital material for the user to explore and appropriate.

Created by Benedict Hubener, Stephanie Lee and Kelvyn Marte at CIID with help from Andreas Refsgaard and Gene Kogan, ‘The Classyfier’ is a table that detects the beverages people consume around it and chooses music that fits the situation accordingly.

The following is documentation of a new course run by Gene Kogan on Machine Learning for Artists at ITP-NYU in spring 2016.
