Many problems in computer science have no known efficient algorithm. The simplest approach is to try all possible (or many) candidate solutions until the desired outcome is reached. This is called the brute-force method.
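The idea above can be sketched in a few lines of Python. This is a generic illustration, not code from the project: a hypothetical `brute_force_pin` helper that cracks a numeric PIN by simply enumerating every candidate until one is accepted.

```python
from itertools import product

def brute_force_pin(check, digits=3):
    """Try every possible PIN of the given length until check() accepts one."""
    for combo in product("0123456789", repeat=digits):
        guess = "".join(combo)
        if check(guess):
            return guess
    return None  # exhausted the search space without success

# The solver knows nothing about the secret; it just tries all 10**3 candidates.
secret = "042"
found = brute_force_pin(lambda g: g == secret)
```

The same exhaustive strategy scales poorly (the search space grows exponentially with `digits`), which is exactly why it is a method of last resort.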
Andreas Nicolas Fischer wrote a Python script that generates arrangements of intersecting digital sculptures in front of a “frozen” cloth simulation, similar to a traditional still life, but with no physical constraints. The HDR image texture used to create the reflections on the geometry was taken at his studio, placing the virtual setup in the real world. As with Sol LeWitt's wall drawings, the computer executes commands with an inherent degree of randomness, removing the artist one step from the work and making it impossible to anticipate the exact outcome of the process.
The chosen compositions are then printed, framed and shown in an exhibition at LEAP [Lab for Electronic Arts and Performances] Berlin. The full range of images rendered over the course of the project will be shown in a designated Tumblr online gallery, which tracks the crowd's preferences, as well as in a video projection in the exhibition space.
This project investigates the use of automated systems and crowd-curation in contemporary art production. The (somewhat overused) term art factory can here be reduced to a computer running autonomous software that publishes to the internet.
What are the implications of this? What can be said about the criteria by which certain works are favored? Is the work about the process or the final result? Can and should feedback from the crowd be incorporated into the creation of works?
The BFA X Schwarm VII project was a study observing the results of this automatic publishing of images created with generative software. It combines the Schwarm software with images from the Brute Force Method as the base for the abstraction by the particles. Usually Andreas selects the images for a series originating from a generative process beforehand, but this time the Processing sketch publishes everything it generates to his Tumblr automatically via IFTTT and Dropbox.
The base geometry was modelled in Sculptris and then brought into Blender, where it was arranged in front of a virtual studio setup with a cloth simulation as a backdrop. A Python script changes the color, location, scale and rotation of each of the three parts in every frame. The lighting is partly image-based and uses an HDR taken with a spherical light probe at his atelier; this way his physical studio is reflected in the virtual setup. The compositing is also automated with Blender's built-in node compositor.
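The per-frame randomization described above can be sketched in plain Python. This is a minimal illustration of the logic, not Fischer's actual script: the parameter ranges and the function name `random_transforms` are assumptions, and inside Blender the values would be assigned to objects through the `bpy` API rather than returned as dictionaries.

```python
import random

def random_transforms(num_parts=3, seed=None):
    """Generate one frame's worth of random parameters for each sculpture part:
    an RGB color, an XYZ location, a uniform scale and an Euler rotation.
    All value ranges are illustrative assumptions, not the project's settings."""
    rng = random.Random(seed)
    frame = []
    for _ in range(num_parts):
        frame.append({
            "color": tuple(rng.random() for _ in range(3)),          # RGB in [0, 1]
            "location": tuple(rng.uniform(-2.0, 2.0) for _ in range(3)),
            "scale": rng.uniform(0.5, 1.5),
            "rotation": tuple(rng.uniform(0.0, 6.2832) for _ in range(3)),  # radians
        })
    return frame

# One new arrangement per rendered frame; a fresh seed gives a new composition.
frame = random_transforms(num_parts=3, seed=7)
```

Running this once per frame and applying the results to the three parts is what makes every render a different, unforeseeable composition.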
Posted on: 11/06/2013