In the past few months Marcin Ignac has been exploring space colonization, an algorithm that simulates the growth of plants. Marcin explains the algorithm in six "simple" steps:
1. Populate the possible growth space with growth hormones.
2. For each hormone, find the nearest bud.
3. Grow each bud towards its neighbouring hormones. If there are none, kill the bud.
4. If a hormone is too close to a bud, remove it.
5. With every growth step, branch the buds with some probability.
6. Repeat until all hormones are consumed or all buds are dead.
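The six steps above can be sketched in a few dozen lines. This is a minimal 2D illustration, not Marcin's actual code; all names and parameters (`viewRadius`, `killRadius`, `stepSize`) are made up for the example, and branching (step 5) is left out for brevity:

```javascript
// Minimal 2D space colonization sketch.
// Hormones are attraction points; buds grow towards nearby hormones.
function dist(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

function colonize(hormones, buds, opts) {
  const { viewRadius, killRadius, stepSize, maxIterations } = opts;
  for (let i = 0; i < maxIterations && hormones.length > 0 && buds.length > 0; i++) {
    // Step 2: for each hormone, find the nearest bud within view.
    const pull = new Map(); // bud index -> accumulated growth direction
    for (const h of hormones) {
      let best = -1, bestD = viewRadius;
      buds.forEach((b, j) => {
        const d = dist(h, b);
        if (d < bestD) { bestD = d; best = j; }
      });
      if (best >= 0) {
        const b = buds[best];
        const cur = pull.get(best) || { x: 0, y: 0 };
        cur.x += h.x - b.x;
        cur.y += h.y - b.y;
        pull.set(best, cur);
      }
    }
    // Step 3: grow each bud towards its neighbouring hormones;
    // a bud with no hormones in view dies.
    const next = [];
    buds.forEach((b, j) => {
      const dir = pull.get(j);
      if (!dir) return; // bud dies
      const len = Math.hypot(dir.x, dir.y) || 1;
      next.push({ x: b.x + (dir.x / len) * stepSize,
                  y: b.y + (dir.y / len) * stepSize });
    });
    buds = next;
    // Step 4: remove hormones that a bud has reached.
    hormones = hormones.filter(h => buds.every(b => dist(h, b) > killRadius));
    // Step 5 (branching with some probability) omitted for brevity.
  }
  return { hormones, buds };
}
```

Run with a single hormone and a single bud, the bud simply walks towards the hormone and consumes it; with many hormones, competing pulls are what produce the branching, tree-like paths.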
By manipulating parameters like a bud's field of view, the branching probability and the density of the growth hormones, you can achieve some really amazing results. Recently Nervous System posted an animation explaining a similar process.
His first implementation was done in Cinder and led to the Crystal Infection project. Recently he has been playing with more subtle colors (the yellow-background examples) using Plask, a programming environment created by Dean McNamee. There have also been HTML5 versions, a few of which I saw during OFFF this summer, but unfortunately none of them are online except this latest experiment ported from Plask.
Part of the same series, Crystal Infection is a work-in-progress iPad application that visualizes the growth of a virtual plant combined with the cold aesthetics of crystals.
“explorations on non-photorealistic simulations of natural phenomena”
Every time the algorithm starts, a possible growth space is defined, and during each iteration the plant tries to expand its branches to fill the most space available within reach. For me the most interesting aspect of this algorithm is the ability to control the unpredictable. As opposed to L-systems, which always look symmetrical and synthetic, this algorithm creates much more natural forms.
- “Every Day Of My Life” by Marcin Ignac Every Day Of My Life is a visualization of Marcin Ignac's computer usage statistics from the last 2.5 years, presented at the Click Festival in Helsingør. Each line represents one day, and each colorful block is the foreground app running at a given moment. Black areas are periods when his computer was not turned on. Sleeping patterns (or the lack of them) and times of holidays and travel (longer gaps) can therefore be easily identified. The data is also separated by keyboard hits (yellow) or mouse clicks (red) and presented in a single colour. All data was gathered using Tapper - a small OS X app logging application usage, written by Dean McNamee - and later visualized by Marcin using the Plask environment. Project Page See also selfspy, a daemon for Unix/X11 and Mac OS X that continuously monitors and stores what you are doing on your computer. (via @mariuswatz) […]
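The day-grid layout described above (one row per day, one block per logged foreground app, black where the machine was off) can be sketched roughly like this. The record shape and all names here are hypothetical, not Tapper's actual log format:

```javascript
// Hypothetical sketch: bucket timestamped foreground-app samples into
// per-day rows of fixed-width time slots. Unfilled slots (null) would
// render as black, i.e. the computer was off.
function dayGrid(records, slotMinutes = 10) {
  const slotsPerDay = (24 * 60) / slotMinutes;
  const rows = new Map(); // "YYYY-MM-DD" -> array of app names (null = off)
  for (const r of records) {
    const t = new Date(r.time);            // r.time assumed ISO 8601, UTC
    const day = r.time.slice(0, 10);
    if (!rows.has(day)) rows.set(day, new Array(slotsPerDay).fill(null));
    const slot = Math.floor((t.getUTCHours() * 60 + t.getUTCMinutes()) / slotMinutes);
    rows.get(day)[slot] = r.app;           // last sample in a slot wins
  }
  return rows; // a renderer would map each app name to a colour
}
```

The same bucketing works for the keyboard-hit / mouse-click variant: replace the app name with an event count per slot and map it to intensity instead of hue.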
- Hyphae [Cinder, Objects] Hyphae is a new collection of jewellery from Nervous System (Jessica Rosenkrantz and Jesse Louis-Rosenberg). Inspired by the vein structures that carry fluids through organisms, from the leaves of plants to our own circulatory systems, the team created a Cinder simulation which uses physical growth principles to build sculptural, organic structures. Starting from an initial seed and a surface, the team grows a hierarchical network where nodes constantly branch and merge. "First and foremost is making the system work in 3D. The original system uses a Delaunay triangulation to determine source neighborhoods. This allows the creation of closed cells. Closed cells are not only desirable for us aesthetically, but necessary for improved structural stability when we are making 3D prints. Having a bunch of unconnected branches would be too weak for functional plastic pieces. Computing Delaunay triangulations becomes much harder in 3D, so we switched over to C++ in order to use CGAL, an open source library of computational geometry algorithms..." The Hyphae collection contains a range of necklaces, earrings, cuffs, bangles, rings and brooches. The pieces are 3D printed in nylon and are available in black and white. Select pieces will also be available in stainless steel and sterling silver. Read more about the process here or see the full collection here. Want to win a piece from their new collection? Visit this blog post and follow the link to tweet. (The giveaway runs until March 29th at 12am EST. Two entrants will be selected randomly to win their pick.) In addition, the duo recently retired 3 Cell Cycle bracelets and made them available for download on Thingiverse, so now you can print them yourself. Nervous System was founded in 2007 by Jessica Rosenkrantz and Jesse Louis-Rosenberg. Jessica Rosenkrantz graduated from MIT in 2005 and holds degrees in Architecture and Biology.
Afterwards, she spent 2.5 years studying architecture at the Harvard Graduate School of Design. Jesse Louis-Rosenberg also attended MIT, majoring in Mathematics. He previously worked as a consultant for Gehry Technologies in building modeling and design automation. Previously: Nervous System [Profile, Cinder, […]
- OFFF + CAN Workshop Collaborative 2011 [Cinder, oF, Js, Events] Earlier this year we had been thinking about the concept of "curated workshops", an opportunity to bring people together to work for a very short period of time and share their creations. These would involve setting up a team, inviting a few high-profile individuals and opening up submissions for participation. When I was approached by Héctor Ayuso earlier this year to give a talk at OFFF, instead of talking about CAN, I thought this would be a great opportunity to do something more - a workshop - and to use the workshop material as the content to drive the talk. Héctor and I agreed, and 'Workshop Collaborative' was born. What was the aim of "Workshop Collaborative"? 1. Initiate collaborations between those who share common interests. 2. Create a playing field, both physical and virtual. 3. Allow ideas to evolve by asking questions. When we announced the workshop back in January, we also opened applications for participation. In total, 80 applications were submitted and 11 participants were chosen by the team, which included Aaron Koblin, Ricardo Cabello (mr.doob), myself and Eduard Prats Molner. The participants were: Marek Bereza, Alba G. Corral, Andreas Nicolas Fischer, Martin Fuchs, Roger Pujol Gomez, Marcin Ignac, Rainer Kohlberger, Thomas Mann, Joshua Noble, Roger Pala and Philip Whitfield. Programme - Single Day 09:00 - 10:00 Introductions / Teams 10:00 - 13:30 Stage 1 13:30 - 14:00 Lunch 14:00 - 19:00 Stage 2 (Completion) Total creation time: 6.5 hours A few weeks before the workshop, Aaron and I decided on four themes we would allow to influence the work we would be making. By allowing other participants to comment and give feedback on these themes, we would discover areas we all wanted to explore. The themes included: 1. Digital Ecosystem - Build an application, an organism of information, sound and visuals, a digital ecosystem that flows through different mediums and evolves.
A living system, travelling through technology and mutating through tools. 2. Analogue Digital - Explores the notions of physicality in code, using made objects as assets for code: cut paper and cut-outs, traditional 2D scans, 3D objects scanned using flatbed scanners, etc. 3. Projection Mapping - Address projection mapping conceptually. Moving away from technical demos, it is time to question what it all means: surface, source, angle, point projection, scale, form, interaction, animation. 4. Data Re-embodied - Tell stories through the juxtaposition of data sources and their methods of representation. How can we create new meaning, understanding and value from the reinterpretation of data? By no means did this mean that we would have to choose one theme over another. The purpose was to get a feel for where the interest lay amongst the participants and to set up, so to say, a 'playing field' and allow first ideas to develop. We knew that, working together for a single day, we would not be able to produce anything of "finished" quality; rather, we would focus on the subjects themselves and see what came out. Following the feedback, a number of keywords were derived to summarise our interests: ecosystem, data, scan, evolution, input, mutation, osc, node, rhythm, pattern, touch, physical, language, viewport and mobility. Five projects were developed during the 6.5 hours of work. These included Kinect > WebGL Bridge, Kinect Image Evolved, Input Device, Data Flow and Receipt Racer. -- Kinect > WebGL This project was the work of mr.doob, Marcin and Edu, although other people were involved as well. The task was to create a bridge between the Kinect and the browser, allowing a real-time feed over the web.
Although aspirations were much higher than the time allowed, instead of utilising a node.js server - which I understand was 99% complete anyhow - the team settled for feeding downscaled image data from a Cinder application, via standard HTTP requests, to the three.js script, which was reading the images at about 10 fps. Several rendering styles are presented below. The first one is just a simple point cloud done by Marcin for debugging, while the rest were done by mr.doob using his amazing three.js engine. Download the .js code here. -- Kinect Image Evolved Simultaneously, while Ricardo was working on the .js part, Marcin was exploring different ways of representing the Kinect image. In an attempt to get away from the standard Kinect point cloud, we developed the idea of trying a slit-scan effect with the point cloud. What this means is that the Kinect point cloud was dispersed along a time lapse, with different bands representing different moments in time. Likewise, Marcin was also exploring what happens if the point location is reversed when a particular depth is reached. What you see in the videos below are both effects. Code available soon. Thomas and Andreas were also testing different tools to manipulate the Kinect image. Meshlab and Blender were used to pull Kinect point clouds and convert them into meshes which could then be rendered, distorted, split, etc. -- Input Device Marcin was also working on ways to control the input, i.e. how one could interact with the Kinect point cloud. We were toying with the idea of being able to assign different devices, over OSC, to different Kinect body parts. This would allow each individual to be assigned a unique element of the point cloud and to interact with it. The first step was to use simple gyroscope data sent from an iPhone over OSC. The video below shows what is happening. Likewise, Rainer and Roger were working on the iPhone application that would send the OSC data.
Rather than just utilising the gyro or accelerometer, Rainer was exploring different forms of interaction with the device, seeing whether a language could be evolved, one that would somehow enhance the emotional attachment to the Kinect body parts. The videos below show an instrument-like application that also has audio feedback. Code available soon. -- Data Flow With all the data moving, Marek was wondering what would happen if the input and output were in the same medium, so you could compare them, apples for apples. Marek looked at the process of the loop by examining the image obtained by subtracting the initial input from the output, so we're just left with the parts that change. For the loop algorithm, JPEG compression was chosen because it was easily available in oF and ubiquitous enough to warrant investigation. The boxy images are a result of feeding the JPEG "high" quality compression back into itself and subtracting it from the original. The finer images use the "best" compression setting. Then Marek tried the same thing with sound (using Logic), taking first the original sound, then the encoded version, and seeing what was left. You can hear all the sounds below. Original / OFFFCAN Workshop Collaborative by filipvisnjic Encoded / OFFFCAN Workshop Collaborative by filipvisnjic Difference / OFFFCAN Workshop Collaborative by filipvisnjic Code available soon. -- Receipt Racer The Receipt Racer combines different input and output devices into a complete game. It was made by Martin, Philip and Joshua utilising a receipt printer - a common device you can see at every convenience store - a small projector, a Sony PS controller and a Mac running a custom openFrameworks application. Print is a static medium; that's why, Philip, Martin and Josh explain, it was an intriguing challenge to create an interactive game with it. First the team tried to do it only with the printer as the visual representation, but that seemed rather impossible.
But then Joshua Noble came up with a small projector, perfect for projecting a car onto a preprinted road. There is no game without an input device, so they were lucky that at least one of them always carries a gamepad around. The cables connect back to the laptop running an openFrameworks application the team wrote parts of. The app was entirely programmed during the workshop. Internally it runs something like a basic JS game: just a car driving on a randomly generated race track. It then broadcasts its components to the external devices, prints the street and guesses where the car's projection is supposed to be in order to perform the hit test. That's the trickiest part: everything has to be in sync and needs some calibration in the beginning. The paper also has a bit of a mind of its own and tends to slide around or curl, but that's nothing some duct tape and cardboard can't fix. It was a lucky day. Somehow everything was just lying around, waiting to be used - even the stand and the plastic thing you would normally use to display your name at a conference. Even the timing was perfect: right at the end of the workshop we finished adding details like a little score and the YOU CRASHED text. Project Page (code available) -- On Saturday we presented the creations. Regardless of the fact that Erik Spiekermann was presenting in the other OFFF room, we had a full theatre (an estimated 500 people), including another room where our talk could be watched on a large screen. Photo above by Arseny Vesnin CAN would like to thank all the participants at the workshop, as well as Aaron and Ricardo for taking time off their busy schedules to take part in the workshop. For more information on the workshop and all future information/code/links see creativeapplications.net/offf2011 Photos by Jason Vancleave We leave you with the OFFF Barcelona 2011 Main Titles, made for OFFF by PostPanic (full screen […]
- Cindermedusae [Cinder] Created by Marcin Ignac, and yet another project selected for the WrittenImages book, Cindermedusae is a generative encyclopedia of imaginary sea creatures. "I wanted to explore generating organic and believable forms, so I chose to try with a jellyfish." I have been following their creation since the beginning and really enjoyed the object's evolution, from the first sketches to the end result (see images below). The base for the whole creature is the head, made out of a deformed sphere. All the elements are controlled by a set of parameters, such as the length and number of features, that can be randomized and animated over time. No predefined geometry or textures are used. Marcin writes: "Recently I was working on a project about underwater life. In this case we used 3D models, so immediately when I heard about Written Images I thought 'Let's make something more generative and organic'. I did some research and was amazed at how big jellyfish can grow, so I decided to make one. At the beginning I was aiming for a super realistic look, but after stumbling upon the works of Ernst Haeckel and his amazing book Kunstformen der Natur I knew that this was the way to go. The most difficult part of the project was finding a way of controlling the layout on the page, because when you generate something randomly it's hard to predict its shape, size and position. I dealt with that with some smart transformations and iterative algorithms." Created using OpenGL, GLSL and Cinder. Project Page Marcin Ignac is a Polish artist / programmer / designer living in Copenhagen, Denmark. Previously on CAN: Dynamic Mesh Triangulation + UI [Cinder, […]
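The "deformed sphere controlled by a set of parameters" idea is worth a small sketch. This is only an illustration of the general technique (a parametric sphere whose radius is modulated along latitude and longitude), not Marcin's actual Cindermedusae code; `lobes` and `lobeDepth` are invented parameter names:

```javascript
// Illustrative parametric "jellyfish head": a sphere whose radius is
// modulated around its circumference, producing bell-like lobes.
function deformedSphere({ radius = 1, lobes = 8, lobeDepth = 0.2,
                          rings = 16, segments = 32 } = {}) {
  const points = [];
  for (let i = 0; i <= rings; i++) {
    const phi = (i / rings) * Math.PI;              // latitude, 0..PI
    for (let j = 0; j < segments; j++) {
      const theta = (j / segments) * 2 * Math.PI;   // longitude, 0..2PI
      // Radial deformation: ripples around the equator, fading at the poles.
      const r = radius * (1 + lobeDepth * Math.sin(lobes * theta) * Math.sin(phi));
      points.push({
        x: r * Math.sin(phi) * Math.cos(theta),
        y: r * Math.cos(phi),
        z: r * Math.sin(phi) * Math.sin(theta),
      });
    }
  }
  return points;
}
```

Because everything is driven by a handful of numbers, randomizing or animating those parameters over time yields a new "creature" on every run, which is exactly what makes a generative encyclopedia possible.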
- Dynamic Mesh Triangulation + UI [Cinder, iPad] The first wave of Cinder-built applications is starting to appear, and boy are they wonderful. Posted on the Cinder forums are dynamic mesh experiments by Marcin Ignac, a Copenhagen-based designer and programmer. Part of the Shiftcontrol collective, Marcin has been playing with Cinder mesh deforming as well as coding his own GUI, based on the ControlP5 library for Processing. For triangulation he used poly2tri, a 2D constrained Delaunay triangulation library. He writes: "The app is gonna be interactive and right now I can seamlessly switch the image source between picture, video and camera capture. I love Cinder's file drag'n'drop support!" Additionally, there is some basic diffuse shading, and all the textures can be displaced to simulate environment mapping. In addition to the above, Marcin has made available wonderful code for mapping projections onto cubes for Processing. Check it out here. I have also embedded below a few movies from Shiftcontrol's Vimeo account, which I can only tell have been made using 'OpenGL' but could have been made using either the Cinder or openFrameworks libraries. If you are unfamiliar with Cinder, it is a recently made public, free and open source library for professional-quality creative coding in C++. You can get more info on Cinder by visiting libcinder.org. To see more projects made with Cinder on CAN, click here. Here is a screen capture of the HD trailer of Tron (720p), apparently playing smoothly at 30fps […]
- CityScape – Characteristic of Copenhagen in a landscape mosaic / Plask CityScape is a collection of three collages that depict the most characteristic parts of Copenhagen, which almost appear as separate cities within the city. Created by Daim Yoon, Mette Lyckegaard and Marcin Ignac, the trio used pictures of the city as a graphical landscape mosaic that represents the feel these places have for them. The urban landscape of Copenhagen is flat and homogeneous, so this more tangible side of it is shown in the chosen geometry, which is the same throughout. Republikken is also a city within the city, and the people working there even call themselves inhabitants of Republikken. With the opening of the new wing, Republikken has been brought out of the backyard and into the city, and these collages take a small representation of the cities within the city all the way into Republikken. The brief for the project was pretty open; the only "instructions" the designers got were that "it would be great with some WOW" and "it needs to be done in two weeks". So, in order to spark creativity, the designers put up some constraints of their own: keep the budget to a minimum, link the city and Republikken, provide different layers to explore, and use ambient light. The budget constraint made paper an attractive choice of material, and expressing the concept through print and geometry was an interesting challenge. The programming environment Plask was used to dissect photos into pyramids, which were then mounted on a grid frame. They used it for simulation, warping the photos, laying out the pages for print and making the laser cutter pattern. The pyramids are stitched together using thread. CityScape can be seen in Republikken's new premises. Daim Yoon | Mette Lyckegaard | Marcin […]
Posted on: 25/09/2011