Created by David Dalmazzo, Dazzled Project is an attempt to compose a generative particle environment that simultaneously creates structures and sounds. The application uses both MaxMSP and Cinder, connected via an OSC bridge that allows sound data generated in Max to be fed directly into the Cinder app, which generates the visuals.
I would like to program patterns and physics simulations with the aim of composing musical structures that have a direct representation in a formal shape. One of the influences for this project was work by Robert Hodgin such as Solar Rework, but in this case the idea is not to have sound-reactive visuals, but visuals that create generative sound and music compositions.
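As a rough illustration of the Max-to-Cinder bridge described above, the sketch below listens for OSC messages on a UDP port and maps incoming values onto particle parameters. It follows the Listener API of the OSC CinderBlock as it appeared in Cinder samples of that era; the /dazzled/* addresses, port number and parameter names are invented for this example and are not taken from the project.

```cpp
// A minimal sketch of a Max -> Cinder OSC bridge: listen on a UDP port and
// map incoming values onto particle parameters. Uses the OSC CinderBlock's
// Listener as found in Cinder samples of that era; the /dazzled/* addresses
// and parameter names are hypothetical.
#include "cinder/app/AppBasic.h"
#include "OscListener.h"

using namespace ci;
using namespace ci::app;

class DazzledSketchApp : public AppBasic {
  public:
    void setup();
    void update();
  private:
    osc::Listener mListener;
    float         mParticleEnergy;   // driven by /dazzled/energy
    float         mEmissionRate;     // driven by /dazzled/rate
};

void DazzledSketchApp::setup()
{
    mParticleEnergy = 0.0f;
    mEmissionRate   = 10.0f;
    mListener.setup( 3000 );         // assume the Max patch sends OSC to UDP port 3000
}

void DazzledSketchApp::update()
{
    // Drain every OSC message that arrived since the last frame.
    while( mListener.hasWaitingMessages() ) {
        osc::Message msg;
        mListener.getNextMessage( &msg );

        if( msg.getAddress() == "/dazzled/energy" && msg.getNumArgs() > 0 )
            mParticleEnergy = msg.getArgAsFloat( 0 );
        else if( msg.getAddress() == "/dazzled/rate" && msg.getNumArgs() > 0 )
            mEmissionRate = msg.getArgAsFloat( 0 );
    }
    // ...advance the particle system here using mParticleEnergy / mEmissionRate...
}

CINDER_APP_BASIC( DazzledSketchApp, RendererGl )
```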
David writes that the videos below are just the first part of the project. He is also planning to add rhythmic patterns based on constant rebounds or elastic connections between particles. Dazzled Project was supported by the Generalitat de Catalunya.
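That planned rhythmic layer, where rebounds or elastic connections between particles produce patterns, could be prototyped roughly as follows: two particles joined by a spring are stepped forward, and each time the spring crosses its rest length a "note" event is emitted. This is a plain C++ toy with invented constants, not David's simulation; a real version would presumably send each event back to Max over OSC rather than printing it.

```cpp
// Toy sketch: two particles on a spring; every rebound through the rest
// length emits a rhythmic "note" event. All constants are hypothetical.
#include <cmath>
#include <cstdio>

struct Particle { float pos, vel; };

int main()
{
    Particle a{ -1.0f, 0.0f }, b{ 1.5f, 0.0f };
    const float k = 8.0f, restLen = 1.0f, mass = 1.0f, dt = 1.0f / 60.0f;

    bool wasStretched = ( b.pos - a.pos ) > restLen;

    for( int frame = 0; frame < 600; ++frame ) {          // ten seconds at 60 fps
        float stretch = ( b.pos - a.pos ) - restLen;       // signed spring extension
        float force   = -k * stretch;                      // Hooke's law

        a.vel -= force / mass * dt;                        // equal and opposite forces
        b.vel += force / mass * dt;
        a.pos += a.vel * dt;
        b.pos += b.vel * dt;

        bool isStretched = ( b.pos - a.pos ) > restLen;
        if( isStretched != wasStretched ) {                // spring crossed rest length: a rebound
            std::printf( "note at t=%.2fs, velocity %.2f\n",
                         frame * dt, std::fabs( b.vel - a.vel ) );
            wasStretched = isStretched;
        }
    }
}
```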
David Dalmazzo is a musician and digital visual artist focused on interactive audiovisual composition. He concentrates on live performance and is dedicated to investigating software tools that bring narrative and compositional elements to the performing arts.
- Jacob’s Cave [Cinder] Latest from David Wicks (sansumbrella), and selected for the WrittenImages book, is this beautiful geometrical study created using Cinder. David writes: Jacob's Cave developed in response to seeing a range of helictite formations in a show cave in Missouri. Helictites grow downward, but also spiral up when the influence of forces like capillary action becomes stronger than gravity. I created a physics simulation which grows forms in a similar manner. The drawing is generally under the control of gravity, but other forces cause it to grow upward and into spiraling shapes. This was one of my favorite WrittenImages submissions. Created using Cinder + Box2D. Previously: Distance and Common Desires [#Cinder] Box Clock […]
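The growth rule David Wicks describes, mostly downward under gravity but spiralling upward when another force wins out, can be sketched without Box2D as a simple particle trace. The constants and the drifting "capillary" force below are invented for illustration and are not taken from his simulation.

```cpp
// Toy growth trace inspired by the helictite idea: a point normally falls
// under gravity, but when a drifting "capillary" force dominates it curls
// upward. All constants are illustrative, not from the original simulation.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

int main()
{
    std::vector<Vec2> path{ { 0.0f, 0.0f } };
    Vec2  vel{ 0.0f, 0.0f };
    const float gravity = 0.5f, dt = 0.1f;

    float angle = 0.0f;
    for( int step = 0; step < 400; ++step ) {
        // Capillary strength slowly waxes and wanes; sometimes it beats gravity.
        float capillary = 0.8f * ( 0.5f + 0.5f * std::sin( step * 0.02f ) );
        angle += 0.3f;                                   // rotating force -> spiral shapes

        vel.x += capillary * std::cos( angle ) * dt;
        vel.y += ( capillary * std::sin( angle ) + capillary - gravity ) * dt;

        // Heavy damping keeps the trace slow and tendril-like.
        vel.x *= 0.9f;
        vel.y *= 0.9f;

        path.push_back( { path.back().x + vel.x * dt, path.back().y + vel.y * dt } );
    }
    std::printf( "traced %zu points, end height %.2f\n", path.size(), path.back().y );
}
```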
- The Company [Cinder] "The Company" is the latest project by Andrea Cuius and Roland Ellis, commissioned by Bring To Light Festival NYC. A suspended surface of 76 tungsten lamps forms a catenary arch, playing host to live performances and revisiting the sounds of the 19th-century East River industrial icons. The piece intends to bring back an atmosphere informed by the architectural legacy, a machine delivered to occupy a space that was once a bustling industrial environment. Whether producing sounds of its own or simply reacting to input from the environment, The Company is a sound-reactive light installation. The software was developed in Cinder and works with audio sampled in real time. The sound analysis is computed in Ableton Live using a Max For Live patch developed by Henrik Ekeus; it performs the Fast Fourier Transform, beat detection, attack detection and sound filtering, communicating with the custom software through an OSC connection. Andrea Cuius is a creative coder born in Italy and now living in London; he works with different technologies to create connections between objects, audiences and environments. He engages the audience with data sampled from the environment to create immersive and provocative experiences. Andrea has been working with some of the most prestigious companies and collectives in the UK, such as rAndom International, United Visual Artists and Cinimod Studio, contributing to the development of large-scale art installations and architectural projects. Project Page See also: So.. I was at a party last night [Cinder] and Hyundai i40 Reveal [openFrameworks, […]
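The Max for Live analysis described above hands the installation FFT, beat and attack data over OSC; on the receiving side, the essential per-frame step is mapping band energies onto the 76 lamp intensities. The stand-alone sketch below illustrates one way to do that with smoothing so the tungsten filaments fade rather than flicker; the band layout, smoothing factor and gamma are assumptions, not the project's actual code.

```cpp
// Illustrative mapping of analysis bands to 76 lamp intensities with
// smoothing, roughly the per-frame step a sound-reactive light piece needs.
// Band count, smoothing factor and gamma are invented for the example.
#include <array>
#include <cmath>
#include <cstdio>

constexpr int kNumLamps = 76;
constexpr int kNumBands = 19;               // hypothetical: 4 lamps share one band

std::array<float, kNumLamps> gLampLevels{}; // 0..1, persists between frames

void updateLamps( const std::array<float, kNumBands> &bandEnergy )
{
    for( int lamp = 0; lamp < kNumLamps; ++lamp ) {
        float target = bandEnergy[lamp * kNumBands / kNumLamps];
        target = std::pow( std::fmin( target, 1.0f ), 0.6f );   // gamma lift for dim filaments
        // Exponential smoothing so the lamps fade rather than flicker.
        gLampLevels[lamp] += 0.2f * ( target - gLampLevels[lamp] );
    }
}

int main()
{
    std::array<float, kNumBands> bands{};
    bands[3] = 0.9f;                        // pretend a kick landed in band 3
    for( int frame = 0; frame < 5; ++frame )
        updateLamps( bands );
    std::printf( "lamp 12 level after 5 frames: %.2f\n", gLampLevels[12] );
}
```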
- Hairrrr [Mac, Cinder] Created by Doug Pfeffer, Hairrrr is a Mac application that converts images into fields of hair. A while back Doug posted some sample generated images that looked a bit like body hair on the Cinder forums. Although he still hasn't found a project to integrate it with, he touched it up a little and wrapped it up in an OS X application. You can load your own image (via a button or drag and drop) and save the results out. Options include hair length, density and the ability to invert. Download here (3.3mb). See also activecellmedia.com Doug Pfeffer is a developer at the Barbarian […]
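A plausible guess at how an image-to-hair effect like this works: sample the source image on a grid and, at each sample, emit a strand whose length follows the local brightness, with an option to invert the mapping. The sketch below illustrates that idea; the Strand structure and constants are invented here, and this is not Doug's code.

```cpp
// Illustrative image-to-hair pass: sample brightness on a grid and emit a
// strand per sample whose length tracks (or inverts) the local brightness.
// The Strand struct and constants are hypothetical, not the Hairrrr source.
#include <cstdint>
#include <cstdio>
#include <vector>

struct Strand { float x, y, length, curl; };

std::vector<Strand> makeStrands( const std::vector<uint8_t> &gray, int w, int h,
                                 int spacing, float maxLen, bool invert )
{
    std::vector<Strand> strands;
    for( int y = 0; y < h; y += spacing ) {
        for( int x = 0; x < w; x += spacing ) {
            float b = gray[y * w + x] / 255.0f;           // 0 = black, 1 = white
            if( invert ) b = 1.0f - b;
            if( b < 0.05f ) continue;                      // skip nearly empty areas
            strands.push_back( { (float)x, (float)y,
                                 b * maxLen,               // brighter -> longer hair
                                 ( x * 31 + y * 17 ) % 100 / 100.0f } ); // pseudo-random curl
        }
    }
    return strands;
}

int main()
{
    int w = 64, h = 64;
    std::vector<uint8_t> gray( w * h, 128 );               // flat mid-gray test image
    std::vector<Strand> strands = makeStrands( gray, w, h, 4, 30.0f, false );
    std::printf( "generated %zu strands\n", strands.size() );
}
```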
- illucia [Processing, MaxMSP] illucia is an OSC-based codebending instrument by Chris Novello aka paperkettle. It is a USB device with physical jacks that correspond to software patch points, which can be connected and disconnected using patch cables. It is also a console for routing information between computer programs, and strives to create relationships across systems that don't usually interact. Chris has already designed a number of applications that interact with the console, some using Processing, others using MaxMSP. Whilst the applications themselves are quite simple, they are nevertheless a means to raise questions about how controlling a particular application via a specific interface can change the experience of it. For now, there are four applications Chris intends to release as downloads. Even though they still require the illucia console to experience fully, they are OSC-based so they can be controlled via any OSC interface, including a number of iPhone/iPad/Android applications already available. The four existing applications for illucia are (see video): · PCO (Paddle Controlled Oscillator): a classic ball and paddle game; when pushed, it morphs into a function generator and spills abstract art · Soviet Life Sequencer: falling tetromino pieces generate step sequencer patterns, all remixable by Conway's Game of Life · War Machine: a crosshair blasts colorful explosions into a dense nest of shoots that approach from above · Pile of Secrets: a codebendable text editor. More videos and deeper documentation are on the way. In the meantime you can follow on Twitter or FB for more information. Project Page (Thanks […]
- Kinect – One Week Later [Processing, oF, Cinder, MaxMSP] Last week we wrote about the wonderful work that happened over the weekend after the release of open source drivers for the Xbox Kinect. Today we look at what has happened since then and how the Microsoft gadget is being utilised in the creative code community. In case you missed our post from last week, you can see it here: Kinect – OpenSource [News] Chris from ProjectAllusion.com got to play with the Kinect and, one late night, made this little demo in Processing using the hacked Kinect drivers. The Processing app sends out OSC with depth information based on the level of detail and the defined plane. The iPad app uses TouchOSC to send different values to the Processing app. - Daniel Reetz and Matti Kariluoma have been playing with hacking a Powershot A540 camera for infrared sensitivity, enabling you to see the Kinect's projected infrared dots in space. Microsoft's new Kinect sensor is garnering a lot of attention from the hacking community, but the technical specifics of how it works still aren't clear. I am working to understand the technology at a fundamental level – my interest is in the optical side of Kinect. My ultimate goal is to make the sensor nearsighted, so that the depth resolution can be used to scan small objects. The first step in understanding a technology is to look at it — that's why teardowns like this one at iFixit are so important. - Ben at KODE80, the creator of Holo Toy, also created this quite wonderful demo of the Kinect being used to track your position in space and render the image on screen based on that position, creating the illusion of a 3D image. Several months ago I threw together an OSX HoloToy demo that used OpenCV and the iSight camera to replicate the facial recognition head tracking used in the iPhone 4/iPod touch version. This seemed like a perfect place to insert the Kinect! The above video shows various scenes with the perspective controlled via the Kinect. At this point it is simply tracking a specified depth range, however with motion tracking of the depth map and other techniques, this could be really special. - Philipp Robbel has some early experiments with a Microsoft Kinect depth camera on a mobile robot base. Say hello to KinectBot. The robot uses the camera for 3D mapping and follows gestural directions. It's basically a pimped iRobot Create with a battery-powered Kinect which streams the depth and color images to a remote host for SLAM and 3D map processing. - Peter Kirn covered the work Ben X Tan was doing with the Kinect system to perform MIDI control. Result: depth-sensing, gestural musical manipulations! From the description: Coded in C#.net using this: http://codelaboratories.com/nui Very hacky, ugly, yucky alpha prototype; source code available here: http://benxtan.com/temp/pmidickinect.zip Next project is making a version of pmidic that uses Kinect. Then you can control Ableton Live or any other MIDI software or hardware with your limbs. Isn't that amazing!!! If you are interested, you should also check out: http://pmidic.sourceforge.net/ http://benxtan.com - Yesterday, Stephan Maximilian Huber posted this video of a Joy Division-esque realtime 3D scan using the Kinect, where points are connected only horizontally. Very effective and quite beautiful. - Simultaneously, Dominick D'Aniello has been working on Kinect Object Manipulation, creating a system using openFrameworks that allows you to rotate and manipulate 3D objects using the Kinect.
A threshold is used on the depth map to filter out everything but my hands, and then blob detection is used to locate their centers. This information is then used to scale and rotate an onscreen object. Note that because the Kinect provides depth information, the object can be rotated on both its Z and Y axes. With a bit of work, a gesture could theoretically also be made to rotate it along the X axis. - A few days ago we posted a quick installation prototype by Theo Watson and Emily Gobeille (design-io.com) using the libfreenect Kinect drivers and ofxKinect (an openFrameworks addon). The system does skeleton tracking on the arm, determining where the shoulder, elbow and wrist are, and uses that to control the movement and posture of the giant bird! - Another piece of great news is that Kinect now also works with MaxMSP, thanks to work by Jean-Marc Pelletier. It's still very alpha. I still have to implement "unique" mode, multiple camera support, proper opening/closing, and I can't seem to be able to release the camera properly, but the video streams work as they should. Read more on the forums. - Also, Kinect now runs in VVVV. Late-evening live coding at node10 by Julien Vulliet (thanks @defetto). - Last week Rui Madeira also ported the drivers to the Cinder framework, and this morning Robert Hodgin aka Flight404 posted these videos to his Vimeo account. Made with Cinder and the Kinect sensor. Runs in realtime. Another great week of Kinect projects. The work is finally beginning to take shape beyond tech demos, which is wonderful to see. I highly doubt we will be posting any more updates of this nature, as further work will develop into individual projects that will require their own posts. Big up once again to the communities including openFrameworks, Processing, Cinder, MaxMSP and many […]
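The hand-tracking approach described in that last item reduces to a few concrete steps: threshold the depth map so only pixels in a near range survive, locate the blob centers, and derive scale from the distance between the two centers and rotation from the angle of the line joining them. The stand-alone sketch below walks through a simplified version, using crude per-half-image centroids in place of real blob detection and a synthetic depth frame; none of it is the original code.

```cpp
// Simplified version of the described pipeline: threshold a depth map, take
// the centroid of near pixels in each image half as a stand-in for blob
// detection, then turn the two centers into a scale and a rotation angle.
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

struct Pt { float x = 0, y = 0; int count = 0; };

int main()
{
    const int w = 640, h = 480;
    const uint16_t nearMm = 500, farMm = 900;          // keep only hands-distance pixels
    std::vector<uint16_t> depth( w * h, 2000 );         // fake frame: background at 2 m

    // Plant two fake "hands" in the depth image for the demo.
    for( int y = 200; y < 240; ++y )
        for( int x = 0; x < w; ++x )
            if( ( x > 150 && x < 190 ) || ( x > 450 && x < 490 ) )
                depth[y * w + x] = 700;

    Pt left, right;
    for( int y = 0; y < h; ++y )
        for( int x = 0; x < w; ++x ) {
            uint16_t d = depth[y * w + x];
            if( d < nearMm || d > farMm ) continue;      // threshold: drop everything but hands
            Pt &blob = ( x < w / 2 ) ? left : right;     // crude "blob detection" by image half
            blob.x += x;  blob.y += y;  blob.count++;
        }

    if( left.count && right.count ) {
        left.x /= left.count;   left.y /= left.count;
        right.x /= right.count; right.y /= right.count;
        float dx = right.x - left.x, dy = right.y - left.y;
        float scale  = std::sqrt( dx * dx + dy * dy ) / 200.0f;  // hand distance -> scale
        float angleZ = std::atan2( dy, dx );                     // hand line angle -> Z rotation
        std::printf( "scale %.2f, rotation %.1f deg\n", scale, angleZ * 180.0f / 3.14159f );
    }
}
```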
- Cymatic Ripple [C++, Cinder] In his attempt to convince multitouch companies to give him their hardware in exchange for better demos, or maybe just doing what he does so well (making the rest of us drool over his work), Robert Hodgin aka Flight404 has been making these incredible fluid/tactile visuals, destined for multitouch. Here is a little info: I am using a velocity map combined with a diffuse/normal/bump map set from FilterForge. The distortion is happening in the vert shader, but the normals are being calculated first outside the shader because I haven't figured out how best to push that math to the GPU. So much to learn! Made with Cinder, a C++ framework being developed by Andrew Bell and the team at the Barbarian Group. Real […]
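Robert mentions computing the normals on the CPU before the vertex shader applies the displacement. For a heightfield-style displacement map, a common way to do that is a central-difference normal: sample the height on either side of each vertex and cross the two tangents. The sketch below shows that calculation in a generic form; it is not his code, and the shader half is omitted.

```cpp
// Central-difference normals for a heightfield, the kind of CPU-side pass
// that can precede a displacing vertex shader. Generic illustration only.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 normalize( Vec3 v )
{
    float len = std::sqrt( v.x * v.x + v.y * v.y + v.z * v.z );
    return { v.x / len, v.y / len, v.z / len };
}

std::vector<Vec3> heightfieldNormals( const std::vector<float> &height, int w, int h )
{
    std::vector<Vec3> normals( w * h );
    auto H = [&]( int x, int y ) {
        x = std::max( 0, std::min( w - 1, x ) );       // clamp at the borders
        y = std::max( 0, std::min( h - 1, y ) );
        return height[y * w + x];
    };
    for( int y = 0; y < h; ++y )
        for( int x = 0; x < w; ++x ) {
            float dx = H( x + 1, y ) - H( x - 1, y );  // slope along x
            float dy = H( x, y + 1 ) - H( x, y - 1 );  // slope along y
            // Cross product of the central-difference tangents (2,0,dx) x (0,2,dy),
            // scaled down to (-dx, -dy, 2).
            normals[y * w + x] = normalize( { -dx, -dy, 2.0f } );
        }
    return normals;
}

int main()
{
    std::vector<float> height( 8 * 8, 0.0f );
    height[3 * 8 + 3] = 1.0f;                          // a single bump
    std::vector<Vec3> n = heightfieldNormals( height, 8, 8 );
    std::printf( "normal next to the bump: (%.2f, %.2f, %.2f)\n",
                 n[3 * 8 + 2].x, n[3 * 8 + 2].y, n[3 * 8 + 2].z );
}
```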
- synthPond [iPhone, MaxMSP] Originally developed for the Mac, synthPond is a spatial sequencer and generative audio toy for the iPhone inspired by the work of Toshio Iwai. Unlike a normal sequencer, where you place notes on a grid and a moving playhead plays them, in synthPond you place nodes in a field, i.e. the pond. Because the system is spatial, it's easily graspable and very intuitive, but also very deep. While it's easy for someone with no musical knowledge to create a complex melody, synthPond is also suited to advanced musicians interested in generative musical composition. Created by Zach Gage, a digital mixed media and installation artist currently residing in New York City, synthPond is gorgeous. Whether you are on a bus, waiting for the train or just at home, plugging in your headphones and having a play is pure joy. Your creations can be saved and later edited. There are two major types of nodes: circular nodes release waves at certain intervals, while hard-edged nodes release waves when waves hit them. Moving these nodes about allows you to create complex and relaxing melodies. Additionally, because all the nodes are spatially organized, the audio generated can also be placed in 3D space, occurring around the listener and coming from the relative positions of each node. From the wonderful menus to the actual ripple animations of sound hitting the nodes, synthPond provides a truly enjoyable environment for creating melodies, perfect for a multitouch platform such as the iPhone. You can see more of Zach's wonderful work at his website here. The latest version, 2.5, brings OSC support, allowing you to connect the app to a number of different applications that support OSC such as MaxMSP, Processing, Reaktor and many more. To get you going, you can download the example MAX/MSP patch here. In addition, a 'lite' version of the app is available if you would like to have a play (OSC support not included). You can download it here. We've added a few movies below showing a demo of synthPond's capabilities as well as the most recent OSC integration with MAX/MSP. Make sure you also check out a number of composition examples from synthPond's community. Enjoy. Platform: iPhone Version: 2.5 Cost: $1.99 Developer: Zach […]
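The node-and-wave mechanic described above is easy to model: emitter nodes spawn an expanding ring at a fixed interval, and when a ring's front reaches a hard-edged node, that node fires. The stripped-down sketch below illustrates the loop (secondary waves from triggered nodes are omitted); all structures and numbers are invented for illustration, not taken from synthPond.

```cpp
// Stripped-down model of a spatial sequencer: emitter nodes spawn expanding
// waves on an interval; when a wave front reaches a trigger node, it "plays".
// All structures and numbers are hypothetical.
#include <cmath>
#include <cstdio>
#include <vector>

struct Node    { float x, y, interval, timer; };   // circular emitter node
struct Trigger { float x, y; int pitch; };          // hard-edged node
struct Wave    { float x, y, radius; };

int main()
{
    const float dt = 1.0f / 30.0f, waveSpeed = 2.0f;
    std::vector<Node>    emitters{ { 0, 0, 1.0f, 0.0f } };        // emits every second
    std::vector<Trigger> triggers{ { 3, 4, 60 }, { -6, 0, 67 } };
    std::vector<Wave>    waves;

    for( int frame = 0; frame < 30 * 8; ++frame ) {               // run for 8 seconds
        for( auto &e : emitters ) {                               // emitters spawn waves
            e.timer += dt;
            if( e.timer >= e.interval ) {
                e.timer -= e.interval;
                waves.push_back( { e.x, e.y, 0.0f } );
            }
        }
        for( auto &w : waves ) {
            float prev = w.radius;
            w.radius += waveSpeed * dt;                           // ring expands
            for( auto &t : triggers ) {
                float d = std::hypot( t.x - w.x, t.y - w.y );
                if( prev < d && d <= w.radius )                   // front just crossed the node
                    std::printf( "t=%.2fs  note %d\n", frame * dt, t.pitch );
            }
        }
    }
}
```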
- Heart Chamber Orchestra [MaxMSP] The Heart Chamber Orchestra (HCO) is an audiovisual performance. The orchestra consists of 12 classical musicians and the artist duo TERMINALBEACH. Using their heartbeats, the musicians control a computer composition and visualization environment. The musical score is generated in real time by the heartbeats of the musicians, who read and play this score from computer screens placed in front of them. Custom-made software analyzes the data and, via different algorithms, generates the real-time musical score for the musicians, the electronic sounds and the computer graphic visualization shown on two projectors in the space. TERMINALBEACH is a collaboration between PURE (Vienna/Berlin) and BERGER (Helsinki) that began in 2002. TERMINALBEACH is involved in the building of audiovisual entities; linked performance environments intertwine abstract sonic narrations with a strong visual language of morphing surfaces to create a synaesthetic experience. Read more about the project by visiting heartchamberorchestra.org (via […]
Posted on: 21/02/2011