Approxymotion and MisPortraits are two projects by Peter A Vikar, currently an ESTm post-graduate student at SCI-Arc, Los Angeles, designing material interactions with six-axis robot arms. Approxymotion is a research project focusing on motion-based forming, while MisPortraits explores levels of the accidental in felt-pen graphics drawn by a six-axis robot.
Approxymotion is an attempt to apply the logic of digital design to physical space, combining geometry, material properties and robotic arms. The nested relation (corner cutting) from rough to smoothed layers displays a gradient condition, from the accuracy of robotic motion control to the averaging behaviour of the elastic net.
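The post does not spell out the exact subdivision scheme, but "corner cutting" is most commonly read as a Chaikin-style pass. The sketch below is my own illustration, not the project's code: it generates the nested layers from rough to smooth, mirroring the gradient from the exact robot-driven polyline to the averaged, net-like curve.

```python
# A minimal corner-cutting sketch, assuming a Chaikin-style scheme.
# Each pass replaces every segment with points at 1/4 and 3/4 of its
# length, so successive layers grade from rough to smooth.

def corner_cut_layers(points, iterations=3):
    """Return all layers: layer 0 is the input polyline, each further
    layer is one corner-cutting pass over the previous one."""
    layers = [points]
    for _ in range(iterations):
        prev, smoothed = layers[-1], []
        for (x0, y0), (x1, y1) in zip(prev, prev[1:]):
            smoothed.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
            smoothed.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
        layers.append(smoothed)
    return layers

rough = [(0, 0), (1, 2), (3, -1), (4, 1)]   # a rough motion path
for depth, layer in enumerate(corner_cut_layers(rough)):
    print(depth, len(layer), "points")
```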
Traditionally in architecture, forms are transferred from paper or virtual space to the building through fixed-shape moulds or as an assembly of many elements. My goal was to set the “mould” into motion, while maintaining the parametric nature inherited from the digital model. The result is a motion-form that computes between the initial motion input, the built geometry and its material properties.
MisPortraits manipulates the grey-value gradient of an image by translating it into rotation values (6th joint) for a flat-nib pen. The image ‘resolution’ is 33 by 33 pixels. As the robot draws each line, the sixth joint rotates, modulating the stroke width.
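A minimal sketch of that mapping, with assumptions flagged: the 0–90 degree rotation range, the dark-to-wide convention and the line-by-line raster path are mine; only the 33 × 33 grid and the grey-to-joint-6 translation come from the project description.

```python
# Hypothetical MisPortraits-style mapping: each pixel's grey value becomes
# a rotation target for the robot's 6th joint. A flat-nib pen drawn
# sideways leaves a wide stroke, drawn edge-on a thin one, so dark pixels
# map to the wide orientation and light pixels to the narrow one.

def joint6_targets(image, max_angle=90.0):
    """image: 33x33 nested list of grey values in 0..255.
    Returns one joint-6 angle per pixel along a raster path."""
    targets = []
    for row in image:
        for grey in row:
            darkness = 1.0 - grey / 255.0        # 1.0 = black, 0.0 = white
            targets.append(max_angle * darkness) # dark -> wide stroke
    return targets

flat = [[127] * 33 for _ in range(33)]           # mid-grey test image
print(joint6_targets(flat)[0])                   # ~45 degrees everywhere
```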
Peter A Vikar is an architect and visual creative. His design work focuses on spatial typologies and intricate geometries combined with inventive control and manufacturing methods. Peter received his Master's degree in architecture from the Masterclass of Greg Lynn at the University of Applied Arts in Vienna in 2008. He is currently engaged in ESTm post-graduate studies at SCI-Arc, Los Angeles, designing material interactions with six-axis robot arms.
/found via designplaygrounds
- Synchronous Dissections – Light drawing workshop at SCI-Arc with robotic arms The video below is the result of a workshop held last week at SCI-Arc teaching students about synchronous robotics. We already covered SCI-Arc robotics here, but the results this time are bullet-time light drawings which, instead of being constrained to a bullet-time camera rig, have the freedom to be viewed from any perspective. LEDs are attached to the corners and centres of the panels, while the DSLR camera is attached to one of the robotic arms. As the panels move in sync, the camera photographs the LED paths. No video editing was done to the light drawings; they are simply a series of still images. Below is also the original proof of concept for the workshop. Robot Control Software: esperantorobotics.com Research Institution: sciarc.edu, estm.us Instructors: Brandon Kruysman & Jonathan Proto thecognomen.net Students: Team A: Al Ataide, Mehrzad Rafeei, Somayyeh Ramezani, Peter Vikar Team B: Amir Habibabadi, Francisco Moure, Juan Osorio Team C: Peter Kaoud, Eugene Kosgoron, Mira Lee (Thanks […]
- Hot Networks – Complexities and opportunities in collaborative robotics Created by Brandon Kruysman and Jonathan Proto, Hot Networks explores the complexities and opportunities in collaborative robotics. Using custom-built software called esperant.O developed by Kruysman-Proto, Hot Networks is a collaboration of five industrial robot arms, with different tools and tasks, operating as one large network. Ideas of representation are embedded into the sequence, as well as material behaviour and synchronous motion and tooling. The exercise shown here used heat to transform plastic components in the form of piles and stacks, and also used paint as an additional process within the robotic sequence to achieve various levels of transparency. The robotic cell consisted of 5 Staubli robots with overlapping workspheres. Each robot performed a separate task: one filming the fabrication sequence, another picking and placing components, one airbrushing, one holding the worksurface, and one acting as the heater. The programming was designed so that the robots' tasks are offset (meaning that when one robot moves to get another piece of material, two of the other robots work together to paint the pieces that are placed); a minimal sketch of such an offset schedule appears after this list. The sequence is also designed so that the work surface is dynamic: it can move into neighbouring robots' workspheres for collaboration, while positioning the plastic pieces in space extremely accurately. Many calibration tests were performed with the heating sequence, associating timing with formal implications. This allowed the objects to have material and formal characteristics that ranged between control and wildness. The objects could move between piles or stacks, depending upon the amount of heat applied and the relationships defined between robots. Variation in form was a result of the timing and coordination of robots. Created using a custom plug-in for Autodesk Maya - Esperant-O (written in Python) - to translate animated character rigs into motion paths for Staubli 6-axis robot arms (VAL3 programming language). There was also an additional Python component to manage robot-to-robot communication - CHARLA. Lead Programmer and Developer: Brandon Kruysman Developer: Jonathan Proto thecognomen.net Research conducted at The Southern California Institute of […]
- Kinetic Pavilion [iPad, Processing, Scripts] This is a preliminary test model of a kinetic pavilion by Yannick Bontinckx and Elise Vanden Elsacker. The final model will contain 28 servo motors controlled by an Arduino Mega and a Duemilanove board, all connected to Rhino & Grasshopper via Firefly. The demo movie below includes a TouchOSC & gHowl setup which allows the structure to be controlled via an iPad; a sketch of the OSC message shape appears after this list. Multiple kinds of ‘sketches’ can be uploaded to the pavilion - Weather Data: mean values of solar irradiation are processed and shape the pavilion depending on climate and daily, monthly and hourly data. (Software used: Autodesk Ecotect and GECO for Grasshopper.) - Human Interaction: control the pavilion using the iPad's accelerometer and a TouchOSC XY-pad interface. (Software used: TouchOSC & gHowl.) - Motion Tracking: human and other movement through live webcam feeds. More information and updates, including the final model, will be posted in the coming weeks on Yannick's site. Also check out flickr for Processing app […]
- Lightplot – Robotic 3D light painting system by Ben Cowell-Thomas Created by Ben Cowell-Thomas, Lightplot is a robotic 3D light painting system. Animation is exported from 3ds Max and imported into the Lightplot software, which then drives a robotic arm to draw the models in the air. The software also controls a DSLR camera to take long-exposure photographs of each frame of animation. The project grew from early experiments with Lego NXT and robotics. The latest edition comprises a custom-built robotic arm controlled via Phidgets boards, driven by a stand-alone Windows application written in C# and Microsoft .NET. The exporting software is written in MAXScript within Autodesk 3ds Max. Early experiments started with a laser pointer and a simple Lego rig to move it about in a pan-and-tilt style arrangement. This was followed by Python scripting, a prototype built in Maya, and the HPGL image format used to control old HP plotters. Ben found the format perfect, as it is organised into a logical order for plotting and has pen-up and pen-down commands, but best of all it is stored as text. The open-source image editing software Inkscape allowed him to trace images and output them in HPGL format. His Python script in Maya animated the virtual rig to plot the images. The next step was to do the same physically. His web searching led him to the AForge library, a great collection of code that could control Lego NXT via Bluetooth, and he soon had a working Lego robot. The AForge library communicates with the NXT via Bluetooth direct commands; however, the direct commands didn't allow exact positioning of the motors, and he began looking for something other than Lego NXT. He finally settled on Phidgets, a set of "plug and play" building blocks, similar to Arduino, for low-cost USB sensing and motor control. He ordered a Phidgets servo controller and a Lynxmotion pan-and-tilt kit, along with two Hitec HS-422 servos. The Phidgets board was easy to control via C#, and he quickly had a demo up and running. He created an interface and began testing the plotter. The Canon 20D camera is controlled through its remote trigger port with the help of a small Phidgets 2/2/2 interface kit. The device currently takes about a minute to plot 50 polygonal edges. Ben rewrote the Laserplot software to support the 3D sequences. The core process takes the sequences of objects, converts their coordinates to 3D polar coordinates to match the rig, plans the shortest route through the edges for a quick plot, and then controls the rig and camera while plotting; a sketch of the polar conversion and route planning appears after this list. He spent a fair bit of time refining the Max exporter; simple additions such as planning the camera position and using it to cull back-facing edges reduced plot times by half. The animations below were shot entirely in-camera, the figure having been animated in 3ds Max and then plotted by his 3D light painting system. You can read more about the process on Ben's blog. Project Page Dancer Animation: Sergei Shabarov | Music: Chris Clark – […]
- ‘Point Cloud’ – Arduino structure by James Leng breathes weather data Created by James Leng, Point Cloud is an attempt to re-imagine our daily interaction with weather data. Even with modern scientific and technological developments, and even though we can deploy sophisticated monitoring devices to document and observe weather, our analysis and understanding of meteorology is still largely approximate. Weather continues to surprise us and elude our best attempts to predict, control, and harness its various elements. Point Cloud builds on this premise, exploring new ways to interpret and understand weather data. Weather has always had a unique place in our lives because it has a multiplicity that encompasses both the concrete and the indeterminate. It is the intangible context within which we build our lives and our cities, but it is also the physical element against which we create protective shelter. Most of the time it is a pervasive network that we can see but are not aware of; yet it can manifest as a spectacle or a disaster, come forward and activate our senses, and make us forget our rationality in delight or fear. Point Cloud is a sculptural form defined by a thin wire mesh, driven asynchronously by 8 individual servos controlled via Arduino. As the whiteness of the hanging structure begins to disappear into the background, the viewer is treated to a constantly morphing swarm of black points dancing through midair. In the current prototype, the speed, smoothness, and direction of rotation are modulated to interpret a live feed of weather data; a sketch of such a mapping appears after this list. Instead of displaying static values of temperature, humidity, or precipitation, Point Cloud performs the data, dynamically shifting between stability and turbulence, expansion and contraction. flickr […]
- ‘Perception of Consequence’ by Volvoxlabs The Perception of Consequence project places two fluid, evolving forms in a reversible entropic system, simulated to resemble evolving human states and […]
- Björk – Biophilia [iPhone, iPad, Sound] Biophilia is an iPhone/iPad release of Björk's latest album created in collaboration with Scott Snibbe and her longtime design collaborators M/M (Paris). Comprising a suite of musical pieces and interactive artworks, Biophilia is released as ten in-app download experiences that are accessed through a three-dimensional galaxy set to the album's theme song, Cosmogony. The first single, Crystalline, is now available, with others soon to follow. Björk has collaborated with artists, designers, scientists, instrument makers, writers and software developers to create an extraordinary multimedia exploration of the universe and its physical forces, processes and structures - of which music is a part. Each in-app experience is inspired by and explores the relationships between musical structures and natural phenomena, from the atomic to the cosmic. You can use Biophilia to make and learn about music, to find out about natural phenomena, or simply to enjoy Björk's music. Biophilia opens into a three-dimensional galaxy with a compass allowing navigation between the 3-dimensional universe and a two-dimensional track list. By tapping on stars within the constellations you can access each in-app purchase, which includes a combination of album art, games, interactive music notation that you can pan through in real time, lyrics, and essays that explore Björk's inspirations for the track. Whilst the app does "borrow" some concepts we have already seen in the App Store and/or on CAN, it nevertheless offers a unique experience where different elements are woven together with both sensitivity and precision. The experience is unified, building on different layers of visuals and sound; Björk, with Scott and M/M, has set a new milestone, showing the real benefits that lie in collaboration. Considering there are still 8 tracks to go, and although I have no intention of covering each one independently, I fear I may have to, as from what I have seen in no. 1, there are many more wonderful things yet to come. Get it > Biophilia was created by Björk in collaboration with interactive artist and app developer Scott Snibbe, and Björk's longtime design collaborators M/M (Paris). Crystalline (one of the in-app purchase tracks) was created by Björk in collaboration with Luc Barthelet, developer of The Sims; TouchPress, creator of The Elements app; and M/M (Paris). Platform: iPhone/iPad (Universal) Version: 1.0 Cost: Free + $1.99 in-app purchase Developer: Second Wind Ltd […]
- Fragments of RGB [Processing] Created by onformative, a Berlin-based studio founded by Julia Laub and Cedric Kiefer, Fragments of RGB is a project that explores the nature of the digital image, its construction, and our interaction with it. By segmenting the RGB pixels of the image and associating them with viewers by proximity, the project aims to elevate individual relationships to the image and the perception of "point of view". We became interested in the observer's personal view and in »re-projecting« this. The installation reacted to and changed with the viewer's movement and, hence, his perspective and point of view. The illusion of an LED screen was destroyed and the RGB elements dissolved to form new, translated images and, thus, a transformed »reality« (a sketch of this channel-splitting idea appears after this list). Besides the installation, which illustrates the sensitive interaction between person and image, Fragments of RGB is also intended as a photographic series in which the transformations that occurred on the display were photographed. View fragments of RGB Flickr Set + Project […]
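For the Hot Networks item above, here is a minimal sketch of a phase-offset task schedule, in which, while one robot fetches material, the others paint or heat what is already placed. The task names and the three-robot reduction are illustrative; the real cell ran five Staubli arms under esperant.O and CHARLA.

```python
# Hypothetical phase-offset scheduler: every robot cycles through the same
# task loop, shifted by one step from its neighbour, so no two robots
# occupy the same stage at once. Not the esperant.O/CHARLA implementation.

TASKS = ["pick", "paint", "heat"]

def offset_schedule(steps, robots=3):
    """Return, for each time step, the task each robot is performing."""
    return [[TASKS[(t + offset) % len(TASKS)] for offset in range(robots)]
            for t in range(steps)]

for step, tasks in enumerate(offset_schedule(4)):
    print(step, tasks)   # e.g. 0 ['pick', 'paint', 'heat']
```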
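For the Kinetic Pavilion item, the iPad control runs through TouchOSC and gHowl into Grasshopper; the sketch below only mimics the shape of those OSC messages using the python-osc package, which was not part of the project. The address "/1/xy" is TouchOSC's default XY-pad address; the host and port are made up.

```python
# Hypothetical sender for TouchOSC-style XY-pad messages (python-osc
# package; the actual project routed TouchOSC through gHowl instead).

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.0.10", 9000)   # machine running the patch

def send_tilt(x, y):
    """Send a normalised XY-pad/accelerometer reading (0..1 on both axes)."""
    client.send_message("/1/xy", [x, y])

send_tilt(0.5, 0.5)   # pavilion at rest
send_tilt(0.9, 0.2)   # lean the structure
```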
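For the Lightplot item, two of the steps Ben describes are mechanical enough to sketch: converting edge endpoints into polar coordinates for a pan/tilt rig, and ordering edges so the plot path stays short. The function names and the greedy nearest-neighbour heuristic are my illustration, not Ben's code.

```python
# Hypothetical sketch of Lightplot's core process: cartesian-to-polar
# conversion for a pan/tilt rig, plus a greedy pass that keeps the plot
# path short by always drawing the nearest remaining edge next.

import math

def to_polar(x, y, z):
    """Cartesian point -> (pan, tilt, radius) for a pan/tilt style rig."""
    radius = math.sqrt(x * x + y * y + z * z)
    pan = math.atan2(y, x)
    tilt = math.asin(z / radius) if radius else 0.0
    return pan, tilt, radius

def order_edges(edges, start=(0.0, 0.0, 0.0)):
    """Greedy nearest-neighbour ordering: plot the edge whose start point
    is closest to the current pen position, then jump to its far end."""
    remaining, ordered, position = list(edges), [], start
    while remaining:
        remaining.sort(key=lambda e: math.dist(position, e[0]))
        edge = remaining.pop(0)
        ordered.append(edge)
        position = edge[1]   # pen finishes at the edge's far point
    return ordered

edges = [((0, 0, 0), (1, 0, 0)), ((2, 0, 0), (1, 0, 0)), ((1, 1, 0), (0, 1, 0))]
print([to_polar(*start) for start, _ in order_edges(edges)])
```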
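For the Point Cloud item, here is a sketch of one plausible weather-to-servo mapping; the thresholds and normalisation ranges are invented, and the actual piece drives its eight servos asynchronously from an Arduino.

```python
# Hypothetical mapping from a weather reading to per-servo (speed,
# direction) commands, so the sculpture "performs" the data: windier is
# faster and more desynchronised, warmer pushes more servos forward.

def servo_commands(wind_speed_ms, temperature_c, count=8):
    """Return a (speed, direction) pair for each of the 8 servos."""
    commands = []
    for i in range(count):
        speed = min(1.0, wind_speed_ms / 20.0)     # normalise 0..20 m/s
        jitter = (i % 3) * 0.1 * speed             # desynchronise the arms
        direction = 1 if temperature_c > (i * 5 - 10) else -1
        commands.append((round(speed + jitter, 2), direction))
    return commands

print(servo_commands(wind_speed_ms=8.0, temperature_c=12.0))
```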
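For the Fragments of RGB item, a sketch of the channel-splitting idea: each pixel is pulled apart into its R, G and B components and displaced by the viewer's offset from the screen centre, so the simulated LED screen dissolves as the observer moves. The displacement model is my own; the installation itself was built in Processing.

```python
# Hypothetical pixel fragmentation: split each pixel into three
# single-channel points whose positions drift with the viewer's offset,
# each channel by a different amount.

def fragment(pixels, viewer_dx, viewer_dy, spread=3.0):
    """pixels: dict {(x, y): (r, g, b)}. Returns a list of
    (x, y, channel_index, value) single-channel points."""
    points = []
    for (x, y), (r, g, b) in pixels.items():
        for k, value in enumerate((r, g, b)):    # k = channel index
            offset = spread * (k + 1) / 3.0      # blue drifts the most
            points.append((x + viewer_dx * offset,
                           y + viewer_dy * offset,
                           k, value))
    return points

image = {(0, 0): (255, 0, 0), (1, 0): (0, 255, 0)}
print(fragment(image, viewer_dx=0.2, viewer_dy=-0.1))
```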
Posted on: 13/05/2012