
Apparel by the Normals – Clothes that evolve in real-time with the user


We have already seen a number of projects that address both the concept of generative clothing and the new manufacturing techniques that allow the creation of one-off, per-order items. What does not seem to be addressed are the implications of these new technologies for the design process, and how they change the role of the designer in this real-time, almost immediate, media culture.

Created by the new French collective Normals, A P P A R E L is a piece of clothing designed to exist both physically and digitally. Rather than using AR merely to bring virtual objects into the physical environment, the project uses personal data both to generate the dress and to decide how the dress may evolve.

Its real aesthetic intent appears in a 3D overlay, viewable through a camera and a custom-coded application, allowing users to digitally dress up. Because it takes personal data as input, the piece’s design evolves in real time together with its user.

This is a first prototype of an ongoing project by Normals, one of the collective’s many interesting projects that address future scenarios through cross-media work. It’s refreshing to see a shift from “output thinking” towards re-examining how real-time data may affect many things we take for granted as static. If the output is only an iteration of the generative process, then the always-evolving algorithm becomes the actual ‘product’.

This version of the project was made using openFrameworks with the help of the following libraries and addons:

- image analysis: OpenCV, ofxCv, ofxARToolkitPlus
- 3D model loading: ofxAssimpModelLoader, OpenGL
- camera settings: ofxUVC, ofxQTKitVideoGrabber, ofxYAML
- GUI: ofxUI, ofxXmlSettings