Similar to MSA Remote, which we mentioned a few days ago, TouchOSC is one of the first iPhone / iPod Touch applications that lets you send and receive Open Sound Control messages over a Wi-Fi network. The application allows you to remotely control, and receive feedback from, software and hardware that implement the OSC protocol, such as Pure Data, Max/MSP/Jitter, Processing, OSCulator, VDMX, Resolume Avenue 3, Plogue Bidule, Reaktor, Quartz Composer, vvvv and others.
I have been following the development over the last few months and am excited to see that the just-released 1.3 version now includes editable layouts. This means you can create your own layouts depending on the performance/desktop application you are interfacing with. This is made possible by the TouchOSC editor, available for Mac and Windows, and the downloadable standard layouts that come with the iPhone application. You can use those as templates or create your own from scratch. The editor includes all the elements you see in the iPhone app, and you can rearrange and scale them as you like. Once you are done, all you need to do is sync the layouts with the iPhone and you are ready to go.
Unlike MSA Remote, TouchOSC is slightly more targeted towards music creation and does not provide as much multitouch information as MSA Remote does. Nevertheless, if you are looking for a simple, customisable interface to send and receive OSC messages, TouchOSC should be a great start.
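If you are curious what actually travels over the Wi-Fi network, an OSC message is just a small binary packet, usually sent over UDP. The sketch below hand-encodes a single-float message the way a TouchOSC fader might send one. The address `/1/fader1` and port 8000 are the defaults used by TouchOSC's standard layouts, but treat them as assumptions and check your own layout's settings.

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: null-terminated address and type
    tag, each padded to a 4-byte boundary, followed by a big-endian float."""
    def pad(b: bytes) -> bytes:
        return b + b"\x00" * (4 - len(b) % 4)  # always at least one null byte
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

# "/1/fader1" is the address TouchOSC's default layouts use for the first
# fader on page 1; port 8000 is the usual default outgoing port.
msg = osc_message("/1/fader1", 0.75)
socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 8000))
```

In practice you would point this at the IP address of the machine running Pure Data, Max/MSP or Processing rather than localhost, and use an OSC library on the receiving end to decode it.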
The editor application is a free download, whilst TouchOSC for the iPhone costs $4.99; you can purchase it from the AppStore here.
You can find more information about TouchOSC as well as many examples on hexler.net.
- MSA Remote [iPhone] After a number of rejections, MSA Remote by Mehmet Akten is now finally available in the AppStore! MSA Remote is a remote control application for iPhone & iPod Touch that sends OSC messages over the wifi network. This allows you to control any OSC-supporting application such as Max/MSP/Jitter, PureData, Reaktor, VDMX, vvvv, Resolume, Quartz Composer etc. Mapping the OSC to MIDI on the desktop (e.g. using OSCulator) allows further control of any application which supports MIDI, such as Ableton Live, Cubase, Logic Pro, 3DSMax etc. In addition, developers can easily integrate OSC into their applications knowing they can be controlled remotely. Features: - Multitouch information sent using the standard TUIO protocol for instant integration with existing TUIO clients - Accelerometer data for each axis (x, y, z) is sent - 64 faders (8 pages of 8 faders) - 64 triggers (8 pages of 8 triggers) - 108-key (9 octaves) VELOCITY SENSITIVE polyphonic keyboard. Yes, the harder you hit the keys, the greater the velocity. - Settings are automatically saved and restored - Multitouch area orientation can be set as desired - All information on protocols is documented in the app See also iOSC [iPhone], synthPond [iPhone, MaxMSP] Platform: iPhone Version: 1.0 Cost: $1.99 Developer: Memo (Mehmet) Akten MSAFluid for processing (Controlled by iPhone) from Memo Akten on Vimeo. Graffiti Wall meets MSA Remote from Alex Beim on […]
- synthPond [iPhone, MaxMSP] Originally developed for Mac, synthPond is a spatial sequencer and generative audio toy for the iPhone, inspired by the work of Toshio Iwai. Unlike a normal sequencer, where you place notes on a grid and a moving playhead plays them, in synthPond you place nodes in a field, i.e. the pond. As the system is spatial, it is easily graspable and very intuitive, but also very deep. While it's easy for someone with no musical knowledge to create a complex melody, synthPond is also suited to advanced musicians interested in generative musical composition. Created by Zach Gage, a digital mixed media and installation artist currently residing in New York City, synthPond is gorgeous. Whether you are on a bus, waiting for the train or just at home, plugging in your headphones and having a play is pure joy. Your creations can be saved and later edited. There are two major types of nodes: circular nodes release waves at certain intervals, while hard-edged nodes release waves when waves hit them. Moving these nodes about allows you to create complex and relaxing melodies. Additionally, because all the nodes are spatially organised, the audio generated can also be placed in a 3D space, occurring around the listener and coming from the relative positions of each node. From the wonderful menus to the actual ripple animations of sound hitting the nodes, synthPond provides a truly enjoyable environment for creating melodies, perfect for a multitouch platform such as the iPhone. You can see more of Zach's wonderful work at his website here. The latest version, 2.5, brings OSC support, allowing you to connect the app to a number of different applications that support OSC, such as MaxMSP, Processing, Reaktor and many more. To get you going, you can download the example Max/MSP patch here. In addition, a 'lite' version of the app is available if you would like to have a play (OSC support not included). You can download it here.
We've added a few movies below showing a demo of synthPond's capabilities as well as the most recent OSC integration with Max/MSP. Make sure you also check out a number of composition examples from the synthPond community. Enjoy. Platform: iPhone Version: 2.5 Cost: $1.99 Developer: Zach […]
- Grid [iPhone, iPad, oF, Processing] GRID is an interactive multi-touch sound visualisation for the band Mathon and the ZKM AppArtAward 2011. Created for live events, the application consists of a desktop version for real-time graphic visualisation of music, created using Processing, and an iOS version for interacting with the Processing app, created using openFrameworks. The basic appearance is based on a shape that deforms in sync with an audio signal. A never-ending journey through portal-like visuals, organic and technical scenes take the viewer into a surreal atmosphere. Forming rapidly changing pictures out of those shapes, the viewer seems to be part of electrical impulses, catching short impressions of the human and his role in the universe. Using an iPad or another capable iOS device, people can directly interact with the music visualisation. Using multi-touch it is possible to manipulate the camera of the desktop visualisation as well as change between different scenes. Set-up is quite simple as long as your ports are generally open. Using OSC, one app talks to the other, as long as they are on the same network and your port 12000 is open. First check the IP of your desktop and fire up the desktop app; then, on your iPhone or iPad, go into the app preferences, where all app settings are listed, find GRID Remote and enter the IP address of the desktop. Now launch GRID Remote on your iOS device and you are good to go. The artwork is the result of a cooperation between the interactive arts collective Futura Epsis 1, based in Hamburg, Germany, represented by Andreas Rothaug, and the band Mathon from Switzerland, who are responsible for the sound. The downloads are available for: iPhone / iPad (out next week), Mac OS 10.6, Windows and Linux. You can also get the new Mathon album "Terrestre" on iTunes or www.mathonmusic.ch. Platform: iPhone/iPad Version: 1.0 Cost: $0.99 Free Developer: Futura Epsis […]
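Since GRID's two apps only find each other when port 12000 is reachable, a quick local sanity check can save some head-scratching before a live event. The sketch below simply tries to bind UDP port 12000 (the port named in the article) on the desktop machine; note it only tells you the port is not already taken locally, not that a firewall will let the iPad's packets through.

```python
import socket

def port_is_free(port: int) -> bool:
    """Return True if we can bind a UDP socket to this port locally."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.bind(("0.0.0.0", port))
        return True
    except OSError:  # typically EADDRINUSE when something else holds the port
        return False
    finally:
        s.close()

print(port_is_free(12000))  # run this before launching the desktop app
```

If this prints False, another application is already listening on 12000 and GRID's OSC link will silently fail.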
- iOSC [iPhone] iOSC is a remote control application that uses the OSC (Open Sound Control) protocol. Using OSC over your device's built-in Wi-Fi connection, iOSC communicates with other compatible hardware and software nodes on your network. You can also remotely control middleware such as Max/MSP, Processing, ActionScript (FLOSC) and many other devices that support the OSC protocol from your iPhone. Similar to TouchOSC and a few other applications available in the AppStore, iOSC allows you to create a custom interface/controller for your desktop applications or custom-built devices using boards like Arduino. What is unique to iOSC is that you can control multiple computers via a single interface. See the movies below for a demo. You can also find great video demos and instructions on the app's site. Platform: iPhone Version: 1.01 Cost: […]
- 10 Creative Ways to Use the Accelerometer [iPhone] The iPhone's built-in accelerometer has created a world of opportunities for developers to create applications that are engaging, creative, innovative and fun. Here we bring you 10 creative ways the accelerometer has been used, from games to photography, music and reading. What lies ahead only time will tell, but the possibilities are endless. If we have missed any apps, or you have any favourites you would like to share, please let us know by leaving a comment at the end of the article. It is important to note that most links in titles lead directly to the iTunes AppStore, so keep the app open. 10 Creative Ways to Use the Accelerometer on the iPhone iTM Tilt, TouchOSC, RjDj and Cosmovox | Sound Creation All four applications use the accelerometer to control sound. Whilst RjDj is about generating already available sound samples, Tilt and TouchOSC are based around music creation, utilising the accelerometer as another way to control music production software on your computer. In addition, both apps are used by visual artists to affect real-time video, especially TouchOSC, as the new version currently in development also includes a desktop app for Mac allowing you to modify TouchOSC's interface and functionality. Another app is ZooZBeat, with a friendlier interface that may appeal to a mainstream crowd. iWalk, Pedometer and iSteps Distance | Mapping and Exercise An interesting and very useful utilisation of the accelerometer to count the steps you take, whether you're walking, jogging or running. In addition, Jump Rope is a virtual jump rope for the iPhone and iPod Touch: press the Start button and begin jumping, using your iPhone or iPod Touch as if it were the handle of a jump rope.
Night Camera, Moon Lighter and Camera Art | Photography Night Camera and Moon Lighter help you take sharper photos at night or in other low-light conditions by using the built-in accelerometer to trigger the shutter when it detects that the camera is stable. Camera Art, on the other hand, is inspired by "camera toss", the technique of purposely taking a long-exposure photo with the camera moving, for some interesting light-show-style photos. Also, LevelShot is an interesting application that uses your iPhone's accelerometer to help you take more level pictures. Instapaper Pro | Reading What makes this functionality successful is that the process of reading is synchronised with text progression. Similar to a number of book readers for the Palm years ago, where a play button was available, Instapaper takes this functionality one step further by providing speed and direction controls. Fast response time is not necessary here, as the position in which you hold your iPhone is mainly static as you progress through the text: no sudden changes of direction, no fast response required. It could be argued that this should be an optional feature added to all apps incorporating some form of scrolling. Instapaper Pro tilt scroll demo from Marco Arment on Vimeo. Air Paint | Drawing in Space These are early days for Air Paint, but it nevertheless deserves a mention, not so much for its capabilities now but for how this idea might be taken further. Air Paint is a good example of how the accelerometer on the iPhone can begin to suggest a form of spatial relationship between software and the environment. As you move your iPhone in space, Air Paint creates a path of this movement, mimicking a form of light graffiti. The app does not yet utilise the 3D nature of your movement, but this is something that may be incorporated in the future, i.e. an ability to draw in space.
Pulsar: Interactive Particle System, SandScapes and UON | Motion Graphics UON is a graphics-based, tilt- and touch-controlled visualiser inspired by 'rave lights'. It is mostly a screensaver that nevertheless uses the accelerometer to alter its ever-evolving image. Pulsar and SandScapes are both engaging particle generators that react to the iPhone's orientation. Rolando | Games Since the launch of the AppStore we have seen a large number of games that utilise the iPhone's accelerometer. Unfortunately, those that use tilt controls to replace the traditional d-pad are hardly ever successful, and in most recent releases tilt is offered as an option rather than as the primary control. Another problem is that calibration plays an important part in creating a good tilt-based game, but not many developers include it. Rolando is probably one of the few games that use tilt controls as more than a gimmick: a valuable addition to the platformer genre, perfectly balanced with multitouch. Rolando from Mark on Vimeo. Aqua Forest | Physics ..can calculate the dynamics of almost any type of object, not only solid materials but also elastic bodies, plastic bodies, fluids and gases, utilising the touch screen and accelerometer. As you rotate the iPhone, the objects move as if they were in a real-life container, reacting to the screen's edges and the intensity of movement. TouchOSC, Lego Mindstorm App | Remote Control TouchOSC can also be used in a number of different applications relating to the remote control of devices and boards such as Arduino and Processing. The Lego Mindstorm app, on the other hand, not available in the AppStore, is a small application developed for the iPhone that sends accelerometer data to a server process on a Digi board over UDP. The server application then sends commands to the Lego NXT over Bluetooth. In addition, the little robot has a video camera that feeds video back to the computer.
A great little project, and we hope to see many more similar projects using the iPhone accelerometer as a remote control. Context Logger, SignalScope and Acceleron | Accelerometer Data Logging All of these applications can help researchers, iPhone developers and interested users record and interpret accelerometer data. Of course, a programmer's understanding of the output data is required, but in any case we hope to see more applications like these that can feed data into desktop applications able to interpret it in ways more applicable to daily life. What we haven't seen yet in the AppStore: 1. Integration with Desktop Applications We would love to see more ways for data generated by the accelerometer to be integrated with desktop applications. Could we import motion paths generated by the iPhone into Photoshop? Maya and 3D Studio Max have been using motion paths to translate motion capture data into the movement of animated characters. Could the iPhone's accelerometer contribute to the translation of the physical into the virtual? 2. Social Aspect What seems to be missing is the social networking aspect of the data generated by the accelerometer. Could, for example, Facebook or Brightkite make use of this data, making our physical relationship with our devices a social activity? Could my iPhone's motion cause the vibration of another device miles away? Could I wake someone up in another part of the world by shaking my iPhone? We have seen a few examples of exchanging contacts on the iPhone by shaking your device, but surely there are more adventurous opportunities. 3. 3D Visualisation Not yet available in the AppStore, but demonstrated as a concept in this video, is a way of visualising objects in three dimensions. Presumably, due to SDK limitations, it could be difficult to achieve. It is a great concept nevertheless, where, depending on the iPhone's orientation, the object is rotated in software to create a three-dimensional illusion.
In addition, similar to the concepts in Johnny Chung Lee's work with the Nintendo Wii, especially Head Tracking for Desktop VR Displays using the Wii Remote, this is an area still unexplored on the iPhone. We'll have to wait and […]
- One Rule [iPhone] From time to time an app appears in the AppStore that draws you in beyond the eye candy, fancy menus or text that has been perfected to the point where each character has been considered over and over again. There aren't many apps like this in the AppStore, and I am sure that not many people will notice them either. These apps will be released and quickly buried under all the Valentine's Day apps (popular currently), numerous task managers and 101-new-ways-to-send-an-email apps. One Rule is one of those apps, with a simple task: stay in the light. The app is a very light puzzle game: it has only 9 stages, you can finish all of them in a matter of minutes (we are still struggling with one), it is black and white, and it has a level of difficulty where your success depends on your creativity and how familiar you are with your iPhone. I will try to say no more, because any hint of what the app does will spoil the experience. About: Simply tap the exit door to make him walk to there. Before that, make sure the path in the light. otherwise he will be dead. When you look closely at this app you realise that a lot has been considered, not in terms of what to include but which elements to exclude. There is nothing in this app that shouldn't be there, nor is there anything missing. We could argue that it needs more levels, and maybe the developer will add them in future releases, but that is not what the app is about. It's a great example of how simplicity and clarity in software design can create something truly wonderful. The background is not black or white but a shower of pixels of varying density that turn from black to white; a simple glow effect added to the text further translates the notion of light; subtle gradients, the miniature exit sign and even the design of the lamppost are all things that make One Rule wonderful.
If you are a Rolando-type gamer with high expectations, do not download this app: you will be greatly disappointed. If you loved Passage, which we adore, or generally have an interest in creative app development, this one is for you. Platform: iPhone Version: 1.0 Cost: […]
- PhiLia 01 [iPhone] PhiLia 01 is a new iPhone application created by the Austrian visual artist Lia, one of the early pioneers of software and net art, who has been creating digital art, installations and sound works since 1995. The app is about artistic harmony, expressed through interactive generative movement, sound, form and colour. Not too dissimilar from Universal Everything's installation at the Victoria & Albert Museum, though the context is very different, the app is an interactive piece that engages touch, motion and an aesthetic of complexity. When you start the app you are greeted with a vertical line of offset circles which, if you shift the device to one side, slide across the screen. You can continue to do this until you introduce an effector, which colours the discs in the colour of your choosing and slows down the discs that pass through the area you touched. This results in an offset of affected discs, creating the illusion of a deforming spline made up of circle shapes. You can further increase the complexity by introducing trails, sporadically rotating the device and changing the radius of the replicated circles. You can reset by double-tapping or start again by shaking your device. A number of other options are included, such as the ability to change the speed, disable all direction and turn off the sound. I found that attempting to maintain the integrity of the spline by gently rotating the device produces the most satisfying results. Touching the screen at different areas of the vertical spline, with associated rotation, creates the sensation of a living form that alters its shape in an elegant and very poetic way. Whilst the app provides a wonderful first-time experience, it unfortunately falls short of continued engagement. Once you have tried it, it is hard to see how you might want to explore it further.
I am generally of the belief that apps on the iPhone should always try to provide alternative ways of engagement, i.e. an ability to export, connect and transport generated content while maintaining its original form. Whilst the Camera app creates still images, it also offers the ability to save them. If an app is an interactive motion piece, it should likewise provide a way to transport that motion experience to other devices, including your computer. The fact that you can only create screenshots of these wonderful interactive motion pieces in PhiLia 01 undermines the engaging experience the app provides. If this is a piece of interactive digital art, surely it should not conclude in the form of a screenshot? I only mention this because there are a number of apps available in the AppStore that behave like a link between the iPhone and the computer. Using OSC, you can use the iPhone's multitouch and accelerometer capabilities to interface with custom desktop applications. Whilst most of these apps only provide a link, only synthPond explores this in the form of integrated functionality. PhiLia 01 should provide the same: a transportable interface to further experience beyond the limitations of the iPhone/iPod Touch hardware. It could morph itself, transport and behave like an interface to a desktop app or a projection to be experienced at full scale. It could also provide a Wi-Fi link between neighbouring devices running the same app, stimulating real-time interactive collaboration and exchange. PhiLia 01 could be many things, but definitely not a screenshot generator. Create your own personal art by using your fingers (multitouch) to interact with the elements on the screen. Tilt the device to change the direction of movement. Change various parameters that influence the behavior of the elements by accessing the Menu, which you can open by tapping the lower right corner of the screen.
Store your favourite moments of freshly-created absolute beauty and coherence by accessing the Menu and choosing to save the image to your Photos. Am I being harsh? Yes, I am, but only because the app is so wonderful. Looking at Lia's past work you can see the beauty and amazing skill. Like many artists exploring the platform, one has to realise that even though the AppStore is a great platform for distributing digital artworks, what underlines this exchange between the user and the artist is a real opportunity to engage, inspire and motivate. It is about more than just porting your creations to the platform; it is about making the most of the opportunities the platform offers, in ways that can inspire even a mainstream audience. Created with openFrameworks. Platform: iPhone Version: 1.1 Cost: $2.99 Developer: Lia Philia 01 Support Video from Lia on […]
- Morse Code as Interaction Input Methodology [Theory] Morse code does not need much introduction, but for the sake of argument it may be relevant to understand its origin before we address why it may be relevant now, when more sophisticated methods of communication are all around us. What I will try to do in this article is outline a few questions and arguments as to why Morse code as a method of input might be considered a more intuitive and effective way of exchanging information than a traditional keyboard. I will then go on to propose uses which span beyond text replacement, in terms of modern-day opportunities for exchanging and interfacing with information. Background Morse code, created for Samuel F. B. Morse's electric telegraph in the early 1840s, is a type of character encoding that transmits telegraphic information using rhythm. Morse code uses a standardized sequence of short and long elements to represent the letters, numerals, punctuation and special characters of a given message. The short and long elements can be formed by sounds, marks, or pulses, in on-off keying, and are commonly known as "dots" and "dashes" or "dits" and "dahs". The speed of Morse code is measured in words per minute (WPM) or characters per minute, while fixed-length data forms of telecommunication transmission are usually measured in baud or bps. The most popular current use of Morse code is by amateur radio operators, although it is no longer a requirement for amateur licensing in many countries. In the professional field, pilots and air traffic controllers are usually familiar with Morse code and require a basic understanding. Navigational aids in the field of aviation, such as VORs and NDBs, constantly transmit their identity in Morse code. Morse code is designed to be read by humans without a decoding device, making it useful for sending automated digital data over voice channels.
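The encoding described above is simple enough to sketch in a few lines of code. A minimal encoder, using a deliberately tiny subset of the standard International Morse table (just enough letters for the example that follows), might look like this:

```python
# A deliberately tiny subset of the International Morse table,
# just enough letters for the demo below.
MORSE = {
    "C": "-.-.", "D": "-..", "E": ".", "M": "--",
    "O": "---", "R": ".-.", "S": "...",
}

def encode(text: str) -> str:
    # Letters are separated by spaces, words by " / " (a common convention).
    return " / ".join(
        " ".join(MORSE[ch] for ch in word if ch in MORSE)
        for word in text.upper().split()
    )

print(encode("Morse Code"))  # → "-- --- .-. ... . / -.-. --- -.. ."
```

The output matches the first usage example below: rhythm is the whole encoding, so the same string could just as well drive a buzzer, an LED or a vibration motor.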
Here are some usage examples: -- --- ·-· ··· · -·-· --- -·· · M O R S E C O D E THIS ARTICLE IS ABOUT MORSE CODE - .... .. ... / .- .-. - .. -.-. .-.. . / .. ... / .- -... --- ..- - / -- --- .-. ... . / -.-. --- -.. . Interface Traditionally, Morse code was extensively used for early radio communication beginning in the 1890s. For the first half of the twentieth century, the majority of high-speed international communication was conducted in Morse code, using telegraph lines, undersea cables, and radio circuits. Vibroplex is a tool we should also be familiar with. The paddle, when pressed to the right by the thumb, generates a series of dits, the length and timing of which are controlled by a sliding weight toward the rear of the unit. When pressed to the left by the knuckle of the index finger, the paddle generates a dah, the length of which is controlled by the operator. Multiple dahs require multiple presses. Left-handed operators use a key built as a mirror image of this one. Morse code has also been employed as an assistive technology, helping people with a variety of disabilities to communicate. Morse can be sent by people with severe motion disabilities, as long as they have some minimal motor control. In some cases this means alternately blowing into and sucking on a plastic tube (a "puff and sip" interface). People with severe motion disabilities in addition to sensory disabilities (e.g. people who are also deaf or blind) can receive Morse through a skin buzzer. More recently, with the massive adoption of SMS messaging, there have been cases proving Morse code to be a faster input method than SMS: Jay Leno staged a text-off between two text messengers and two Morse coders, and the Morse coders won the contest. Unfortunately the video is no longer available.
Another recent case is Fun with Flashing Lights (HOW TO - Build a Morse code generator). Arbitraryuser writes - "Not sure if you're interested, but I put this together as a social experiment to see how long it would take for someone to notice that the lamp flashing in my window was actually morse code... less than 24 hours later the cops were at my door.... aka How to build a Morse code signaler and see how long it takes before someone figures it out." - Link. There is also a series of paintings in which the individual panels visually and aesthetically blur different abstract data sources, including satellite images, stock market charts, corporate logos, and Morse code communications (via infosthetics). Opportunities What the above examples suggest is that Morse code is an interesting alternative input method: not only does it allow information to be inputted quickly and efficiently, it is also not limited to tap devices but can involve the body as a whole. The current way of inputting data through touch interfaces is limited by our comprehension of the interface, i.e. seeing the available options and using our fingers to control the flow of information. Morse code offers an alternative methodology for reading and writing information. It is not limited to fingers: the information can be inputted through gestures, motion, exhalation and more. Because the system is based on rhythm, this information can be transmitted in many different ways, independently of seeing the input or relying on feedback. Keyboard input, for example, relies on three elements: knowing the position of the letters (seeing), typing the letters (fingers) and, once again, receiving feedback by looking at what we typed (seeing).
Similarly, contemporary touch interfaces such as multitouch devices like the iPhone rely once again on knowing where the information is (seeing), tapping on the information (fingers) and feedback, i.e. whether your selection was correct (seeing). With Morse code as the form of input, this interaction between device and user could be reduced to two elements: typing the information (fingers, shake, footsteps) and feedback (seeing, hearing). What should also be considered is that this could begin to suggest custom-designed acronyms for interaction. Take "confirmations", for example. If the application offers a choice, an "accept" button (in the traditional sense) could be replaced with a single Morse code input, a short code or an acronym. Applications could begin to develop their own interaction language using Morse code as a basis. Learning to use an application would then imply learning its Morse code shortcuts or acronyms as an optional (additional) quick way of interacting. Importantly, this would be available to everyone, with or without a disability, whether the input comes from the fingers or the rest of the body. Devices such as the iPhone offer accelerometer data input as well as touch, where, for example, registered motion could replace a tap. Recent interaction projects and art installations embody similar principles: motion capture that tracks your movements and attempts to recognise your behaviour patterns could be built upon Morse code rhythms, with behaviour acronyms custom to the application. Memo Akten refers to this as "creating new instruments", tools for interaction. His recent body paint projects allow interaction between the viewer and canvas by capturing motion and projecting paint effects onto a virtual canvas. Whilst this interaction is about fun, it does provoke questions about the input of information and whether one could begin to understand these gestures as words using Morse code.
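To make the "accept via a short Morse input" idea concrete, here is a rough sketch of how raw press durations (in seconds, from a touch screen or an accelerometer gesture detector) could be classified into dits and dahs and matched against application-defined shortcuts. The duration threshold and the shortcut table are invented for illustration; they are not part of any standard.

```python
DIT_MAX = 0.15  # presses shorter than this count as dits; threshold is a guess

# Hypothetical application-defined acronyms, not standard Morse abbreviations.
SHORTCUTS = {".-": "accept", "-.": "cancel"}

def classify(durations):
    """Turn a list of press durations into a dit/dah pattern string."""
    return "".join("." if d < DIT_MAX else "-" for d in durations)

def action(durations):
    """Look up the gesture's pattern in the app's shortcut table."""
    return SHORTCUTS.get(classify(durations), "unknown")

print(action([0.08, 0.30]))  # short press then long press → ".-" → "accept"
```

A real implementation would also need a gap threshold to separate one gesture from the next, but even this toy version shows how little machinery the idea requires.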
Microsoft, of course, is (technologically) leading the way. Their current Project Natal builds a lot on custom behaviour patterns, but not on established methods like Morse code. Why not? There are many references and many projects, probably too many to mention. What is certain is that Morse code is an established communication method that can still be used and implemented in contemporary projects. We do not need to reinvent patterns of behaviour but rather build on established ones. Morse code, however old, is a simple method, easy to learn and easy to comprehend. Whether for gestural or text-based communication, the patterns are here; we just need to use them. 'Multitouch' is so 1840s! How to learn it I have come across this site that seems to outline a few ways to easily learn Morse code in 1 minute (we're not convinced). Another alternative, if you have an iPhone, is to jailbreak it and use TypingSebastian (AppStore) with the plugin for iGitDahText. The learning process should be fun and engaging. Do note that this requires a jailbroken iPhone, which will void your warranty. Other alternatives are Francis Bonnin's Morse-It (AppStore) and Mc Morse Code (AppStore). See also Morse Code Translator and An Xiao on Twitter as an Artistic Medium: Morse Code Vs […]
Posted on: 04/07/2009