Memo developed software to control this ILDA-controllable laser projector (or any other pro laser projector) for realtime interactive laser work with openFrameworks on OS X, and is releasing everything, including the Ether Dream firmware, as open source.
Since these lasers take vector data as input, i.e. you can't just send them images or videos, he first renders everything to an FBO, then vectorises it, re-calculating optimal paths and adding/removing points, curves etc. If your source graphics are already vector (as they could be in many generative systems), it's probably more efficient to keep them that way without rasterising first, but he was curious how the galvos (electromagnetic devices that move mirrors which reflect the laser beam and essentially create the patterns, text or animations) were responding, and this was a great way to understand the system and test its limits.
A later experiment uses Leap Motion to draw amoeba-like objects via finger tracking. These are projected onto a person wearing safety goggles, since eyes and lasers are not great friends.
The latest experiment, seen above, adds soap bubbles to the equation, with some openCV tracking, edge finding and vectorising again so the graphics can be projected onto them. All of this is, of course, appropriately accompanied by Daft Punk's "Get Lucky".
Looking forward to seeing where this goes next.