
Solace: How We Made the Interactive Speculative Fiction

As part of a new series of posts inviting artists and curators to share and discuss their latest projects with the CAN audience, we’d like to introduce you to Evan Boehm, a creative director and artist now based in Montreal, Canada. Over the past 8 years, Evan has been lucky enough to work with some of the world’s largest clients (Google, The White House, Intel) alongside some amazing coders, animators, actors, set designers, modellers, writers, producers and magicians.

The latest project, titled Solace, is a collaboration with Nexus Studios, and is an interactive animated film based on celebrated science fiction writer Jeff Noon’s short story about a near future in which marketing and addiction are disturbingly intertwined. The final work, which features Gethin Anthony of Game of Thrones fame, can be viewed online, and we now hand it over to Evan. Enjoy!

“It’s about drugs.”

1994, and my mom was in a small bookshop outside London, looking for a book to bring back home to California. In her hand was Vurt, Jeff Noon’s debut novel, but the guy behind the counter was trying to dissuade her from buying it. It was supposedly unsuitable for a 13-year-old kid because of the heavy drug use. My mom, being my mom and being told that she shouldn’t buy it, bought it. I, of course, loved it.

After that, every time my mom would visit her family in London, she would bring back the latest Noon work. One of those books, released in 1999, was a collection of short stories entitled Pixel Juice. In it was Solace.

Since then, the story has been stuck in the back of my mind. As I grew up and became an animator, I wanted to do something with it but wasn’t quite sure what. At first, I thought of creating a traditional animation, but as my tastes changed and I learned how to code, I drifted towards the idea of an interactive film. I made a few JavaScript sketches here and there but nothing big.

One day I was having coffee with Matt Pearson and we started talking about old and dusty ideas for projects. I mentioned Solace and, amazingly, he knew Noon. Post chat, without my permission or knowledge, Pearson sent Noon my work. A couple of emails later, I was in a coffee shop talking to Noon and committing myself to turning Solace into an actual project.


Solace is about a drink, Spook, whose flavor is determined by how you open it. Twisting the cap in different ways produces different flavors. There are six flavors in total but you can combine them in different ways. A simple idea, but one that allowed Noon to build a future vision about marketing, addiction, memory and DNA manipulation.

I always imagined the story rendered by liquid, or rather, six liquids — one for each flavor. They would mix together to form objects and movement. The challenge was getting these ‘six streams of flavor’ to convey character and emotion. The project was going to be a real-time experience, so the first challenge was to figure out the best way to make real-time liquid form animated characters.

One of the beauties of animation is that you can give life to inanimate objects and shapes. Aladdin’s rug in the Disney film is a master class in physical comedy. Chuck Jones’ (of Bugs Bunny fame) excellent The Dot and the Line uses simple movement and music to make a straight line (a line!) lovesick for a circle. Could I even come close to matching Jones’ mastery of emotion with generative liquid? Of course not, but I would have fun trying.

I started collecting references and editing the best bits into a reference film. The edit below was continually added to over the course of the project:

↑ Liquid animation references. Credits in the video

Visually, I wanted something that felt warm, playful, simple and ultimately child-like. The idea was to set up an innocent world and time that was subverted and destroyed by Spook. I also wanted the liquid to feel alive, as if each droplet had a personality like Jones’ Dot. The goal was to create a system of a few simple shapes that could create meaning and be controlled via code.

The Particle System

Particles are “programmer porn”. It is always about how many particles you can render, with the fastest frame-rate, in the smallest memory footprint. The challenge for me was to do that plus give each particle a personality and sense of motivation that could convey emotion.


My experiments led me to a single two-dimensional particle. In animation, you give an illusion of life through movement and a sense of mass. I created many tests where I tried to give a single circle life through code: how it moved, how it responded to events, how the physical world around it altered its behavior, etc.

The shape system I landed on was a simple vector circle, created out of four points, each with two handles (12 points of control in total). I could deform the circle in any way I wanted by controlling the movement of each point and its handles. The amount of detail was limiting but also helped to define the look of the piece overall. Combining multiple circles allowed me to create more complex shapes. I experimented with two.js and raphael.js but settled on paper.js for its speed and feature-rich API.
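The project built these shapes with paper.js, but the four-point circle itself is easy to sketch in plain JavaScript. The handle length below uses the standard constant (about 0.5523 times the radius) for approximating a circle with four cubic Bézier segments; the function names are illustrative, not from the project.

```javascript
// Approximate a circle with four cubic Bézier segments: four anchor
// points, each with an in-handle and an out-handle (12 controls total).
const KAPPA = 0.5522847498; // standard circle-approximation constant

function circleSegments(cx, cy, r) {
  const h = KAPPA * r;
  // Anchors at right, bottom, left, top (y grows downward, as on canvas).
  return [
    { point: [cx + r, cy], handleIn: [0, -h], handleOut: [0, h] },
    { point: [cx, cy + r], handleIn: [h, 0], handleOut: [-h, 0] },
    { point: [cx - r, cy], handleIn: [0, h], handleOut: [0, -h] },
    { point: [cx, cy - r], handleIn: [-h, 0], handleOut: [h, 0] },
  ];
}

// Evaluate one cubic segment at parameter t in [0, 1].
function cubic(p0, c0, c1, p1, t) {
  const u = 1 - t;
  return [
    u * u * u * p0[0] + 3 * u * u * t * c0[0] + 3 * u * t * t * c1[0] + t * t * t * p1[0],
    u * u * u * p0[1] + 3 * u * u * t * c0[1] + 3 * u * t * t * c1[1] + t * t * t * p1[1],
  ];
}
```

Deforming the circle is then just a matter of moving those 12 numbers; the limited detail is what shapes the look.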

↑ Complex particle shapes


Each particle’s movement is based on a simple physics system where the particle is always accelerating as fast as it can, to where it wants to go. You give the particle a goal and it will race to that location. As it moves, I stretch the particle to show the effects of the velocity. Hurtling so fast, it will overshoot the goal and have to turn back. To illustrate the imaginary forces acting on this particle, I squash the circle in the direction of velocity. This makes it look like the back of the particle is pushing into the front, kind of like how you move forward when a car brakes quickly.
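A minimal sketch of that motion model in plain JavaScript (the constants and names are my own, not the project's): the particle always accelerates at full power toward its goal, and its current speed drives a squash-and-stretch scale.

```javascript
// Sketch: constant full-power acceleration toward a goal, with
// squash-and-stretch derived from the current velocity.
function step(p, goal, dt) {
  const dx = goal.x - p.x, dy = goal.y - p.y;
  const d = Math.hypot(dx, dy) || 1;
  const accel = 800; // "as fast as it can" (px/s^2, illustrative)
  p.vx += (dx / d) * accel * dt;
  p.vy += (dy / d) * accel * dt;
  p.x += p.vx * dt;
  p.y += p.vy * dt;
  // Stretch along the velocity and squash across it; keeping the
  // product of the two scales near 1 roughly preserves area,
  // which is what reads as mass.
  const speed = Math.hypot(p.vx, p.vy);
  const s = 1 + Math.min(speed / 400, 0.5);
  p.scaleAlong = s;
  p.scaleAcross = 1 / s;
  p.angle = Math.atan2(p.vy, p.vx);
  return p;
}
```

Because the acceleration never lets up, the particle races past the goal and has to double back, producing the overshoot.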

↑ early squash and stretch test

On top of this, to give the particles a sense of playfulness, I strung the goals along Bézier curves. These curves, made of additional points and handles, were hand-drawn with a custom HTML5 editor to give a human touch to the movement. So instead of going straight to the goal, each particle dips and swerves.
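The curves themselves were hand-authored in the editor, but sampling goals along a cubic Bézier can be sketched like this (the control points here are placeholders, not the hand-drawn ones):

```javascript
// Evaluate a cubic Bézier at t, then sample n goals along it so a
// particle dips and swerves instead of flying straight at its target.
function bezierPoint(p0, p1, p2, p3, t) {
  const u = 1 - t;
  return {
    x: u ** 3 * p0.x + 3 * u ** 2 * t * p1.x + 3 * u * t ** 2 * p2.x + t ** 3 * p3.x,
    y: u ** 3 * p0.y + 3 * u ** 2 * t * p1.y + 3 * u * t ** 2 * p2.y + t ** 3 * p3.y,
  };
}

function goalsAlongCurve(p0, p1, p2, p3, n) {
  const goals = [];
  for (let i = 1; i <= n; i++) {
    goals.push(bezierPoint(p0, p1, p2, p3, i / n)); // chased in order
  }
  return goals;
}
```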

Visual and Movement Development

Once the system was in place, I needed to draw a character with it. I’m not a character designer, so Nexus and I went searching for one who could grasp the limitations of the system and, hopefully, elevate beyond them. I collaborated with a few illustrators on tests, but Robin Davey’s first attempt knocked it out of the park. He would go on to design each of the still assets.

↑ Robin’s first attempt was pretty close to the final design

To help create the designs, and later the animation, I developed a pipeline in Adobe After Effects. With AE’s scripting capabilities, I created a project file that locked features down to the bare bones and exported custom compositions into a bespoke file format. Originally I built an in-browser editor that mimicked the HTML5 animation engine, but with AE you have an army of animators who can pick up the system with only a minor learning curve. Getting animators up to speed on something they knew was much easier than teaching and debugging my own editor, so I eventually ditched it and just used AE.

Everything was designed in the custom AE project (or the earlier in browser editor) then exported as JSON objects. This JSON housed a series of vector points that the engine would take and redraw as physics based particles.
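The actual file format is bespoke and unpublished, so the JSON shape below is only a guess at the idea (anchor points plus handle offsets per shape), with a tiny loader to show how an engine might consume it.

```javascript
// Hypothetical export format: the real bespoke format is unpublished,
// so this JSON shape and loader are illustrative only.
const exportedJSON = `{
  "shapes": [
    { "name": "droplet",
      "points": [
        { "x": 10, "y": 0,  "hIn": [0, -5], "hOut": [0, 5] },
        { "x": 0,  "y": 10, "hIn": [5, 0],  "hOut": [-5, 0] }
      ] }
  ]
}`;

function loadShapes(json) {
  const data = JSON.parse(json);
  return data.shapes.map(s => ({
    name: s.name,
    // Each exported anchor becomes a target a physics particle can chase.
    targets: s.points.map(p => ({ x: p.x, y: p.y, hIn: p.hIn, hOut: p.hOut })),
  }));
}
```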

The trickiest part was getting the system to produce something similar to what AE produced. The animations aren’t static and predefined: physics, the user and other random actions change how a particle moves, altering the animators’ intentions. For animators used to crafting a single frame to perfection, this was a challenge. I spent a lot of time being very particular about movement on one hand and trying to convince the animators to be loose and easygoing on the other.

Telling the Story

The original concept could be boiled down to a cartoon you could play with. Like a more traditional film, I wanted the story to be front and center — the interaction should not overpower the narrative. The challenge was to balance playability with the joy of sitting and listening to a good story, well told.

From the start it became apparent that there was a conflict between listening to the narrator’s voice and playing with the interaction. If any of the interaction was too complicated, or too absorbing, users would miss key moments in the narrative. On the other hand, this not being a traditional animation, if the interaction wasn’t strong enough, a user would get bored. This was an experience that couldn’t use traditional game mechanics, or puzzles in general. It had to be closer to a film that you are mindlessly toying with. Nothing to solve. Nothing to complete.

↑ Early paper prototyping

Through numerous iterations and user testing, the end result can be described (in the trendy parlance) as a ‘2D walking simulator’. The user plays with each of the 17 scenes, but their actions don’t drive the narrative. Each scene, and the interaction within it, is a metaphor for the story at that particular point. Sometimes what the narrator says is exactly what the user sees. At other times, the interaction refers or alludes to greater themes of the story.


Sound Design

I wanted the drink to feel futuristic and synthetic. Spook was sugary and addictive, and I wanted it to feel like you were swimming in Cherry 7up. Early references included a mixture of industrial noise and PC Music.

From the start, the idea was for the audio to be more than a series of pre-recorded sound clips triggered by user action. I wanted it to be dynamic and responsive. This demo by Cory O’Brien, where the sound is generated by the position of the mouse in relation to points on the screen, was an early inspiration for what we could push for. I took this demo and my collected references to friend and sound artist Dave Meckin and persuaded (begged) him to take the project on.

Dave is hoping to write a more technical article on what went into the sound design, but the TL;DR version is that he used Tone.js to dynamically create synths that track the particles and generate chords. Everything you hear (except for the background pub sounds) is generated in real time.
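Tone.js needs a browser audio context, so here is only a sketch of the mapping idea (my own illustration, not Dave's implementation): quantize a particle's vertical position to a pentatonic scale and convert the resulting MIDI note to a frequency a synth could play.

```javascript
// Sketch: map a particle's y position to a scale note, then to Hz.
// The scale choice and step count are illustrative, not the project's.
const PENTATONIC = [0, 2, 4, 7, 9]; // major-pentatonic semitone offsets

function particleToFreq(y, height, baseMidi = 48) {
  // Higher on screen means higher pitch, snapped to the scale.
  const idx = Math.floor((1 - y / height) * 10); // 10 pitch steps
  const octave = Math.floor(idx / PENTATONIC.length);
  const degree = PENTATONIC[idx % PENTATONIC.length];
  const midi = baseMidi + 12 * octave + degree;
  return 440 * Math.pow(2, (midi - 69) / 12); // standard MIDI-to-Hz
}
```

A synth tracking a particle would then simply be retuned to this frequency every frame as the particle moves.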

. . .

This is a pretty short article for a project that has been almost 20 years in the making. If I have learned anything, it is the value of support and collaboration when money isn’t an incentive. I promise my next project will come much sooner.

The final work can be viewed online.

Nexus Studios | Evan Boehm | Jeff Noon | Gethin Anthony (Featuring)
