Behind the Scenes of Autre Ne Veut’s ‘Okay’ video

John Robson explains how he used motion capture to create an emotional human performance.

John Robson has worked as a motion designer and director for more than two decades. On his own time, though, he is a talented filmmaker who always enjoys a challenge and often works in unconventional ways.

So when he wanted to try combining motion capture and Apple’s ARKit with Cinema 4D, Houdini and Redshift to capture an emotional human performance, he reached out to a musician he’d always wanted to work with — Autre Ne Veut.

It was a tricky collaboration because Autre Ne Veut (Arthur Ashin) lives in New York City and Robson is based in Portland. But the two of them hit it off and, as it happened, Autre Ne Veut had just finished producing a single called “Okay” that he was thinking about putting out. So they brainstormed ideas about how to make a video without actually meeting in person to film anything.

We talked with Robson about some of the unusual ways they pulled that off, including how he created the ocean scenes by miming swimming on top of a yoga ball while wearing a motion capture suit.

Robson: I had been working as a director at HouseSpecial, but last summer I got a call from Samsung. They’d been following all the digital human work I’d been doing in my films, like “Safety First,” and they have their own digital human AI division called Neon.

It's a group of incredibly talented scientists and engineers with a wealth of knowledge that's equal parts inspiring and mind-blowing. They looked to me because they wanted a storyteller to help them create the narrative and representation of the digital humans they are working on, for outreach and marketing.

I’ve learned a ton these past few months, and it’s helped me on the technical side as a filmmaker and storyteller.

Robson: I was just finishing up “Okay” when I got the Samsung job. After using Mixamo and animating by hand on previous projects, I really wanted to do something to better capture human emotion.

Arthur had a vision of being in an endless ocean and using that as a metaphor for the relationship he sings about in “Okay.” From a practical standpoint, though, it’s dangerous and expensive to shoot in the ocean, so I figured it was a perfect time to put my new motion capture rig to the test.

He has made a bunch of amazing albums; I love his voice, and his videos are always very artsy. The story is about isolation and acquiescence in this endless void under the ocean. It was a lot to think about because you have to consider reflections and lighting. I did most of the concepting in Houdini, and I got a buddy of mine at HouseSpecial to make an endless ocean. I also made some procedural kelp forests and used Redshift for lighting.
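Robson doesn’t detail his Houdini network, but the procedural idea behind a kelp forest is simple: scatter anchor points across a seabed patch and give each strand a randomized height and sway phase. Here is a minimal Python sketch of that scattering step; the function name and parameters are illustrative assumptions, not his actual setup.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

def scatter_kelp(num_plants, bounds=100.0, min_h=2.0, max_h=12.0):
    """Scatter kelp anchor points across a square seabed patch.

    Returns per-plant (x, z) anchor positions, strand heights, and a
    phase offset a downstream sim or shader could use to vary the sway.
    """
    anchors = rng.uniform(-bounds, bounds, size=(num_plants, 2))  # (x, z) on seabed
    heights = rng.uniform(min_h, max_h, size=num_plants)          # strand lengths
    phases = rng.uniform(0.0, 2 * np.pi, size=num_plants)        # sway offsets
    return anchors, heights, phases

anchors, heights, phases = scatter_kelp(500)
print(anchors.shape, round(heights.min(), 2), round(heights.max(), 2))
```

In a real Houdini setup the same role is played by a scatter node feeding instanced geometry; the point is that every plant is driven by seeded randomness rather than placed by hand.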

Since he’s in Brooklyn and I’m in Portland, and we had a small budget, we collaborated by corresponding back and forth, and it went really well. I had him send me a bunch of photos of himself, and he also recorded his facial-tracked performance with his iPhone. I wanted to get the nuances of how he sings before I tried animating the whole thing in my basement.
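ARKit’s face tracking publishes 52 named blendshape coefficients per frame, and many iPhone capture apps can export them as a spreadsheet. Here is a minimal sketch of loading such an export, assuming a CSV with a header of blendshape names and one row per frame; the file name and layout are assumptions, not details from this production.

```python
import csv

def load_face_take(path):
    """Load a per-frame ARKit blendshape export into a list of dicts.

    Assumes a CSV whose header row holds blendshape names such as
    "jawOpen" or "mouthSmileLeft" and whose values range 0.0 to 1.0.
    """
    frames = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            frames.append({name: float(value) for name, value in row.items()})
    return frames

# Hypothetical usage:
# frames = load_face_take("okay_chorus_take3.csv")
# peak_jaw = max(frame["jawOpen"] for frame in frames)
```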

Robson: I did try not to make a fool of myself, but that was hard since I had almost no keyframes and needed to do just about everything as performance-based capture. I started out with some characters from Mixamo before I got the suit, but once I had it, I would put it on and act out whatever was happening in the scene.

The character does a lot of swimming in the video, so I would lie on top of the yoga ball and do a take, then sit in my chair to capture my arm movements, and then put it all together in another pass. I’m taller than he is, so I had to readjust the rig to look more like him. It was interesting trying to puppeteer the head and body so they read as one performance. The environment was procedurally built around the character, using things like rocks and seaweed from Megascans.
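In pipeline terms, assembling a shot from separate passes like that is a channel-layering merge: the body pass is the base, and selected channels are overwritten by the arm pass. A rough Python sketch of the idea, with hypothetical channel names rather than Robson’s actual rig:

```python
def layer_takes(body_take, arm_take, arm_channels):
    """Combine two capture takes frame by frame.

    body_take, arm_take: lists of {channel_name: value} dicts, one per
    frame, already aligned to the same frame rate and length.
    arm_channels: channel names (say, shoulder, elbow, and wrist
    rotations) to pull from the arm take instead of the body take.
    """
    merged = []
    for body_frame, arm_frame in zip(body_take, arm_take):
        frame = dict(body_frame)                  # start from the yoga-ball pass
        for channel in arm_channels:
            if channel in arm_frame:
                frame[channel] = arm_frame[channel]  # overwrite with the chair pass
        merged.append(frame)
    return merged
```

Real retargeting tools add blending and time alignment on top of this, but the core operation is the same overwrite.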

Even if I choose to animate in Unreal or Houdini, Cinema 4D is always my go-to for previz because the cameras are great and the process of exploring shots and experimenting goes quickly, which is really important. So that’s where I start, and then I export to Houdini or whatever program I finish in.

For this, I used Redshift in Houdini for lighting, texturing and rendering, and Nuke for compositing. Magic Bullet Looks helped me get the film emulation I wanted on top of everything. Editing and final color were done in Blackmagic’s DaVinci Resolve.

Robson: I got a photogrammetry scan of Arthur and stitched that onto a body. Then I did that whole rewrapping process, using the perfect head I had all set up and morphing it into the shape of his head.
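Dedicated wrapping tools solve this with hand-picked correspondences and smoothness constraints, but the heart of the morph is pulling each vertex of a clean template head toward the nearest point on the scan. A toy version of that step, assuming both meshes are simple numpy point arrays rather than any particular wrap software’s format:

```python
import numpy as np
from scipy.spatial import cKDTree

def wrap_template(template_verts, scan_points, blend=1.0):
    """Crude stand-in for a wrap/retopo morph.

    template_verts: (N, 3) array of template head vertex positions.
    scan_points: (M, 3) array of points sampled from the scan surface.
    blend: 0.0 keeps the template, 1.0 snaps fully onto the scan.
    """
    tree = cKDTree(scan_points)
    _, nearest = tree.query(template_verts)   # index of closest scan point
    return (1.0 - blend) * template_verts + blend * scan_points[nearest]
```

A production wrap also preserves the template’s UVs and rig, which is exactly why the morph runs in this direction: the clean head keeps its setup while taking on the scan’s shape.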

It's very difficult to capture the range of human faces. You have to use hundreds of blend shapes sometimes to get the right expression. But it is impressive that you can do that on the consumer level now. I’m really excited to see where the technology goes.
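The underlying math is simple even when the shape count isn’t: each blend shape stores per-vertex offsets from the neutral pose, and the final face is the neutral mesh plus the weighted sum of those offsets. A small, self-contained example of that linear evaluation:

```python
import numpy as np

def apply_blendshapes(neutral, targets, weights):
    """Classic linear blendshape evaluation.

    neutral: (V, 3) array of rest-pose vertex positions.
    targets: dict of name -> (V, 3) sculpted target positions.
    weights: dict of name -> coefficient in [0, 1], for example
             straight from ARKit's per-frame values.
    Result: neutral + sum of w_i * (target_i - neutral).
    """
    result = neutral.copy()
    for name, weight in weights.items():
        result += weight * (targets[name] - neutral)
    return result

# Tiny demo: one triangle whose "jawOpen" target drops a single vertex.
neutral = np.zeros((3, 3))
targets = {"jawOpen": np.array([[0, 0, 0], [0, 0, 0], [0, -1, 0]], float)}
print(apply_blendshapes(neutral, targets, {"jawOpen": 0.5}))
```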

Arthur was pretty blown away by the process, though there was that weird moment when he saw a digital version of himself and had that feeling that it was him but not him. We’re similar on a creative level, but he put a lot of trust in me. His stuff has always been live action and more arthouse style. This process allowed us to tell a story that would have been impossible to do through live action.

Robson: The reaction has been great. People have talked about how they haven’t seen anything like it. I think people are pretty excited about different ways to tell stories and represent people, especially on an indie level.

But this wasn’t tech for tech’s sake. It was a solution because we couldn’t work together in real life. And I don’t think driving the process with data will replace traditional character animation. I mean, Pixar movies are so beautiful, and each approach has its own place.


Author

Meleah Maynard, Writer/Editor – Minneapolis, Minnesota