Using VFX to Tell Magical Stories
Barnstorm explains the subtle and not-so-subtle VFX they created for "The Order".
COVID-19 seems to have brought many popular series to a premature end. Among the recent casualties was the Netflix horror-drama, The Order, which was cancelled late last year. Barnstorm VFX’s Vancouver team worked on season two of the series, using Maya, Houdini, Substance Painter, Redshift and more to create both elaborate VFX and effects that were so subtle they were invisible to most viewers. “It’s a cool show for VFX artists to work on because it’s so magical,” says Barnstorm’s Gregory Watkins, who served as the vendor-side VFX supervisor.
Founded by Cory Jamieson and Lawson Deming in 2011, Barnstorm VFX is known for its work on many high-profile series, including Ted Lasso, The Man in the High Castle, Silicon Valley and Project Blue Book.
We talked with Jamieson, Deming, Watkins and Chun Seong Ng, Barnstorm’s CG supervisor, about the studio’s work on The Order, as well as the team’s approach to complex productions.
Tell me about how Barnstorm got started and what makes your studio stand out.
Deming: Cory and I met when we were both independent VFX artists who mostly worked in-house on TV shows. After a couple of years, we decided to start our own company. We called it Barnstorm because, until then, we had been “barnstorming” on shows, and we wanted to pay homage to that time in our name. Also, Cory comes from a family of pilots, so it seemed fitting to have the aviation theme.
Once the company started growing, we moved to an office in Burbank. We’ve made a few more moves since then, and we also opened an office in Vancouver two years ago. I think we’ve got about 80 people in our LA office now, and we’re working on much larger TV shows, features and installations than we used to.
Jamieson: What makes us unique is having been on the show side in post with the editorial teams. We know how to put a show’s needs above our own. We always try to see things from a production perspective. It sounds cliché but it is really true, and it makes a difference in how creative conversations evolve. We realize that we are part of a broader creative vision, so what's right for us, or what makes for a flashy VFX shot, is not always what's best for the story.
How do you approach projects, and what does your workflow look like?
Deming: We are constantly looking for ways to adapt. We don’t have entrenched workflows and pipelines, and that nimbleness is rare. I think that’s allowed us to punch above our weight class and get work on the cutting edge of VFX. In terms of our interaction on projects, we benefit a lot from our collective knowledge. There's this impression that people who work in VFX are all computer geniuses who know how to do things immediately but, really, we spend a lot of time in meetings figuring things out.
Ng: It helps that we’re not a new team. Gregory and I come from a feature film background, and we worked together for six years at another VFX vendor. On top of that, I was able to bring in some of my team members from my previous venture, which made the transition less challenging. My CG lead, Stefan Schneider, and I started experimenting with and integrating Redshift into production about three years ago, so we felt confident carrying it into our new pipeline. Other vendors in town are also using Redshift more, which makes sharing assets much easier and more straightforward.
Tell us about some of your favorite VFX from season two.
Watkins: We all enjoyed working on the show as there was a nice mix of invisible and magical VFX. We get a great deal of satisfaction from pulling off well-executed VFX that the audience doesn’t even suspect are there, but it’s also a blast to do the in-your-face effects required for a show where the characters delve into magical dark arts in their battle against werewolves.
Our team started each episode by doing a spotting and brainstorming session with Ryan Curtis, the production VFX supervisor, and production VFX producer Geena Renk. Usually, they had a pretty good idea of what they were looking for, but sometimes we were able to come up with new ideas together to make an effect even cooler. There was often a mix of photoreal magic, as well as your glowy, ethereal effects.
In the first episode, for example, a spell is cast on a student who spontaneously combusts from the inside. Beams of hot light shoot out of his mouth, nose and eyes. We first thought we’d tackle it as a 2D shot, but we wanted to see the correct structure of his skull, jaw, muscles and veins, so it became clear that we needed 3D to push the look further. We matchmoved and rotomated a generic skull and head and then placed a spotlight inside.
The match was not perfect, but some warping in comp fixed that. We rendered two versions with subsurface scattering in Redshift, one with the skull and one without, so the compositor could mix between them where needed. There was a lot of experimentation between the compositor, Anthony Peck, and CG artist Nikhil Patil. Curiously, it seemed to work better to turn off the visibility of the eyes in the model. When the effect starts, the 3D teeth and inner cheek were bathed in light and clearly visible through the open mouth, which Anthony warped into place for a better fit in the live-action plate.
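The idea of mixing two renders where needed can be sketched as a per-pixel blend driven by a grayscale matte. This is a generic illustration of the compositing technique, not Barnstorm's actual pipeline code; the function name and toy data are hypothetical.

```python
import numpy as np

def mix_renders(with_skull, without_skull, matte):
    """Blend two RGB renders per pixel using a grayscale matte.

    Where the matte is 1.0 the skull render shows through; where it
    is 0.0 the plain render is kept. (Illustrative helper only.)
    """
    matte = matte[..., np.newaxis]  # broadcast the matte over RGB channels
    return matte * with_skull + (1.0 - matte) * without_skull

# Toy 2x2 RGB "renders": a fully lit skull pass and a black plain pass.
skull_pass = np.ones((2, 2, 3))
plain_pass = np.zeros((2, 2, 3))
matte = np.array([[1.0, 0.0],
                  [0.5, 0.0]])

out = mix_renders(skull_pass, plain_pass, matte)
# The top-left pixel is pure skull pass, the bottom-left a 50/50 mix.
```

In a real comp this is the same math a merge or keymix operation performs; the artistry is in painting and animating the matte.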
Another good example from episode one is where a student on campus has been turned into a stone statue. The statue tips over and breaks into chunks that transition from stone to flesh. Production created a very realistic dummy and shot a B plate of it laid out on the ground, broken into chunks of body parts, guts and gore. When we cut into the sequence, the student was already mostly turned to stone and you see the statue from various angles.
Since it’s a 4K show, we wanted to make sure there was a high level of detail and achieve the specific look production was after—a whitish-alabaster with the right amount of marbled veins. Production provided photo references of the actor standing in the statue pose, which the team modeled, sculpted and textured in Maya, ZBrush and Substance Painter.
Since the shot ends in a transition to the practical dummy, we had to closely match the clothing and basic look of the character. The shaders and lighting were developed for Redshift, whose incredible render speed allowed us to iterate often and quickly, getting turntable renders and test comps in front of Ryan to inch the look closer and closer to what they envisioned.
Ng: We used Houdini for the dust simulation when the statue shatters. The most challenging thing was when the statue turned back into a person after the magic wore off. Instead of stone pieces, there were all of these practical body parts everywhere. It was pretty gory, and we had to mix some hand animation on top of the simulation to make the transition to the live action work.
Any other scenes you’d like to tell us about?
Watkins: For episode four, we were asked to create a vision of a future apocalyptic world that would take the place of the college campus featured in the show. The shot begins tilted down and focused on a bronze plaque lying half buried in windswept dirt, rock and bones. It’s recognizable as being from a monument that you see in the show, so viewers know it’s the campus. The moment is punctuated by the foot of a werewolf named Silverback, who stomps into frame and looks around before running off.
Due to the fantasy nature of the environment, we created the shot entirely in CG. We started with concept art to make sure we were in the ballpark of what Ryan had in mind. Our CG artist, Sang Myeon Park, used photo references and Google Maps to help establish the basic topology of the immediate landscape and the layout of the main structures in Maya.
We knew that everything beyond the cliff could be accomplished with a matte painting, but to ensure it lived in the same world as the rest of the CG, the basic structures and landscape were lit and rendered in Redshift. That served as a starting point for the matte painter, who took the detail and atmosphere to the next level using Blender and Photoshop. The matte painting was projected on top of the original geometry in Nuke and rendered using the shot camera, so it tracked with the rest of the CG.
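At its core, camera-projection mapping works by firing each vertex of the geometry through the shot camera to find which pixel of the painting it should sample. Here is a minimal sketch of that math, assuming an idealized pinhole camera at the origin looking down +Z; all names and values are illustrative, not production data.

```python
import numpy as np

def project_to_image(points, focal, width, height):
    """Pinhole-project 3D points (camera at origin, looking down +Z)
    into pixel coordinates. Those pixel coordinates become the UVs
    used to sample the matte painting on the geometry."""
    pts = np.asarray(points, dtype=float)
    u = focal * pts[:, 0] / pts[:, 2] + width / 2.0
    v = focal * pts[:, 1] / pts[:, 2] + height / 2.0
    return np.stack([u, v], axis=1)

# Two vertices of some distant cliff geometry, 100 units from camera.
verts = [(0.0, 0.0, 100.0), (10.0, 5.0, 100.0)]
uvs = project_to_image(verts, focal=50.0, width=1920, height=1080)
# A vertex on the optical axis lands at the image center (960, 540).
```

Because the projection uses the same camera that rendered the shot, the painted pixels stay glued to the geometry as the camera moves, which is why the matte painting tracks with the rest of the CG.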
The number of polygons needed to achieve all that detail was insane, but Redshift munched through it like a champ. We were often able to get new versions of renders into the compositor’s hands the same day on our modest local GPU farm. When production demands it, we can easily scale up to Amazon cloud rendering, as we sometimes do, but with Redshift’s speed, a dozen or so graphics cards is often all we need.
What are you working on now?
We are currently working on several things: season two of Netflix’s Another Life, Big Sky on ABC and CW’s Superman & Lois.