By FABIO BONVICINI and ASHTON LOUIE

This spring, the world-famous Hellboy series was rebooted and reimagined in Neil Marshall’s Hellboy (2019) from Lionsgate and Millennium, and with it came a new horde of dangerous, gigantic and dynamic creatures – all dead-set on destroying mankind. Celebrating 25 years of multimedia success, the Hellboy franchise has a long history of memorable characters. And now, in the golden era of supernatural cinematics, the need to achieve even more lifelike results presented an inevitable production challenge.

Technicolor’s VFX studio Mr. X won the bid to deliver five of the film’s key monsters, including a bipedal boar known as the ‘Gruagach,’ a gang of weapon-wielding giants, and a human/wild cat ‘werejaguar.’ The creature experts at Mr. X knew that flexible, scalable and fast character tools would be needed to successfully deliver such an array of monsters in such a short time. The team chose to use Ziva Dynamics soft-tissue and physics-simulation software, Ziva VFX, to bring the characters to life.

In this article, we look at some of the unique challenges of each Hellboy character and discuss how the Mr. X team leveraged Ziva Dynamics technology to overcome each production obstacle.

THE GRUAGACH

Final render of the muscular Gruagach.

The team kicked off their character work with the 2.5-meter-tall Gruagach. It took several months before the final Gruagach look was approved, but, ultimately, he took the form of a bipedal boar, with the physique of a human bodybuilder and the skin of a rhino. Mr. X built a full-body Ziva simulation, complete with an anatomical muscle system and a thick fat/skin layer, to bring the monster to life.

1. High-Quality Muscles

Gruagach: Ziva muscle and bone simulation pass.

The Mr. X modeling and rigging teams worked closely to achieve a lifelike muscle system that captured the intensity of the Gruagach. We weren’t really fazed by the numerous look changes. We were able to quickly update the geometry in the Ziva simulations and see results within hours. So, even though there were changes in the creative direction, we only needed one iteration of the Gruagach sim for the final render – everything else was just minor tweaks for greater accuracy.

2. Unnatural Fat/Skin

Gruagach: Ziva fat/skin simulation pass.

The Gruagach’s fat and skin layers were then simulated in a single pass using the fat-wrapping tools released in Ziva VFX v1.4. This anatomy layer posed a unique challenge: it needed to be extremely thick, similar to rhino hide, yet compatible with a fast-moving, human-sized body – an unnatural pairing. The team manipulated the soft-tissue parameters in Ziva to carefully increase the skin physics without compromising other meaningful details, like sliding, mobility or muscle definition. Essentially, they achieved the desired result: a skin that was unnatural in composition, yet completely natural looking.
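
Ziva exposes these soft-tissue properties as attributes on Maya nodes, so tuning of this kind can be scripted as well as set interactively. Below is a minimal sketch of the idea; the node names and values are hypothetical, while the attributes are the standard ones on Ziva’s zMaterial and zAttachment nodes.

```python
from maya import cmds

# Hypothetical node names from a Gruagach-like fat/skin setup.
SKIN_MATERIAL = 'gruagach_fatSkin_zMaterial'
SKIN_ATTACHMENT = 'gruagach_fatSkin_zAttachment'

# Stiffen the thick hide without touching the muscle setup beneath it.
cmds.setAttr(SKIN_MATERIAL + '.youngsModulus', 2.0e5)       # illustrative stiffness
cmds.setAttr(SKIN_MATERIAL + '.volumeConservation', 500.0)  # resist volume loss

# Keep the hide sliding over the muscles so mobility and
# muscle definition survive the extra thickness.
cmds.setAttr(SKIN_ATTACHMENT + '.attachmentMode', 2)  # 1 = fixed, 2 = sliding
```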

THE GIANTS

Final render of the 8-meter-tall Axe Giant (left) and Club Giant (right).

Next on the chopping block were the 8-meter-tall giants who attack Hellboy in a major 3-on-1 fight scene. This destructive, ogre-esque species is notorious for its enormous size and for adorning itself in metal scraps and rags. For this reason, the complexity varied significantly between the three giants. The Axe Giant and Sword Giant each wear approximately 150 pieces of cloth covering the majority of their massive bodies. Meanwhile, the Club Giant is almost entirely nude and has the longest continuous animation sequence of all five characters, at over 1,050 frames. The Mr. X team committed to a full Ziva simulation process for the complex, anatomy-intense Club Giant shots. Conversely, they augmented a cloth-focused workflow with underlying Ziva sims for the two remaining giants.

1. Rapid Anatomy-Warped Muscles

Club Giant: Ziva muscle simulation pass with armor bone layer.

Recreating the biomechanics of massive creatures is challenging, even for the most seasoned artists. Aware of this common obstacle, the Mr. X team strategically circumvented all the guesswork by using Ziva’s built-in anatomy scripts. These developer tools let the team quickly transfer the existing Gruagach muscles to fit the colossal Club Giant mesh. Although the Club Giant is over 3x the size of the Gruagach, the warping scripts effectively transferred the entire muscle system, along with the full simulation setup, including attachments, muscle commands, parameter settings, and more. We were grateful that we could scale up the tissues so easily and achieve a strong starting point for the giants so quickly. This scalable approach made it possible to complete the complex Club Giant rig in less than one week!
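
Ziva’s open-source zBuilder Python module for Maya supports exactly this kind of retrieve-and-rebuild transfer. The following is a sketch of the workflow, not Mr. X’s actual scripts; the naming convention and file path are hypothetical.

```python
import zBuilder.builders.ziva as zva

# Capture the Gruagach's complete Ziva setup: tissues, bones, attachments,
# materials, fibers and all of their parameter settings.
builder = zva.Ziva()
builder.retrieve_from_scene()
builder.write('/jobs/hellboy/gruagach_anatomy.zBuilder')

# Rebuild the same setup onto the (already warped) Club Giant geometry.
builder = zva.Ziva()
builder.retrieve_from_file('/jobs/hellboy/gruagach_anatomy.zBuilder')
builder.string_replace('^gruagach_', 'clubGiant_')  # remap node names by prefix
builder.build()
```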

2. Material-Varying Fat/Skin

Final output of full anatomical Ziva simulation.

Individual fat/skin passes were then simmed for all three giants. The paintable Ziva Materials made it easy to create lifelike deformities on the giants. The large tumor on the top right shoulder of the Club Giant, for example, ended up having a Young’s Modulus of 21, which gave it an elasticity/stiffness similar to gelatin and brain matter. Conversely, the rest of the monster’s fat/skin was set to 106, which is close to rubber. We really wanted the bulbous growths to look soft yet dense, so they couldn’t just hang like the rest of the skin. Once we found the right material values, we plugged them into Ziva’s material parameters and painted them on. It was really fast. Without Ziva, everything would have looked uniform and not nearly as realistic.
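
As a rough illustration of how a painted material layer like this is set up, the snippet below adds a second zMaterial to a tissue and gives it a much lower stiffness than the base material. Mesh and node names are hypothetical, and the values are textbook anchors rather than the production numbers (gelatin and brain matter sit around 10³ Pa, soft rubber around 10⁶ Pa).

```python
from maya import cmds

# Add an extra zMaterial layer to the selected tissue; its influence is then
# weight-painted over the tumor region. (Return-value handling may vary by
# Ziva plugin version.)
cmds.select('clubGiant_fatSkin_geo')
tumor_material = cmds.ziva(m=True)[0]

# Textbook stiffness anchors: gelatin/brain ~1e3 Pa, soft rubber ~1e6 Pa.
cmds.setAttr(tumor_material + '.youngsModulus', 1.0e3)         # soft tumor
cmds.setAttr('clubGiant_base_zMaterial.youngsModulus', 1.0e6)  # rubbery hide
```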

3. Cloth and Metal Compatibility

The final touches on the giants were their unique cloth and junkyard ornamentations. It’s critical that the weight of apparel, especially hard objects like armor, actively influences any soft tissue it comes in contact with. To replicate the effects of heavy chains and rope on the giants’ bodies, the Mr. X team simulated the hard objects as Ziva Bone objects and applied the Ziva IsoMesh functionality, so the articles would remain non-deforming while applying constant pressure to the underlying fat/skin.
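
The bone half of that setup is only a couple of commands per prop. Here is a sketch with hypothetical mesh names (the IsoMesh step is omitted, as its interface isn’t shown here): a prop is registered as a rigid Ziva bone, then attached to the tissue it presses against.

```python
from maya import cmds

# Register a chain prop as a non-deforming Ziva bone.
cmds.select('axeGiant_chain_geo')
cmds.ziva(b=True)

# Attach the tissue to the bone so the prop presses into,
# and slides against, the underlying fat/skin during the sim.
cmds.select('axeGiant_fatSkin_geo', 'axeGiant_chain_geo')
cmds.ziva(a=True)
```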

WEREJAGUAR

Final render of Werejaguar in human-to-wild cat mutation scene.

The fifth and final simulated character for the Mr. X team was the human-turned-wild cat ‘werejaguar.’ The team primarily focused on the ominous mutation scene where audiences are first introduced to the werejaguar. In this shot, the character initially stood upright during his transformation, similar to the conventional werewolf form. However, as is tradition with most major motion pictures, the creative direction changed and, unexpectedly, the team needed to adapt their biped into a quadruped to capture the beastly essence of the shot.

1. Adapting an Existing Rig

In a rush to the finish line, the Mr. X team opted to adapt the muscle rig from an existing were-creature that they had built for a previous blockbuster title. After updating the geometry, they simply turned the entire non-Ziva rig into a large group of Ziva Bone objects. Although this eliminated any anatomical muscle firing (since no muscles were actually used in the setup), the creature experts at Mr. X knew that this rapid approach would be compatible with the werejaguar’s permanently tense disposition in the scene.
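
A sketch of that bulk conversion, assuming a hypothetical rig group name: every render mesh under the legacy rig becomes a rigid Ziva bone in one call, so the animation drives the simulation directly with no muscle activation involved.

```python
from maya import cmds

# Gather every mesh under the legacy were-creature rig...
shapes = cmds.listRelatives('werejaguar_rig_grp', allDescendents=True,
                            type='mesh', fullPath=True)
transforms = list(set(cmds.listRelatives(shapes, parent=True, fullPath=True)))

# ...and turn the whole bundle into non-deforming Ziva bones.
cmds.select(transforms)
cmds.ziva(b=True)
```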

2. Layered Skin and Bones

Werejaguar: Ziva fat/skin simulation pass.

The team then simulated a thin Ziva fat/skin pass, which would lie directly on top of the Ziva Bones. Once animated, the large bone layer naturally drove the custom dermal layer to produce the detailed secondary dynamics they needed. This unconventional use of Ziva Bones highlights the durability and versatility of the technology. This was the first time we ever turned a rig into one big bundle of bones, but we knew we needed to think outside of the box to hit the deadlines. We were very pleased with the result. The werejaguar looks strong and the skin looks very dynamic and stretchy, just the way you’d expect. It’s rare to find tools that let you have so much control while remaining faithful to the results.

3. Added Wrinkle Pass


By IAN FAILES

An attendee tries out a VR installation at FMX 2019. (Images courtesy FMX. Photographers: Dominique Brewing and Luzie Marquardt.)

If you didn’t make it to FMX this year, don’t worry, we’ve wrapped up all the major presentations and events from the week. The conference, which hails from Stuttgart’s Filmakademie Baden-Wuerttemberg and Animationsinstitut, seems to go from strength to strength with each edition. It drew 4,000 visitors during the week, and it continues to be a destination for professionals and studios not just from Europe but from all over the world.

Watch FMX’s highlights video from the conference.


Getting a fix on the major releases

Audiences were hungry to catch the big presentations at FMX on the latest films and animated releases. This meant the rooms were full for talks on Avengers: Endgame, Captain Marvel, Alita: Battle Angel, The Meg, Hellboy, Christopher Robin, Dumbo, Mortal Engines, Solo: A Star Wars Story, First Man, Bird Box, Spies in Disguise, How to Train Your Dragon, and many other films and television shows.

The rooms were also full for special insights on digital humans, virtual production, ‘old-school’ visual effects chats, concept art, and pretty much anything on Spider-Man: Into the Spider-Verse. Indeed, if there’s one challenge FMX faces, it’s having enough space to host eager attendees. This was handled well, however, with overflow areas, plus screens and an app that showed which rooms were filling up.

The new ‘Get Together’ space at FMX was a popular end-of-day activity.


New and cutting edge

It’s not an academic conference, but with major drawcard speakers and sessions, FMX always offers up new insights into computer graphics and visual effects. There was a special track this year on light fields, for example, plus sessions on ray tracing and stylized lighting and rendering. You don’t really get to see talks curated like this anywhere else. One other benefit was a very relaxed atmosphere to go up to the speakers afterwards to keep the conversation going.

In the immersive media (VR/AR/XR) space, several speakers and exhibitors showed off projects, with attendees able to don headsets or glasses to view them. Both Unreal and Unity each had a presence at the event, and real-time rendering was one of the topics discussed regularly all week. Magic Leap’s John Gaeta also offered up his thoughts on the future in this area, with a discussion of ‘the Magicverse.’

ILM’s Rob Bredow presented on Solo: A Star Wars Story and was part of several other talks and panels during the week.


Access and diversity

The theme for this year at FMX was ‘Bridging the Gap.’ This represented both areas of technology and access to the industry. Diversity has become an important part of the FMX conference. This year, Women in Animation presented a panel talk called ‘Focusing Female Firepower: The Path to Inclusivity’ with LEGO Movie 2 co-director Trisha Gum, Animal Logic’s Sharon Taylor, Hahn Film AG’s Imke Fehrmann and Adventure Lab’s Kim Adams.

For students and new entrants into the industry, FMX 2019 was a conference that offered a lot of access to professionals and companies. That came from the 28 companies involved in the job fair, from the 19 universities with booths there, and from several software vendors and VFX studios themselves having individual rooms with presentations. This is one of the big changes in events like FMX and SIGGRAPH – where companies speak directly to artists. It seemed like a good way to find out exactly what actually happens inside a VFX or animation studio.

The ‘Focusing Female Firepower: The Path to Inclusivity’ panel.
Beyond the presentations

There was a lot to do at the conference this year, and also a lot to do away from it. Companies such as Maxon and Foundry had parties, while a major innovation turned out to be the ‘Get Together’ space, right out in front of the venue at the Haus der Wirtschaft. Make no mistake, the German food trucks and beer were hugely popular (even with one major downpour of rain), and the area provided FMX with a much-needed space just for socializing.

And as much as the event caters to students and new artists, speakers – of which there were more than 280 – and other professionals found themselves in a casual environment to ruminate about the industry. That’s actually a major attraction of attending FMX – meeting and greeting other pros from around the world. At other conferences, there’s not always the time and space to simply ‘hang out.’ It was clear this happened a lot in Stuttgart this year.

This wrap-up really only touches on some of what occurred at FMX (not to mention the associated event in Stuttgart at the same time, the Stuttgart International Festival of Animated Film). It might even be time to start planning your attendance for next year. In 2020 – FMX’s 25th year – the event will be held from May 5-8.


By TREVOR HOGG

Pokémon stuffies and puppets were utilized on set for eyelines, composition and lighting references. (Images copyright © 2019 Warner Bros. Entertainment Inc. and Legendary Pictures) 

A consortium of Nintendo, Game Freak and Creatures Inc. established The Pokémon Company in 1998 to manage a global media franchise of pocket monsters that has spawned video games, trading cards, an animé television series, toys, books, music, merchandise and an augmented reality mobile game – and, most recently, a live-action feature directed by filmmaker Rob Letterman (Goosebumps) called Pokémon Detective Pikachu. Despite the effort to shoot on film and to capture as much as possible in-camera, the visual effects supervised by Erik Nordby (Passengers) were essential in enabling humans to interact with the various Pokémon characters brought to life by MPC, Framestore, Image Engine and Rodeo FX.

“Framestore and MPC art departments worked for about a year designing the Pokémon characters,” states Framestore Creative Director of Film Jonathan Fawkner. “The fight between making them look real or more like their original cartoon design was always a tussle. Stage two was about fleshing those ideas out with some movement studies. Stage three was producing pre-render turntables for fully-rendered CG assets. Each one of those stages needed to be signed off by The Pokémon Company in order to get characters approved.”

Jonathan Fawkner, Creative Director of Film, Framestore

Pete Dionne, VFX Supervisor, MPC

Footage was captured on film and in practical locations with London doubling as Ryme City.

“Framestore and MPC art departments worked for about a year designing the Pokémon characters. The fight between making them look real or more like their original cartoon design was always a tussle.”

—Jonathan Fawkner, Creative Director of Film, Framestore

Director Rob Letterman and lead actor Justice Smith discuss a scene while on set.

MPC VFX Supervisor Pete Dionne partnered with colleague Bryan Litson to conceptualize, create and execute 40 out of the 60 Pokémon characters made for the movie. “Pre-production was a busy time trying to push through this necessary volume of character design,” recalls Dionne. “Especially for MPC’s art department, which worked on the character concept development, and Character Asset Supervisor Dan Zelcs and his team, who then had to build them all!”

MPC and Framestore created their own models of Detective Pikachu to suit the story needs of their scenes. 

“Pre-production was a busy time trying to push through this necessary volume of character design. Especially for MPC’s art department, which worked on the character concept development, and Character Asset Supervisor Dan Zelcs and his team, who then had to build them all!”

—Pete Dionne, VFX Supervisor, MPC

Cubone was one of the 60 Pokémon characters created for the movie.

“We began the project with a tremendous amount of respect towards the original source material for the Pokémon designs, but also for the challenges that we faced in transforming them into living, breathing creatures in a photorealistic world,” remarks Dionne. “These adorable Pokémon characters all had the potential to turn grotesque quickly with the slightest design misstep, so we dedicated a lot of time, budget, and energy to get it right. For all of the successful Pokémon that you see on screen, we also have a few abominations buried deep in the cellar which will never see the light of day!”

The playing cards and animé television series were referenced when designing Psyduck.

“These adorable Pokémon characters all had the potential to turn grotesque quickly with the slightest design misstep, so we dedicated a lot of time, budget, and energy to get it right. For all of the successful Pokémon that you see on screen, we also have a few abominations buried deep in the cellar which will never see the light of day!”

—Pete Dionne, VFX Supervisor, MPC

Charizard and Omar Chaparro as Sebastian. 

A matrix was created for the hero characters based on their proximity to the camera and the level of articulation required. “If it was from A to C of resolution, they were going to have high-resolution textures and a lot of development in their shading,” notes Fawkner. “But maybe they didn’t have big story points, which meant that there wouldn’t be a lot of character development in the animation. But there were other characters like Pikachu where we had to work on him outside of shot context for a long time and look at the various activities that we thought made sense. Something as simple as having him sit at the bar. MPC led on that, but our Pikachu had a different service to the story than theirs did.”

Justice Smith as Tim Goodman, Detective Pikachu (Ryan Reynolds), and Ken Watanabe as Lieutenant Hide Yoshida.


By TREVOR HOGG

Images © 2018 Paramount Pictures. 

Ever since filmmaker Brian De Palma directed Carrie in 1976, there has been a long-standing love affair between Hollywood and prolific horror author Stephen King that has resulted in over 60 big-screen adaptations. For the second time, Pet Sematary gets the cinematic treatment under the direction of Kevin Kölsch and Dennis Widmyer, the duo previously responsible for Starry Eyes and Absence. Recreating the world where a mysterious forest burial ground in rural Maine reanimates the dead with demonic consequences required the assistance of Mr. X on 400 shots focusing mainly on digital set extensions as well as a dramatic road accident.

Mr. X VFX Supervisor Damien Hurgon was involved with the two months of prep along with the principal photography in Montreal before overseeing the post-production process, while colleague VFX Producer Mike DiCarlo facilitated the necessary logistics, budgeting and studio relations needed to complete and deliver work on time.

Mike DiCarlo, VFX Producer, Mr. X

Left to right: director Kevin Kölsch, producer Mark Vahradian and director Dennis Widmyer onset. 

A frequent invisible effect was creating the illusion that Jud Crandall (John Lithgow) lives next door to the Creed family, because in reality the two houses were up the street from each other.

“We expected a lot of work to do on the cat, which is one of the main characters. It actually ended up not requiring very much work. … The bigger effects were environment builds for the forest and burial ground which were a mix of 3D and 2.5D. It was interesting because we were trying to find a balance between this creepy supernatural world and photo-realistic live-action visual effects.”

—Mike DiCarlo, VFX Producer, Mr. X

Dealing with Church the cat was not as problematic as integrating the plate photography from two locations for the truck crash that kills the feline.  

“When starting the project, we expected a lot of work to do on the cat, which is one of the main characters,” notes DiCarlo. “It actually ended up not requiring very much work. We fixed a few continuity issues and made them more or less gory in a few places. The bigger effects were environment builds for the forest and burial ground which were a mix of 3D and 2.5D. It was interesting because we were trying to find a balance between this creepy supernatural world and photo-realistic live-action visual effects.”

No previs or postvis were created for the production because there was not a lot of CG character animation. “As Kevin and Dennis got into the editing of the film, we got a comprehensive brief on things,” explains DiCarlo. “They sent us some concept art, real-world reference and some pages from the original novel. Stephen King had written some clear descriptions of what some of these environments should look like. We took all of that and did some concept work which was sent back to them, had a dialogue, and eventually got to final version.”

A final composite of the Creed family home. 

Mr. X sought to have a balance between a creepy supernatural world and photo-realistic live-action visual effects.

“[Directors Kevin Kölsch and Dennis Widmyer] sent us some concept art, real-world reference and some pages from the original novel. Stephen King had written some clear descriptions of what some of these environments should look like. We took all of that and did some concept work which was sent back to them, had a dialogue, and eventually got to final version.”

—Mike DiCarlo, VFX Producer, Mr. X

A small bog was created on a soundstage, which was then digitally extended into a large environment.

Nighttime scenes have advantages and disadvantages. “A lot of detail has to be suggested in there because you don’t want a dark black frame,” states DiCarlo. “You need to have some nuance and depth in the environments. Nighttime movie lighting can be a balance that you need to find as well between a realistic scenario and something that you can still see.” Hurgon worked closely on set with Kölsch and Widmyer, Cinematographer Laurie Rose (Stan & Ollie) and Production Designer Todd Cherniawsky (Splice) to make sure that the visual effects could be integrated with the lighting and photography.

A lot of photogrammetry was taken of the sets and locations in order to reconstruct those environments digitally. “For some of them, like the pet cemetery where the final scene takes place, we had footage shot in a cemetery that had been built on a real location in a forest, and some matched to that on a stage,” states DiCarlo. “We had to reconstruct that environment exactly. We had a bit more latitude in the deep forest and burial-ground settings, which were shot on a stage. A lot of reference was captured in the real forest on location.”

Photogrammetry was taken of the sets and locations in order to reconstruct those environments digitally.

Louis Creed (Jason Clarke) observes Indigenous warnings on the trees while Jud Crandall (John Lithgow) serves as his guide.  

“A lot of detail has to be suggested in [nighttime scenes] because you don’t want a dark black frame. You need to have some nuance and depth in the environments. Nighttime movie lighting can be a balance that you need to find as well between a realistic scenario and something that you can still see.”

—Mike DiCarlo, VFX Producer, Mr. X


By WILL McDONALD

Barnstorm has hit 77x the compute power of its on-premises render farm with the cloud for shows like The Man in the High Castle. (Image courtesy of Barnstorm and Amazon)

When discussing workflows and pipeline architecture in the VFX community, cloud-based technology naturally enters the conversation. Significant advancements in hardware and software have made cloud-based technology more accessible for VFX professionals, but there are still a lot of unknowns and open-ended questions about how the cloud will impact VFX facilities in the short and long term.

Among the major cloud computing providers today are Amazon Web Services, Microsoft Azure, Google Cloud Platform, Adobe, Rackspace, IBM Cloud, Salesforce, Verizon Cloud, Red Hat, Oracle Cloud, and others.

VFX Voice talked to Will McDonald, AWS Thinkbox Head of Business Development, and addressed some of the most common questions about using the cloud for VFX to provide a baseline foundation for future cloud-based VFX workflows.

Will McDonald, Head of Business Development, AWS Thinkbox

The Good Doctor (Image courtesy of ABC and Amazon)

VFX Voice: Let’s say you have already invested in on-premises infrastructure. How would cloud-based technology benefit your studio?

McDonald: In my opinion, one of the truly transformational aspects of using the cloud for VFX is that it allows studios to be much more flexible and dynamic. Capital expenditure (CapEx) shifts to operational expenditure (OpEx), so studios don’t have to worry about hefty upfront costs or managing the logistics of renting or purchasing more machines in a crunch.

Most established studios have purchased a significant amount of hardware, and naturally they want to make the most of those investments. At the same time, there’s a deluge of original content being created resulting in more VFX work. By leveraging cloud-based resources, studios have the ability to take on far more and bigger projects than what they could handle with on-premises resources, while avoiding the upfront CapEx costs associated with physical infrastructures.

With a hybrid cloud model, studios can use existing resources until they hit a point where they need more, then scale into the cloud on a pay-as-you-go basis. And when those cloud resources are no longer needed, they spin back down to zero. It’s a truly elastic way of working, and many VFX studios have successfully used this methodology. For example, Tangent Animation rendered nearly a third of its animated feature Next Gen on the cloud; FuseFX has scaled their render farm 10x with the cloud for The Orville to hit tight deadlines and complete complex work such as a full space battle sequence; Milk Visual Effects used the cloud to scale its render farm 10x to create stormy ocean sequences for Adrift; and Barnstorm has hit 77x the compute power of its on-premises render farm with the cloud for shows like The Man in the High Castle.

More than rendering, the cloud can support an entire studio ecosystem. Virtual workstations can be used to allow remote artists or temporary hires to start working immediately without procuring hardware or finding space to deploy it, or even provide mobility for an entire studio. Using cloud-based storage, at least with AWS, hedges against unforeseen disasters such as floods, fires or hardware failures since the data is backed up redundantly in multiple locations within a given availability zone.

The scale afforded by the cloud is beneficial to studios of all sizes, but mid- to small-sized studios stand to gain a lot with its elasticity. They can work on larger projects, and, in many cases, bid on more projects than their on-premises infrastructure could handle. This applies even to the individual freelancers who can now tap into a vast number of vCPUs, GPUs and RAM that only large studios could previously afford.

The Man in the High Castle (Image courtesy of Barnstorm and Amazon)

“By leveraging cloud-based resources, studios have the ability to take on far more and bigger projects than what they could handle with on-premises resources, while avoiding the upfront CapEx costs associated with physical infrastructures.”

—Will McDonald, Head of Business Development, AWS Thinkbox

VFX Voice: How do I evaluate whether using the cloud is a good fit for my facility?

McDonald: In addition to its elasticity, cloud-based technology makes it generally fast and cost-effective to test new ideas, workflows and infrastructure configurations. You can experiment with different approaches in minutes and only pay for the time spent testing. Integrating with cloud-based resources can typically be done using command-line tools, scripting and programming languages, or a GUI. Additionally, tools can automate the process of building and modifying cloud infrastructure.

VFX Voice: How can VFX facilities forecast and budget for cloud spending?

McDonald: While cloud compute costs vary since cloud options are so scalable, keeping budgets on track ultimately comes down to hitting production goals. With AWS, there are three main resource options: Reserved Instances, On-Demand Instances and Spot Instances. Reserved Instances are for steady-state workloads where virtual machines are purchased for long durations ahead of time, up to three years at a time; On-Demand Instances are virtual machines that are spun up only when needed; and Spot Instances are highly economical resources, but pre-emptible. Cloud rendering becomes very cost-effective when a studio is able to primarily leverage Spot Instances.
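
As an illustration of what requesting that pre-emptible capacity looks like at the API level (render managers such as AWS Thinkbox’s Deadline can automate this), here is a sketch using the standard boto3 library; every identifier below is hypothetical.

```python
import boto3

ec2 = boto3.client('ec2', region_name='us-west-2')

# Ask for pre-emptible render nodes. If the Spot market price rises above
# our cap, AWS reclaims the instances and the farm simply requeues tasks.
response = ec2.request_spot_instances(
    InstanceCount=50,
    SpotPrice='0.40',  # per-instance-hour price cap, USD
    LaunchSpecification={
        'ImageId': 'ami-0123456789abcdef0',  # hypothetical render-node image
        'InstanceType': 'c5.4xlarge',
        'SecurityGroupIds': ['sg-0123456789abcdef0'],
    },
)
print(response['SpotInstanceRequests'][0]['SpotInstanceRequestId'])
```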

Cloud pricing is determined by time, so running 1,000 CPUs for 10 hours or 10,000 CPUs for one hour costs the same; however, getting results faster leads to higher productivity since iteration is quicker, thereby increasing output quality. In my experience, studios that expect to double render farm capacity by expanding to the cloud end up scaling 10x or beyond for shorter durations because the costs are about the same. The more you become familiar with the cloud and understand how it functions, the more accurately you can forecast values and build cloud compute costs into VFX project bids or pass them through to clients.
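
The node-hour arithmetic behind that claim is easy to make concrete. A toy calculation, with an assumed hourly rate:

```python
ASSUMED_RATE = 0.68  # hypothetical USD per node-hour

def render_cost(nodes: int, hours: float, rate: float = ASSUMED_RATE) -> float:
    """Cloud cost scales with node-hours, not with wall-clock time."""
    return nodes * hours * rate

print(render_cost(1_000, 10))  # 1,000 nodes x 10 hours -> 6800.0 USD
print(render_cost(10_000, 1))  # 10,000 nodes x 1 hour  -> 6800.0 USD, 10x faster
```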

Milk Visual Effects used the cloud to scale its render farm 10x to create stormy ocean sequences for Adrift. (Image courtesy of Milk Visual Effects and STX Films)

Adrift (Image courtesy of Milk Visual Effects and STX Films)

Adrift (Image courtesy of Milk Visual Effects and STX Films)

Adrift (Image courtesy of Milk Visual Effects and STX Films)

“The more you become familiar with the cloud and understand how it functions, the more accurately you can forecast values and build cloud compute costs into VFX project bids or pass them through to clients.”

—Will McDonald, Head of Business Development, AWS Thinkbox

Adrift (Image courtesy of Milk Visual Effects and STX Films)

“With a hybrid cloud model, studios can use existing resources until they hit a point where they need more, then scale into the cloud on a pay-as-you-go basis. And when those cloud resources are no longer needed, they spin back down to zero. It’s a truly elastic way of working, and many VFX studios have successfully used this methodology.”

—Will McDonald, Head of Business Development, AWS Thinkbox

VFX Voice: How does leveraging the cloud change how artists work?

McDonald: Usually, cloud rendering resources are exposed to the end user the same way as local render nodes, so except for the naming of the..


By IAN FAILES

The characters leap from the building as a bomb is detonated behind them. (All images copyright © 2019 Fox)

The Season 3 finale of the Fox series Lethal Weapon features an incredible explosive sequence as the character Wesley Cole (Seann William Scott) and his ex-wife Natalie Flynn (Maggie Lawson) find themselves having to leap out a window from a building’s 34th floor – onto a wrecking ball – as a bomb goes off behind them.

The dramatic scene made use of live-action stunt photography at a Downtown L.A. apartment building, greenscreen elements of the hero actors, practical explosions and visual effects by CoSA VFX. VFX Voice asked the stunt, special effects and visual effects supervisors how they worked together to make the shots possible.

Storyboard frame by artist Michael Bayouth.
PLANNING A HIGH-RISE STUNT

Mark Spatny, Visual Effects Supervisor: “One of the guiding principles of the show is that if they can do something practically, they will. They’ve never once asked the question, ‘Would it be cheaper to do this in visual effects? Would it save us time?’ Their thing is, can we do it practically? If so, then they’re going to do it. And if not, that’s when they call VFX in. By doing as much practical as they can, they stay true to the feel of an ’80s action movie, which is really important to everyone involved.”

Brendon O’Dell, Special Effects Supervisor: “What the script called for were two separate scenes, where one was a guy being thrown out of a broken window. The other one was with Cole and his ex-wife; he shoots the window and then they jump out of it as the bomb goes off. Right off the bat we had to tackle how we were going to get that look of them going out of the real windows.”

Mark Spatny: “There was a storyboarding process, and then we went through the boards very carefully, talking about, ‘This shot is a practical shot, this is the rig for it, this is what’s going to have to be removed in post, here are the shots that are going to be greenscreen, what plates do we need, how are we going to shoot them, we’re going to have to shoot them from a drone’ – all those kinds of things.”

Tim Trella, Stunt Coordinator: “That sequence was tough just because we also had to rig the rooftop. We had to put all the wires 40 stories up because when they jumped out the window, the two doubles – the Natalie double and the Cole double – had to have cables on them. So we had to make a system where we could hang and suspend them without dropping them and then retrieve them. When we did it we went through a practical piece of glass that was blown before they go through it, so it looks like they’re pushing the glass right out.”

The greenscreen shoot on the Warner Bros. backlot. This shot features stunt doubles for Seann William Scott and Maggie Lawson.

Close-up on actor Seann William Scott during the greenscreen photography.
STAGING THE SEQUENCE

Mark Spatny: “One of the more complex parts that you might not think about that had to be planned was they had to pull the real windows out, which involved all kinds of specialized people and equipment.”

Brendon O’Dell: “On screen, you see the glass break apart into a million pieces, which is the tempered look. What we did was, we took their glass out, we put ours in, and then tinted it to resemble the look from the outside of the building. But this was actually a functional building, a real building that people live in. It’s a super high-end apartment building. It took the building engineers to approve it, first of all. And it required us asking a professional window company to come out on the window-washer rigs and take out the windows so that we could put our tempered glass in, our special tempered glass that we break for special effects.”

Tim Trella: “My rigging guys worked for three days straight on the rooftops in the pouring rain because it rained the days that we rigged it. We had probably 56 feet of truss up there and winches and cable. And then we hung a weight bag exactly the weight of each person and swung it out to see where it goes and to make sure the system’s safe, and make sure we do what we have to do until we put our stunt doubles out there.”

Brendon O’Dell: “My part of it in practical effects was to allow us to get the interaction of the stunt doubles breaking through real glass. We pyrotechnically broke the glass right before the stunt people hit it so that it gives you that shattered look of all the glass raining down.”

Tim Trella: “I was also the second unit director for the sequence. We had eight cameras going and a drone in the air filming. We rehearsed a couple times with the weight bag, and once everybody had seen how it would work and all the cameras were set, we actually put the action doubles on there and did it for real.”

A crane holding a wrecking ball turns out to be a saving grace for the duo. This shot shows the stunt doubles making the leap.

A final shot in which digital doubles were used to show the characters in peril.
CRAFTING THE FINAL SHOTS

Brendon O’Dell: “To get the part where they’re hanging from the wrecking ball, we went to the Warner Bros. backlot and that was where a fake mocked-up wrecking ball was suspended against greenscreen. In special effects, we did a lot of air movers and wind to simulate them hanging and jumping onto the wrecking ball. Then the stunts department had a system rigged to continue the shot out of them jumping out practically to them landing on the wrecking ball.”

Tim Trella: “Basically what we had was a crane with the wrecking ball on top of that. And then we had the actors for Cole (Seann William Scott) and Natalie (Maggie Lawson) wired up on what you call a travelers system, which is a high line above them, almost like a zip line, out of the shot. And then we controlled them on winches to pull them forward to match the shots of the doubles jumping out of the window.”

Mark Spatny: “One of the big challenges was Natalie’s open-back dress – the harness was fully exposed on that. So we actually had to replace her back in a bunch of shots because the harness was visible there.”

Tim Trella: “Seann and Maggie did a phenomenal job. They did a hell of a job hanging out there on wires. You’re in a harness, hanging up above the crew the whole time for five or six hours. It can be grueling. If they weren’t actors, they’d be incredible stunt people.”

Brendon O’Dell: “The explosion fireball you see as they jump out of the window was done on that backlot of Warner Bros. I did a series of fireballs as elements for VFX against greenscreen that were used to build into the shot.”

Mark Spatny: “The explosion is a combination of real elements and 3D. What’s right behind them is the real fireball that was shot, but then as we’re tapering off to the sides where the windows are exploding, what’s behind the exploding windows is CG because we needed the interaction with the lighting and everything on the glass. So ultimately it’s a combination of 3D debris, 3D glass, practical pyro and CG pyro.”

Brendon O’Dell: “Between stunts, special effects and visual effects, there’s always a great contribution from everyone. We always discuss the best ways to be able to achieve all our effects. So it’s a very close relationship.”



By TREVOR HOGG

Ben Mendelsohn wearing prosthetic makeup portrays Skrull leader Talos, who has the ability to shape-shift into other beings. 

Having previously worked on Thor: Ragnarok and Avengers: Infinity War, Digital Domain DFX Supervisor Hanzhi Tang was no stranger to the Marvel Cinematic Universe. However, having to deal with the invading shape-shifting alien race featured in Captain Marvel was not routine.

“The Skrull transformation was much more creatively challenging than the normal work that we do for a Marvel movie,” Tang recounts. “Usually, you can point to a sky or explosion that you like, but as far as the Skrull transformation it was a creative blank sheet. There is no real-life example. The most difficult part of it was trying to think of transformations in new and interesting ways.”

Inspiration for Captain Marvel co-directors Anna Boden and Ryan Fleck (Half Nelson) came from the iconic prosthetic makeup transformations featured in the horror classic An American Werewolf in London (1981). “It was a painful and visceral experience,” notes Tang. “When we shot the original plates, the directors gave Ben Mendelsohn (who portrays Skrull leader Talos) guidance to make it feel like he was trying to physically manifest this being. Then on the effects side we tried to have the skin changing color like a chameleon and have more physical processes on the surface. The skin is splitting, something is emerging from beneath, and reforming into whatever the new flesh is going to be.” To avoid going directly from the green skin of the Skrull to the pigment of the copied being, a white gel oozes out of the splitting skin. “We toyed with the idea of when a squid changes from blue to orange, and how those chromatophores expand and contract to give you a new color.”

Digital Domain DFX Supervisor Hanzhi Tang

“The Skrull transformation was much more creatively challenging than the normal work that we do for a Marvel movie. Usually, you can point to a sky or explosion that you like, but as far as the Skrull transformation it was a creative blank sheet. There is no real-life example. The most difficult part of it was trying to think of transformations in new and interesting ways.”

—Hanzhi Tang, DFX Supervisor, Digital Domain

Co-directors Anna Boden and Ryan Fleck on the beach with Ben Mendelson discussing the hero transformation between him and a surfer girl.

A change of physical volume needed to be taken into account as the size, weight and shapes of those being copied varied. “Looking at some Skrull background in the comics, there are certain rules that have already been set up,” remarks Tang. “They can change volume by 25%. I don’t think a Skrull can turn into a mouse! For the most part, we tried not to make it too weird for our characters. We kept certain things like the eyeline the same within the transformation and had to figure out what to do with the clothes. Are the clothes part of the character or something else? The clothes would disintegrate and reappear through a different process than the body itself. It was meant to be a technological, not a biological thing.”

Several weeks were spent looking at everything from lobsters molting their shells to crustaceans changing colors, to high-speed footage of popcorn popping, which served as an example of skin splitting and an interior volume bursting out. “We chose our favorite clips and presented those to Anna and Ryan as inspirations, and they chose the things that would be cool if we combined them together,” explains Tang. “Out of the gate, Anna and Ryan were like, ‘Make us feel gross.’ Initially, we did not put any limits on the gore because we could always dial it back. If something was too bloody, you could change the color or not have stuff dripping off so it’s cleaner looking.”

The facial groove lines on the Skrulls guided the transformation process.

A white gel oozes out of the splitting skin to make transition of the skin pigment appear to be more natural. 

Different parts of the body transform at different rates to prevent it looking like a wipe going from head to toe. 

“It was a painful and visceral experience. When we shot the original plates, the directors gave Ben Mendelsohn (who portrays Skrull leader Talos) guidance to make it feel like he was trying to physically manifest this being. Then on the effects side we tried to have the skin changing color like a chameleon and have more physical processes on the surface. The skin is splitting, something is emerging from beneath, and reforming into whatever the new flesh is going to be.”

—Hanzhi Tang, DFX Supervisor, Digital Domain

“We started with good scans of the actress (playing the part of the surfer girl) and a CG version of Talos, so we had our A and B sides of it,” explains Tang. “For the textures, geometry and transformation, we went through hundreds of different simulations. There was a lot of exploration of every kind of procedural texturing or noise, as well as layers and layers of various bubbling effects. The directors liked the idea of the skin splitting along the natural groove lines on the faces of the Skrulls. That helped to guide us as to how this was going to work. In the beginning we tried to figure out if there was a logical process that happened. Whichever character the Skrulls were in the movie, they would go through the same stages of the transformation.”

Complicating rigging and animation was the fact that characters did not freeze while the transformation takes place. “There is this whole rigging and animation system to go along with it that changes size and shape while animating at the same time,” states Tang. “There are some complicated camera moves especially for the scene with the surfer girl. The background is all ocean so you don’t have fixed points that are easily tracked. It was a tricky 90-degree move with the background all sky and moving water. We tried to put tripods in the water, but nothing stayed where it was supposed to be. There was a lot of eyeballing that we had to do to try to get that locked down and also track the body movement of both actors.”

The clothes of the Skrulls disintegrate and reappear through a technological rather than biological process.

“The entire crew was on the beach the whole day and the weather was amazing. There were about a dozen transformations. Most of them had a single character transforming into another. There was one shot where four people are transforming at the same time. On top of that it was a shared shot with Framestore. They were doing the background and we were looking after the foreground. There were some logistical challenges for us. Seeing the hero transformation when Talos transforms into the surfer girl was the highlight of our work.”

—Hanzhi Tang, DFX Supervisor, Digital Domain

The speed of the transformation did not occur uniformly throughout the body. “A lot of work went into the staggering of the skin splitting,” states Tang. “Different parts of the body would transform at various times to keep it alive and interesting.” Several transformations occur throughout Captain Marvel. “We have one long hero transformation early in the movie where you see the whole thing happen. The following transformations needed to fit a quarter of the time that we had for the first one. When you squeeze everything down to 30 or 40 frames it becomes difficult not to make the transformation look like a wipe going from head to toe. That’s where staggering all of the different effects helped to make it still visible and tangible.”

“It was great working with Marvel Studios VFX Supervisor Christopher Townsend (Guardians of the Galaxy Vol. 2), who was collaborative,” remarks Tang. “The shoot was only 10 minutes from our studio in Los Angeles. The entire crew was on the beach the whole day and the weather was amazing. There were about a dozen transformations. Most of them had a single character transforming into another. There was one shot where four people are transforming at the same time. On top of that it was a shared shot with Framestore. They were doing the background and we were looking after the foreground. There were some logistical challenges for us.” Tang adds, “Seeing the hero transformation when Talos transforms into the surfer girl was the highlight of our work.”



By IAN FAILES

The user interface for Mill Mascot, which allows users to transfer their facial and hand performances onto a CG character in real time. (All images courtesy of The Mill)

We’re reaching a time when traditional animation, CG techniques, virtual production, real-time rendering and motion capture are all coming together in useful tools for artists in visual effects and animation. Indeed, we may already be there, as visual effects studio The Mill’s new Mill Mascot system shows.

Mill Mascot is a toolset that allows artists to puppeteer CG characters through hand and facial gestures. Making that possible is a connection of real-time game engine tech, animation tools and motion sensors. The idea is that directors, clients and artists can ‘jump in’ and get creatively involved in the development of a character themselves – live.

VFX Voice asked The Mill about how the Mascot system was developed, how it works and what it has been used for so far.

Hand gestures in Mill Mascot translate to the character in real time.
RAMPING UP ON REAL-TIME ANIMATION PERFORMANCE

Mill Mascot began life at the studio as The Mill’s ‘real-time animation system,’ or RTAS, bringing together software and hardware tools aimed at enabling live animated production. The results were a number of in-house Mill demos and some ‘Vonster’ spots for client monster.com, which saw a hairy creature puppeteered to perform human-like antics.

“The rudimentary beginnings of Mill Mascot allowed characters to be controlled in the very traditional sense of puppeteering by only moving their mouths open and shut,” outlines Jeff Dates, Creative Director at The Mill New York. “We also had to be aware of character design in the early stages, as the system allowed for minimal movement of the body and extremities.”

Since that time, says Dates, Mill Mascot has evolved in its abilities in significant ways. “It now allows for a great deal more flexibility and creativity. Characters are far more dynamic in their look, and the system generates final imagery at a much higher fidelity. We also have much finer control over the movement, from the entire body and extremities down to the eyelids and nuanced facial expressions. This has also greatly been improved by the presence of facial tracking in addition to hand controls.”

Behind those abilities is a combination of Epic’s Unreal Engine, Leap Motion for hand tracking, and an iPhone X for facial tracking. However, The Mill continues to develop and iterate the system so that it will ultimately be software and hardware agnostic.

The Mill New York Creative Director Jeff Dates.

“The [Mill Mascot] system behaves much like a live-action puppet. We’re able to introduce performance controls for facial emoting and hand dexterity for the character. The ability to generate useful animation is limited only by what the character is able to do. So in reality, we found that once the character is ‘active’ and being performed, it’s as alive as any puppet. So we had literally hours of animation from our recording session. Not to mention real candid ‘outtakes.’”

—Jeff Dates, Creative Director, The Mill New York

It’s all part of a push at The Mill to approach visual effects and animation work differently. The studio has the benefit of working on a range of different creative projects, from commercials to shorts to motion graphics and animated pieces. In terms of real-time, The Mill has been at the forefront of this technology, having partnered with Epic on a virtual production demo called ‘The Human Race.’ Here, The Mill’s tracker-covered Blackbird car was utilized to deliver car commercial-like shots for a Chevrolet vehicle demo. A virtual car was composited live over the Blackbird, and could be re-skinned in real-time.

This and other real-time projects are designed to provide flexibility and choice for directors and clients. A much more traditional shooting, CG and animation pipeline requires time for iterations. Real time is intended to allow for quick changes in choices, and a chance to explore more options.

Importantly, The Mill isn’t aiming, just yet, for a full-body motion-capture solution with Mill Mascot. Clearly there are systems capable of doing that incredibly well. Instead, Dates makes a distinction between real-time performance animation and real-time motion capture, with Mill Mascot intended to be a tool that can drive a fun character performance, not necessarily one completely grounded in reality (even though that ability does exist).

The Mill’s Blackbird vehicle, as filmed during ‘The Human Race,’ a virtual production collaboration with Unreal Engine.

The ‘Vonster’ project was one of the first commercial uses for The Mill’s real-time animation performance system.

“At The Mill we use theatre-trained puppeteers and real-time animation artists to inject human essence and personality into our characters, but anyone can take the reins.”

—Jeff Dates, Creative Director, The Mill New York

MILL MASCOT IN ACTION

The ‘Vonster’ spots were some of the first glimpses into what The Mill’s real-time animation system was capable of. Other instances with live-animation performances were seen in ‘Seanna’s Ocean Buddies,’ a Sesame Studios animated short. Then, at a recent conference for HPE (Hewlett Packard Enterprise), The Mill devised a real-time interactive installation with Mascot.

“After creating the CG mascot for HPE’s ‘Tame The IT Monster’ TV campaign, they wanted to re-purpose the asset and use Mill Mascot in order to interact with their audience at the HPE Discover Conference in Las Vegas,” recounts Dates. “We created three activations using Mill Mascot in Vegas. First, the IT Monster performed live on stage, interacting with the HPE CEO during his Keynote presentation and ‘interrupting’ his speech. Because the character was being performed live by puppeteers using Mill Mascot backstage, the IT Monster could also interact with the audience, proving the content wasn’t pre-recorded.

“Additionally, we built two custom booths on the conference floor, one that allowed attendees to play with and interact with the IT Monster, and one that allowed them to puppeteer and animate him themselves. It was hugely rewarding to see people interacting with Mill Mascot on such a large scale and the client was extremely happy.”

Controlling the Vonster monster with hand gestures.

“Using a system like this really allows [clients] to be a part of the process. Not only can they see the action, but they can also direct and give feedback live, and in some cases perform the characters themselves. This adds a whole new level of accessibility to the art.”

—Jeff Dates, Creative Director, The Mill New York

HOW THE SYSTEM WORKS

Dates has been the driving force behind Mill Mascot, a project he says came out of the challenges of many projects he’d worked on thus far. “I’ve always been a storyteller. One of the frustrating things for writers is getting work produced, so I came up with the idea as a way to produce my own stories and sketches to try out new material. If you are familiar with the saying, ‘Fail fast, fail often,’ well, Mill Mascot allowed me to get ideas out of my system and onto the screen quickly. Animation became less precious, and experimentation became the currency.”

To get a character working in Mill Mascot, artists start by generating a high-quality CG character asset using traditional modeling, rigging and texturing tools they are already familiar with. Then they can import the character into Mill Mascot, connecting it to the gestural controls (the Leap Motion controller and iPhone X) and allowing it to be animated and rendered in real-time via Unreal Engine.
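
The Mill’s implementation is proprietary, but the core idea is simple: sample a tracker, remap the reading onto a rig control, and stream it to the engine every frame. The sketch below uses only the Python standard library; the tracker stub, control name, and engine endpoint are all hypothetical stand-ins for the Leap Motion/iPhone X inputs and Unreal Engine listener described above.

```python
import json
import socket
import time

ENGINE_ADDR = ('127.0.0.1', 9000)  # hypothetical UDP listener inside the engine
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def remap(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly remap a tracker reading into a rig-control range, clamped."""
    t = max(0.0, min(1.0, (value - in_lo) / (in_hi - in_lo)))
    return out_lo + t * (out_hi - out_lo)

def read_hand_openness():
    """Stand-in for a real tracker callback (e.g. hand-grab strength)."""
    return 0.5

while True:
    # Drive the puppet's jaw from the performer's hand, 60 updates per second.
    jaw_open = remap(read_hand_openness(), 0.0, 1.0, 0.0, 100.0)
    message = json.dumps({'control': 'jaw_open', 'value': jaw_open})
    sock.sendto(message.encode('utf-8'), ENGINE_ADDR)
    time.sleep(1 / 60)
```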

“The system behaves much like a live-action puppet,” explains Dates. “We’re able to introduce performance controls for facial emoting and hand dexterity for the character. The ability to generate useful animation is limited only by what the character is able to do. So in reality, we found that once the character is ‘active’ and being performed, it’s as alive as any puppet. So we had literally hours of animation from our recording session. Not to mention real candid ‘outtakes.’”

Consideration of the gestural controls is important to then allow a performer to use hand and facial movement to drive the character. “Here at The Mill,” says Dates, “we use theatre-trained puppeteers and real-time animation artists to inject human essence and personality into our characters, but anyone can take the reins.”

So far, the kinds of characters created have appeared as final animation in advertisements and also in live audience situations. But Mill Mascot can also be used as a previs tool or for exploring performances, either by animators within the studio or directors and clients demonstrating what they would like to see.

Live view, as the creation of the Vonster monster comes together.

“This doesn’t make traditional animation redundant, it just provides a new way in which we can animate characters in a world that demands fast and interactive content all the time.”

—Jeff Dates, Creative Director, The Mill New York

WHAT IT MEANS FOR CREATIVES

So what does Mill Mascot offer for directors, clients and artists that is different from the usual way they bring CG characters to life? Dates suggests that often clients do not know or understand what goes on behind closed doors in the world of visual effects.

“However,” the creative director notes, “using a system like this really allows them to be a part of the process. Not only can they see the action, but they can also direct and give feedback live, and in some cases perform the characters themselves. This adds a whole new level of accessibility to the art.

“The key benefit for all parties is creative flexibility,” continues Dates. “With the ability to treat character animation more like a shoot, capturing content and performance live, directors and animators can make creative decisions on the fly, develop new content and ideas as they go, and even capture ‘bloopers.’”

Will it take away the jobs of animators? Not at all, cautions Dates. “This doesn’t make traditional animation redundant, it just provides a new way in which we can animate characters in a world that demands fast and interactive content all the time.”

Artists utilize traditional modeling tools to get their CG characters ready for Mill Mascot.

DEADLY CLASS Goes on a Trip with VFX and Animation

By IAN FAILES

Student assassins enter Las Vegas on an acid trip.

The SYFY series Deadly Class, developed by Rick Remender and Miles Orion Feldsott, is about a private academy where students train to become assassins. That premise gives rise to plenty of extraordinary imagery throughout the show, including a drug-filled assassination journey to Las Vegas during episode 5, “Saudade.” It’s here that the characters experience a full-blown acid trip, represented onscreen by a host of weird and wonderful imagery.

That imagery was crafted via a mix of visual effects, 3D animation and 2D animation. VFX Voice talked to some of the key contributors to find out how the trippy sequence came together.

Watch the Las Vegas acid trip in this clip.

WHAT HAPPENS IN VEGAS…IS CRAZY

The arrival in Las Vegas begins with a car journey past classic neon signs that blend into a swirl of playing cards, poker chips, floating bubbles, a leaping leprechaun and plenty of other psychedelic imagery. Inspiration came from the Deadly Class comics (written by Remender) and the script (written by both Remender and Feldsott).

“With Rick’s and director Adam Kane’s input, we employed FuseFX here in Vancouver to work on concept art in what became scene 15, the drive along the Vegas strip,” details Deadly Class Visual Effects Supervisor Mark Savela. “Rick writes very specifically – things like, ‘Vegas starts to melt’ and ‘kaleidoscope,’ and he also wrote about the leprechaun who turned into Ronald Reagan.”

The dazzling array of imagery included playing cards and poker chips.

“[Show co-creator] Rick [Remender] writes very specifically – things like, ‘Vegas starts to melt’ and ‘kaleidoscope,’ and he also wrote about the leprechaun who turned into Ronald Reagan.”

—Mark Savela, Visual Effects Supervisor

“We said to the artists,” adds Savela, “‘Come up with the wackiest, craziest stuff that you can, and it probably won’t be far enough.’ And a lot of times people were coming up with weird, trippy drug imagery and we would get it and go, ‘Okay, cool, let’s move further with that.’ It became a thing where we wanted to really challenge the artists and really let them play in their medium. And a lot of stuff that came back was amazing right from the get-go.”

While that acid trip arrival into Vegas is the centerpiece, the entire episode included a number of visual effects sequences – 198 shots in total – that would be shared between FuseFX, CVDVFX, Zoic and One. Six One Eight. “We sent the work out for bidding early,” relates Deadly Class Visual Effects Producer Kerrington Harper. “People started turning over versions of the shots quite quickly, which was really helpful.”

A leprechaun was one of the CG creations made for the sequence.

“We said to the artists, ‘Come up with the wackiest, craziest stuff that you can, and it probably won’t be far enough.’ And a lot of times people were coming up with weird, trippy drug imagery and we would get it and go, ‘Okay, cool, let’s move further with that.’ It became a thing where we wanted to really challenge the artists and really let them play in their medium. And a lot of stuff that came back was amazing right from the get-go.”

—Mark Savela, Visual Effects Supervisor

Vancouver stood in for Las Vegas for the live-action plates in the episode. Shooting in October meant there was a high risk of constant rain in the British Columbia city. “But,” says Savela, “we got so lucky on the shoot, and it was actually uncharacteristically sunny for the whole time we shot. Then we finished the last outdoor shoot, and we went into the studio, and the minute we loaded into the studio it started raining, and I don’t think it stopped raining for the next three weeks!”

ANIMATION TRIP

In addition to visual effects imagery for the Vegas acid trip, animation studio Polyester was brought on board to deliver a sequence of 2D and 3D animation representing what the characters were hallucinating about. Polyester started with the script and broke the sequence down into eight sections.

The first animated segment was CG and represented the car passengers.

“The script itself was so crazy that we really almost put the script on steroids and had to decide how weird can we make this weird script. We’d say, ‘Well, what happens if the kids were not just blindfolded but they’re missing teeth and then what happens if we go through the mouths of the kids?’ and we just kept kind of pushing it as far as we could.”

—Jeremy Dimmock, Creative Director, Polyester

“From that,” explains Polyester Creative Director Jeremy Dimmock, “we started researching different styles for each section. We put together a massive amount of mood boards for each section and then bounced those off the show creators and the producers to see what was working for them and what wasn’t working. It also allowed us to bounce possible color palettes for each scene off them. And it gave us kind of a road map for establishing the look as we went forward into production.

“The script itself was so crazy that we really almost put the script on steroids and had to decide how weird can we make this weird script,” adds Dimmock. “We’d say, ‘Well, what happens if the kids were not just blindfolded, but they’re missing teeth, and then what happens if we go through the mouths of the kids?’ and we just kept kind of pushing it as far as we could.”

Psychedelic 2D animation helped tell the acid trip story.

“Everybody was just so excited to have such creative freedom on this.”

—Robyn Smale, Producer, Polyester

Polyester had five or so months to work on the animation. “Everybody was just so excited to have such creative freedom on this,” states Polyester Producer Robyn Smale. “Everybody was really pretty great with staying on target and doing what they needed to do because everybody was so excited to see it come together. There is such an enormous amount of talent from the team we had working on this, and they each brought something so awesome to the table.”

Those separate sections utilized different animation and textural techniques, with each scene using a different tool. “That’s really what gives it a very different style and look for each scene,” says Dimmock. “The first scene is pretty much traditional Maya for animation and then rendered in Redshift with some modelling in ZBrush. The second scene was straight Toon Boom style animation. The third one is cel animation before we went back into 3D with Cinema4D. Scene six combined C4D with cel animation. I mean, there was everything. Every shot and every scene was totally different, and every palette was different, and that was kind of the overall instruction – to change as much as we could in between each of the scenes.”

Part of the darker playground section of the animated sequence.

One of Dimmock’s most memorable sections was the playground scene, which has a 2D noir style but was crafted in 3D. “We had just done a couple of scenes in 2D, so we took a stab on 3D, and it really captures the mood. At that point in the trip it goes dark – super dark – really quick. We were impressed that we could actually capture how emotional, heavy, dark and eerie the trip would be.”


March 20, 2019
The Spring 2019 Issue of VFX VOICE is here!

Read all the articles from the Spring 2019 issue of VFX VOICE! (VFX Voice)

The Building Blocks Behind THE LEGO MOVIE 2

Animal Logic on the technological leaps created specifically for THE LEGO MOVIE 2: THE SECOND PART. (VFX Voice)

DNEG Expands TV Operations in Vancouver and Montreal

DNEG is expanding its BAFTA-winning TV VFX offering in North America to meet growing demand for its services. (DNEG)

Laika Animation Wizards Talk Studio’s Most Ambitious Film Yet, Missing Link

Missing Link writer-director Chris Butler, producer Arianne Sutner and the heads of department at Laika Studios are applying the finishing touches to the animation studio’s fifth film. (Screen Daily)

Vancouver’s Animation and Visual Effects Industry Will Be Worth $1 Billion in 2019

2019 is anticipated to be a major milestone year for Vancouver’s animation and visual effects industry, as the sector is on track to crack the $1 billion mark for the first time. (Daily Hive)

Epic Games Chooses Final Round of Unreal Dev Grant Recipients

Epic Games announced the final recipients of its Unreal Dev Grants program, totaling $500,000 in financial assistance for developers working with Unreal Engine 4. (Variety)

Foundry’s Innovative Offering to Creative Industries Attracts New Owner

Foundry announces that it will be acquired by Roper Technologies, Inc., a diversified technology company and a constituent of the S&P 500, Fortune 1000, and Russell 1000 indices. (Foundry)

The Third Floor Announces Story Attic Content Initiative

Leading visualization studio launches online showcase for original science fiction and fantasy-themed digital comics, illustrations and videos created by leading comic book artists and illustrators. (Animation World Network)

Google Spotlight Stories Shuts Down

Google Spotlight Stories, the experimental unit of Google’s Advanced Technology and Projects group (ATAP) that explores immersive and interactive 360 storytelling, is shutting down after six years. (Cartoon Brew)

Playing To Win: How Sandbox VR’s Founder Invested Everything To Build A New Virtual Reality World

Steve Zhao had a new vision: a virtual reality world where people could experience games by playing with friends, rather than by wearing a headset alone in their homes. (Forbes)
