How Special Effects Artists Use STEAM to Create Imaginative Visions

Special effects in film encompass a wide range of tools, techniques, and ideas. From real-life explosions that singe the eyebrows to digital dragons that swoop in and save the hero at the last second, it’s hard to imagine most movies without them.

Technology has dramatically changed the movie-going experience and how we create films. The movie industry will continue to change as new special effects technology is developed. Here’s how special effects artists tap into their STEAM (science, technology, engineering, art, and math) skills to create impressive movies that keep audiences coming back.

Computer-Generated Imagery (CGI) Brings Characters to Life

One of the most common forms of special effects is computer-generated imagery. From a character losing a leg in a horror flick to actors walking around a fictional world, CGI is almost impossible to escape at the cinema.

You might not realize it, but there’s a stark difference between CGI and animation, the team at Geek Extreme explains. The term animation historically meant “a sequence of hand-drawn cells that were played in order.” The 1928 short film Steamboat Willie, which is often considered the debut of Mickey Mouse, was a hand-drawn animated film.

CGI, on the other hand, initially described the addition of digital elements to films. Today, most people use the term for movies rendered in computer animation software, which are often predominantly digital.

But it wasn’t always that way. YellowDog, a platform that speeds up cloud rendering for 3D and VFX (shorthand for visual effects), recently created a guide on the history of CGI and how it has evolved. CGI got its start in MIT labs in the 1960s and kept evolving through the 1970s and 1980s until the technology was ready for blockbuster hits.

Toy Story Created Countless Rendering Headaches

One of the most notable animated films on the list, and in recent memory, is Toy Story. YellowDog reports that the production team thought they could make the movie with eight animators but ended up needing 33. The special effects artists also estimated it would take 20 months to render the movie; it actually took 300. The average frame took seven hours to complete, and the final film had 114,000 frames. Most editors and producers are grateful that technology has evolved beyond those constraints.
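
For the curious, the math behind those numbers is straightforward, and it shows exactly why render farms exist. Here’s a quick back-of-the-envelope sketch in Python using the figures above (the machine count is hypothetical, purely to illustrate how parallel rendering changes the wall-clock time):

```python
# Back-of-the-envelope render math using the Toy Story figures above.
frames = 114_000          # total frames in the finished film
hours_per_frame = 7       # average render time per frame

total_machine_hours = frames * hours_per_frame
print(f"Total render work: {total_machine_hours:,} machine-hours")
# -> 798,000 machine-hours, roughly 91 years on a single machine

# Spreading the work across a render farm is what makes this feasible.
# The machine count below is hypothetical, just for illustration.
machines = 100
wall_clock_days = total_machine_hours / machines / 24
print(f"On {machines} machines: about {wall_clock_days:.1f} days")
# -> about 332.5 days, under a year of continuous rendering
```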

This isn’t to say that it has been an easy road for CGI. Fans often love to make fun of bad special effects just as much as they appreciate good creations. Jonathan Figueroa at Screen Rant lists 15 movies in recent memory ruined by bad CGI. You don’t have to be a film expert to appreciate these tragic attempts to use technology to fill the gaps in physical special effects.

Good CGI Can’t Make Up for a Bad Plot

At the end of the day, however, it doesn’t matter how good the CGI and special effects are if the plot is full of holes or the acting is terrible. All elements have to work together to create a great end product.

“We know how good CGI is when it’s done well, but at the core always needs to be a good storyline,” Bloggers Academy’s Patric Morgan writes. “Just as bad writing distracts the reader from the story, over-the-top or unrealistic CGI guarantees walkouts at the cinema.”

Audiences were impressed by Toy Story’s special effects, but they were also moved by the tale, which is why it’s still a favorite movie of both adults and children.

Motion Capture Technology Has Also Evolved

Along with CGI, movie directors are increasingly turning to motion-capture technology to bring their stories to life. The company Xsens brands itself as a leader in 3D motion-tracking technology and explains exactly what these tools are and how they are used:

“Motion capture is a way to digitally record human movements,” they write. “The recorded motion capture data is mapped on a digital model in 3D software so the digital character moves like the actor you recorded.”

Essentially, an actor moves around, reacts to stimuli, and makes facial expressions, all of which are recreated by a digital model.
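
Under the hood, that mapping step is mostly geometry: joint angles recorded from the actor are replayed on the digital character’s skeleton, whatever its proportions. Here’s a deliberately tiny sketch of the idea in Python; the two-bone “arm” and the sample angles are made up for illustration and stand in for the full skeletons real pipelines use:

```python
import numpy as np

def rotation_2d(angle_deg: float) -> np.ndarray:
    """2D rotation matrix for a joint angle given in degrees."""
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), -np.sin(a)],
                     [np.sin(a),  np.cos(a)]])

def pose_arm(shoulder_deg: float, elbow_deg: float,
             upper_len: float = 1.0, fore_len: float = 0.8):
    """Forward kinematics for a toy two-bone arm.

    The captured joint angles (from the actor) drive the digital
    character's bones, whatever proportions that character has.
    """
    shoulder = np.zeros(2)
    # The upper arm rotates about the shoulder.
    elbow = shoulder + rotation_2d(shoulder_deg) @ np.array([upper_len, 0.0])
    # The elbow angle is relative to the upper arm's direction.
    wrist = elbow + rotation_2d(shoulder_deg + elbow_deg) @ np.array([fore_len, 0.0])
    return shoulder, elbow, wrist

# One frame of hypothetical captured data: shoulder raised 30°, elbow bent 45°.
for name, point in zip(("shoulder", "elbow", "wrist"), pose_arm(30, 45)):
    print(name, np.round(point, 3))
```

Because the angles, not the positions, are what get transferred, the same recorded performance can drive a character with longer or shorter limbs than the actor, which is exactly what lets a human performance become a nine-foot villain.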

You Can See Motion Capture Technology in Today’s Box Office

Thanos, the villain of Marvel’s Avengers: Infinity War, is a digital creation built with the help of actor Josh Brolin and a team of special effects artists. In an article for Digital Trends, Rick Marshall interviewed the film’s effects team to discuss how technology affected the production of the film and even Brolin’s acting.

Visual effects supervisor Kelly Port said they used a combination of performance-capture technology and digital animation to create the villain. When Brolin saw how responsive the technology was to his movements, and understood he didn’t have to exaggerate for the results to show up on screen, he made his performance subtler. The end product is one of the most outstanding villains, digital or otherwise, on the big screen today.

While Thanos is an impressive motion-capture creation, he’s hardly the first. Jordan Adcock at Den of Geek highlights some milestones of this technology, including the velociraptors in 1993’s Jurassic Park, Jar Jar Binks in The Phantom Menace, and Gollum in The Lord of the Rings.

Gollum was originally supposed to be just a voice-over role, but when director Peter Jackson noticed how closely actor Andy Serkis tied his movements and facial expressions to the character, he put Serkis in a motion-capture suit so the performance could be recorded and recreated digitally.

Many Believe Motion-Capture Actors Deserve Credit for Their Work

While these performances are impressive, many argue that the actors are not given their fair due. For decades, motion-capture stars have been passed over because they are not viewed as actors in their own right. Because their characters change so much by the time they reach the screen, the Academy Awards and other nominating committees treat actors like Serkis as part of the special effects team.

More people than ever are speaking up, arguing that motion-capture actors need serious acting chops and skill to play their roles.

“Is there really a difference between motion capture and extensive make-up and costuming?” Amy Chambers, Ph.D., asks in defense of Serkis. “Surely the Oscars can find a way of truly recognising the VFX teams while also celebrating the unique contribution of the actors.”

Similarly, Josh Brolin changed his acting style as Thanos because of the technology. Had he not, the character might not have been as believable, nor the movie as well received as fans and critics say it is, which is a testament to his skill as an actor.

Machine Learning in Special Effects Is on the Rise

While technology related to CGI and motion-capture tools is certainly impressive on its own, developers are also tapping into machine learning and deep learning tools to improve editing processes and make movies look even better. For example, the team at Foundry has two projects that make VFX easier with the help of deep learning:

  • The company trained a machine learning system to identify images and clean them up, a task editors have been handling since the 1980s. Instead of a human completing the work, the software handles it (a classical baseline for this kind of clean-up is sketched after this list).
  • To solve lighting problems in VFX, deep learning tools that can predict how lighting should look in various situations reviewed thousands of images shot under different lighting conditions and learned to recreate that lighting during editing.
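
Foundry’s actual tools rely on trained neural networks, but the clean-up task itself is easy to picture. As a point of comparison, here is the classical, non-ML baseline: a median filter that scrubs dust specks and dead pixels from a frame (pure NumPy, with a hypothetical toy frame):

```python
import numpy as np

def median_clean(frame: np.ndarray, radius: int = 1) -> np.ndarray:
    """Classical clean-up baseline: replace each pixel with the median
    of its neighborhood, which suppresses dust specks and dead pixels.

    `frame` is a 2D grayscale array; edges are handled by padding.
    """
    padded = np.pad(frame, radius, mode="edge")
    h, w = frame.shape
    # Collect every shifted view of the neighborhood, then take the median.
    windows = [padded[dy:dy + h, dx:dx + w]
               for dy in range(2 * radius + 1)
               for dx in range(2 * radius + 1)]
    return np.median(np.stack(windows), axis=0)

# A toy frame with one "dust speck" (hypothetical data).
frame = np.full((5, 5), 0.5)
frame[2, 2] = 1.0                 # the speck
print(median_clean(frame)[2, 2])  # -> 0.5, speck removed
```

The appeal of the deep learning version is that it can learn which details are defects and which are real texture, something a blanket filter like this cannot do.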

Foundry isn’t alone in tapping into machine learning. The computer software company Arraiy is developing an AI-driven production platform for film and TV. According to the team at Futurism, the company specializes in rotoscoping, the “process of separating certain footage from the background,” such as removing an actor from a green screen.

If technology can complete this process in a few minutes (instead of the hours it takes a human special effects team), then a director can see how a scene looks before they leave the set, and can even order additional takes until they get the look they want.
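
Arraiy’s platform uses machine learning precisely because real rotoscoping is hard, but the simplest version of the task, keying out a green screen, can be hand-coded in a few lines. The sketch below is that crude version, just to show what is being automated (the threshold and the sample frame are made up):

```python
import numpy as np

def green_screen_matte(frame: np.ndarray, threshold: float = 0.3) -> np.ndarray:
    """Crude chroma-key matte: True where a pixel looks like green screen.

    `frame` is an (H, W, 3) float RGB array in [0, 1]. A pixel counts as
    background when its green channel dominates red and blue by `threshold`.
    Production rotoscoping (and ML tools like Arraiy's) must also handle
    hair, motion blur, and cluttered backgrounds, which this cannot.
    """
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return (g - np.maximum(r, b)) > threshold

# Hypothetical 2x2 frame: one green-screen pixel, three "actor" pixels.
frame = np.array([[[0.1, 0.9, 0.1], [0.8, 0.6, 0.5]],
                  [[0.7, 0.5, 0.4], [0.9, 0.8, 0.7]]])
print(green_screen_matte(frame))
# [[ True False]
#  [False False]]
```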

Machine Learning Tools Are Still Works in Progress

Machine learning tools also show that movie technology still has a long way to go, and experts are quick to point out ways AI still needs to advance.

“In recent years, machine learning algorithms have already been surprising visual effects practitioners with the quality of the image results,” Rajat Roy, global technical supervisor at Prime Focus World, tells Cinefex. “I think the forthcoming generation of algos will focus not on jumping directly to a result, but rather on building control assets along the way to enable artists to choreograph the performance.”

While today’s AI tools might produce the best results based on history and math, developers want future tools that produce results based on artistic vision, or at least tools that let artists step in and create their ideal vision whether or not it agrees with the algorithm.

Some artists have noticed this progression and changed how they create their films. For example, the team at CinemaBlend reported that James Cameron delayed the sequels to his popular Avatar film, which debuted in 2009 and even inspired its own land at a Disney theme park, because the director needed technology to catch up to his creative vision. The next four films will be shot simultaneously now that Cameron has the technology he needs to solve his earlier problems.

Noticing where technology is, and where it’s about to go, can help special effects artists see what their future careers will look like, and maybe even what future movies will look like.

Digital Effects Aren’t the Only Way

While technology certainly paves the way for the development of outstanding movies, it isn’t essential for directors and actors to execute their creative visions. In fact, some people believe many directors and producers have developed an over-reliance on special effects and editing technology.

Quartz reporter Corinne Purtill shared an anecdote in which a makeup artist went to fix a blemish on an actor’s face. The director waved him off and said they could fix it in post-production editing. Something that takes a few seconds on set could take an hour in editing and might not even look better. Purtill raises the question: Would Jaws be the classic it is today had the shark not been an animatronic robot slashing back and forth?

Purtill isn’t alone. VFX writer Ian Failes says audiences still crave practical effects and real footage. They don’t necessarily want everything rendered digitally through CGI.

“Other studios behind 2017 releases have pronounced the extent of real photography involved in their movies,” he writes at Digital Arts. “The trailers for Wonder Woman and Dunkirk already have that sense of grit and grime designed to make them look more hand-crafted than some of the big effects-driven blockbusters.”

Keep in mind, Failes wrote this article at the start of 2017, before Dunkirk and Wonder Woman became two of the biggest films of that year.

Special Effects and Moviemaking Continue to Evolve

Chasing the newest technology isn’t a bad thing, and neither is using traditional methods. Directors, writers, and producers are always looking for the best possible ways to execute their visions and connect with audiences.

“When it comes to cinema, there is truly no ‘old’ way,” Kenan Proffitt at ActionVFX writes. “Every aspect of cinema was at one time new and groundbreaking; replacing something old.”
