
Avatar 2 and beyond: Is CGI actually getting worse?

Technologically, the answer is definitely no. But that doesn’t mean CGI is always used to good effect.
Key Takeaways
  • When CGI is convincing, it’s usually because the scenes include some elements from the real world to keep the fake imagery grounded.
  • Bad CGI is the antithesis of immersion. When the physics of a person’s movements seem unnatural, or when it’s clear that people or objects are not really in the environment depicted on the screen, you sense it, consciously or subconsciously.  
  • One common criticism of Hollywood’s use of CGI is that it was once a complementary dish, but now it’s the main course.

A technologically unprecedented spectacle. That seemed to be the consensus among audiences and critics alike when Avatar premiered in 2009. Never mind the critiques of the blockbuster’s run-of-the-mill story — Avatar undeniably brought a new level of photorealism to sci-fi epics through computer-generated imagery (CGI), stereoscopic filming techniques, and state-of-the-art motion-capture technology.

That was 13 years ago. How has CGI improved since?

It depends whom you ask. One answer, or at least a 109-second glimpse of one, comes in the recently released trailer for Avatar: The Way of Water, which is set for release in December 2022 and will be followed by three more entries in the series. 

Reactions to the CGI featured in the trailer ranged from “stunning” to “looks like a crappy cartoon.”

More objectively, the $250 million sequel does seem poised to up the ante on CGI through its use of new motion-capture technology, in which actors wear suits outfitted with markers so computers can record their movements and overlay digital animation on top of them.

The motion-capture tech in the Avatar sequel was specifically designed to cleanly capture actors’ movements underwater. Director James Cameron told Collider that it took more than a year for the new system to overcome the optical problems that cameras face when trying to film underwater; he said that the interaction between air and water creates a “moving mirror [that] reflects all the dots and markers” that actors wear as part of motion-capture suits.
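
The basic mechanic described above, markers on a performer driving a digital character, can be illustrated with a toy calculation. The sketch below is a minimal Python illustration with made-up marker positions and a single joint; it is not how Cameron’s system or any studio pipeline actually works. It recovers the angle at one joint from three markers, the kind of per-frame value that gets transferred onto a digital rig.

```python
# Toy illustration only: recover one joint angle from three (invented)
# marker positions, the kind of per-frame value a motion-capture system
# would transfer onto a digital character's rig.
import numpy as np

# One frame of hypothetical marker positions, in meters.
markers = {
    "shoulder": np.array([0.00, 1.40, 0.00]),
    "elbow":    np.array([0.25, 1.15, 0.05]),
    "wrist":    np.array([0.45, 1.30, 0.10]),
}

def joint_angle(a, b, c):
    """Angle (in degrees) at point b, formed by the segments b->a and b->c."""
    u, v = a - b, c - b
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# This per-frame angle would be copied onto the digital character's elbow,
# so the CG arm bends the way the actor's arm did.
elbow = joint_angle(markers["shoulder"], markers["elbow"], markers["wrist"])
print(f"Elbow angle this frame: {elbow:.1f} degrees")
```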

But why go through the hassle of filming underwater — not to mention training actors to hold their breath for minutes at a time — when visual effects artists could have simulated the underwater scenes through CGI? 

Fusing the real with the fake: When CGI is convincing, it’s usually because the scenes include some elements from the real world to keep the fake imagery grounded, such as real people or real environments. 

Compare the new Avatar trailer to scenes from the 2018 movie Aquaman, for example. Unlike Avatar, the Aquaman filmmakers shot the “underwater” scenes using a technique called “dry for wet,” in which actors were suspended from wires and filmed against a blue screen in a studio. Lost in this technique are the more realistic physics and lighting effects gained from filming in actual water. 

The same rule applies above ground. In 2021’s Dune, much of the film’s outdoor scenes were shot in the natural sunlight of the deserts of Jordan and the United Arab Emirates, imbuing the shots with realistic lighting. To further boost realism, the filmmakers used actual helicopters to depict the ornithopters in order to capture how the aircraft kick up sand in real life. 

CGI was obviously used to render the ornithopters, giant sandworms, and other impossible shots. But the filmmakers inserted these visual effects after filming in real environments against a “sandscreen” — a greenscreen modified to better match the colors of the desert.

In contrast, the visuals of completely computer-generated environments tend to look unrealistic. A character’s face might keep the same lighting even after they step into a dark corridor or bombs start exploding around them.
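
To make that lighting complaint concrete, here is a deliberately simplified diffuse-shading sketch in Python. The numbers and the single-light model are assumptions chosen for illustration, not a description of any film’s renderer; the point is only that a surface’s brightness should scale with the light actually reaching it, so a face shaded for a bright exterior reads as wrong when pasted into a dark corridor.

```python
# Simplified Lambertian (diffuse) shading: apparent brightness scales with
# the light reaching the surface. All values here are invented examples.
import numpy as np

def shade(albedo, normal, light_dir, light_intensity):
    """Diffuse brightness of a surface point under a single directional light."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    return albedo * light_intensity * max(0.0, float(np.dot(n, l)))

normal = np.array([0.0, 0.0, 1.0])   # surface facing the camera
light  = np.array([0.3, 0.5, 1.0])   # key light direction

bright_exterior = shade(0.8, normal, light, light_intensity=1.0)
dark_corridor   = shade(0.8, normal, light, light_intensity=0.1)

print(f"Face brightness outdoors: {bright_exterior:.2f}")
print(f"Face brightness in the corridor: {dark_corridor:.2f}")
# If a composite keeps the outdoor value while the background uses the
# corridor value, the character looks pasted in rather than present.
```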

Physics also helps to keep CGI grounded. With motion-capture technology, for example, digital graphics are overlaid atop the facial and body movements of real people, as in films like Avatar, War for the Planet of the Apes, and Avengers: Infinity War. 

But when characters are rendered completely through CGI, as was the case in one scene from the final battle in Black Panther, the results can look cartoonish. That’s because it essentially is a high-budget cartoon. 

After all, no cameras are used in scenes that are completely computer-generated. That can make for a jarring viewing experience, where the audience’s point of view is moving around in ways that would be impossible for a conventional camera rig.

So, is CGI getting worse? Technologically, the answer is definitely no. Compared to 2009, visual effects artists have access to better and more affordable software and motion-capture techniques. But that doesn’t mean CGI is always used to good effect. 

CGI overuse: One common criticism of Hollywood’s use of CGI is that it was once a complementary dish, but now it’s the main course. 

In 1993’s Jurassic Park, for example, Steven Spielberg decided to use CGI instead of stop-motion animation (the more conventional method at the time) to render about half of the dinosaurs seen in the movie. It was among the first movies to convincingly depict animals through CGI.

And, surprisingly, the CGI still holds up. Compare the 1993 movie to 2015’s Jurassic World: which feels more immersive? Jurassic World might have slicker CGI, but Jurassic Park is arguably more immersive — a feat achieved through balancing some pretty impressive CGI with controlled animatronics, puppets, and clever photography. 

To put it more objectively: Jurassic Park featured about six minutes of CGI, while Jurassic World included about 40 times as many CGI shots.

That’s not to say that more CGI necessarily equals a worse movie. It’s more about how it’s used. After all, most modern films, even romantic comedies and dramas, use CGI in ways you might not even notice, such as to edit props out of the background, populate an empty stadium with roaring crowds, or depict a car accident. 

Even today’s best practical-effects movies use CGI. For example, 2015’s Mad Max: Fury Road was lauded specifically for not leaning on CGI and instead opting for good old-fashioned stunts, pyrotechnics, and practical effects, even though about 2,000 of the movie’s 2,700 shots contained CGI in some form. 

Why did it go over so well with critics, many of whom seemed exhausted with CGI around 2015? It could be that the CGI was used to complement (mostly) real shots, not to manufacture reality itself. 

Here’s how Tenet director Christopher Nolan put this concept to the Directors Guild of America in 2012: “There are usually two different goals in a visual effects movie. One is to fool the audience into seeing something seamless, and that’s how I try to use it. The other is to impress the audience with the amount of money spent on the spectacle of the visual effect, and that, I have no interest in.”

Money and coordination problems: Granted, movies like Fury Road and Avatar were backed by visionary directors and mega budgets. That’s a rarity. For lower-budget movies, or movies for which studios want to spend as little as possible, CGI often winds up looking clunky and unrealistic. 

It’s typically not the fault of visual effects artists. The problem lies in the business model. At the beginning of a movie’s production, a studio might contract a visual effects company to generate 500 shots for $10 million as part of a “fixed bid.” 

But the studio will often demand changes to those 500 shots, changes that visual effects artists might spend 100 hours per week implementing. Why? It could be creative decisions, misunderstandings, production setbacks, or the fact that some modern movies begin production without a definitive third act in mind.

Whatever the reason, the studio does not pay additional money for those changes; from its perspective, it paid for 500 shots, changes or no changes. Ultimately, the visual effects company gets stuck absorbing the extra costs and doing the extra labor. 
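
A back-of-the-envelope sketch makes the squeeze visible. It reuses the article’s hypothetical figure of 500 shots for $10 million; the revision counts are invented for illustration, but the arithmetic shows how unpaid change requests dilute what the vendor earns per delivered version of a shot.

```python
# Back-of-the-envelope view of a fixed bid: 500 shots for $10 million
# (the article's hypothetical), with invented counts of unpaid revisions.
BID_DOLLARS = 10_000_000   # fixed price the studio pays
SHOTS = 500                # shots the vendor agreed to deliver

for revisions_per_shot in (0, 1, 2, 4):
    versions_delivered = SHOTS * (1 + revisions_per_shot)
    per_version = BID_DOLLARS / versions_delivered
    print(f"{revisions_per_shot} unpaid revisions per shot -> "
          f"{versions_delivered} versions delivered, "
          f"${per_version:,.0f} earned per version")
```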

To visual effects artist Jack Fulmer, this uncoordinated and overly demanding workflow often makes movies worse.

“If you’re not a visionary, or if you don’t have a visionary involved in a project, and it relies heavily on visual effects, it’s not going to succeed,” Fulmer said in Life After Pi, a 2014 documentary about the visual effects company Rhythm & Hues Studios, which won an Academy Award for its work on 2012’s Life of Pi just weeks after declaring bankruptcy. 

There could be a very simple reason why CGI might seem as if it’s getting worse: we’re no longer impressed by it. After enough action blockbusters and entries in the Marvel Cinematic Universe, we might not notice incremental improvements in CGI, or maybe we only notice CGI when it’s glaringly bad.

Bad CGI is the antithesis of immersion. When the physics of a person’s movements seem unnatural, or when it’s clear that people or objects are not really in the environment depicted on the screen, you sense it, consciously or subconsciously.  

What remains to be seen is whether CGI will become so polished that these optical stumbling blocks are no longer an issue. To Christopher Nolan, that’ll never happen. 

“However sophisticated your computer-generated imagery is, if it’s been created from no physical elements and you haven’t shot anything, it’s going to feel like animation,” Nolan told the Directors Guild of America.

Then again, nobody ever complained about the lack of hyperrealistic physics in Toy Story.

This article was originally published on our sister site, Freethink.
