Computer Generated Imagery (CGI): VFX Rise or VFX Plateau?

Shaan Gaurav
13 min read · Feb 8, 2020
Images Courtesy of 20th Century Fox and Walt Disney Studios

Computer generated imagery, or CGI, is now used so heavily that it is often impossible to tell what was actually filmed and what was pure fabrication. Nearly every major movie in the last decade has used at least some CGI. Yet many people today badmouth filmmakers who act as if CGI excuses them from delivering a proper narrative and well-developed characters. And filmmakers go to extremes to prevent visual effects from looking poor against a real background, smothering their own films in suffocating filters and obnoxious color grading so all the elements blend together more easily. Even real things in some movies look artificial. This raises the question: have we reached a CGI plateau?

It would seem that despite some setbacks, CGI is still getting better, and it is going to keep getting better. The real problem, however, is that the VFX industry is so over-compartmentalized that even the simplest wire removal passes through something like ten different people, when it could be done with free software in roughly two hours. The reason productions are bloated with personnel in the first place is usually time. Time is the most important element, whether the effects are CGI or practical. Investors push deadlines to garner a quick return on their investment, so studios lean on CGI because it delivers results faster for investors who probably aren't cinephiles. Practical effects can look bad as well if done hastily; both require time to polish. That said, CGI effects feel heavily overused today. Practical effects should be used wherever they can be, with CGI there to fill in the gaps. No matter how good CGI can be made to look, too often it comes across as awkward.

Of course, it's important to note that many great movies would be different, or never made at all, if CGI hadn't come to pass. And it's not as if using it is easy: CGI requires just as much artistic merit and creative thinking as model work does. Just look at the digital effects in Rise of the Planet of the Apes, Terminator 2: Judgment Day, or Jurassic Park. There are simply some things practical effects cannot do. The CGI in those movies is so photorealistic it's hard to believe that little to none of it was practical. Even people who do practical effects will say that the best look comes from starting practical and slowly adding small CG details.

Shooting something in real life is almost always easier than fixing it all with CGI afterwards. That is why, in a composite shot, effects artists prefer to have as many real backgrounds and real elements as possible. CGI is great for impossible effects, and that is what it should be reserved for; the rest should display the craftsmanship of practical effects.
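That preference for real elements comes down to how composites are assembled: the CG element is laid over the filmed plate using the standard "over" operator, weighted by the element's matte (alpha). A minimal per-pixel sketch in Python (the function name and values here are illustrative, not taken from any production tool):

```python
def over(fg, bg, alpha):
    """Straight-alpha "over" composite for one RGB pixel: lay the CG
    element (fg) over the filmed plate (bg), weighted by its matte."""
    return tuple(f * alpha + b * (1.0 - alpha) for f, b in zip(fg, bg))

plate = (0.2, 0.4, 0.6)  # filmed background pixel (RGB, 0..1)
cg = (1.0, 0.0, 0.0)     # CG element pixel

print(over(cg, plate, 1.0))  # fully opaque matte: CG replaces the plate
print(over(cg, plate, 0.5))  # soft matte edge: a 50/50 blend with the plate
```

Every real pixel kept in the plate is a pixel the over operator leaves untouched, which is one reason starting from real photography and filling gaps with CG tends to hold up better than building the frame from scratch.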

There is plenty of good CGI usage in movies. The real problem people have with it is that there is too much mediocre CGI out there; it isn't applied properly often enough. The goal is to fool the audience into thinking that what they're seeing wasn't computer generated. Overuse of CGI and VFX, or CGI and VFX that is too obvious, ruins the illusion: the audience knows that what it is seeing is fake.

People don't usually bash the artists and animators (unless a technique is overused so much that it becomes almost a defining trait, as with Michael Bay and even J.J. Abrams' "lens flares"); what they hate is the overuse of CGI in film when it's plainly unnecessary and practical effects would look much better. For example, a web article published by Popular Mechanics explains how the team behind Transformers (2007) actually constructed a full-sized Bumblebee puppet for certain scenes. In the first movie everything felt more real because rust was purposely incorporated into the Transformers' designs; Bumblebee is even first introduced as a rusty old Camaro. In the sequels the designs became shinier and more polished, and everything feels more fake. CGI was used for the vast majority of scenes featuring the robots, but many of their interactions on set, and especially the actors' interactions with their surroundings, were done as practically as possible. The point is to get the CGI to feel real, not just look real. In some instances, CGI still looks real but feels fake.

Before CGI Went “Bad” — And Why It’s Good

Image Courtesy of T2 Productions

The limitations of the technology in the early 90s are what pushed the most creative filmmakers to think laterally and come up with solutions. This is why films like Jurassic Park and Terminator 2: Judgment Day endure far better than much of the CGI of the early-to-mid 2000s, whose filmmakers had zero concept of restraint and could not grasp that effects are a tool for advancing plot, not the plot itself. Granted, CGI had already begun to develop during the 80s with films like Tron, but it was still considered niche.

Innovative directors of the 40s and 50s like Edward Dmytryk worked the same way, finding routes around the limits of their technology to present a compelling story with a focus on plot. James Cameron is another example from the 80s and 90s, and even today his use of CGI is praised in films like Avatar and Alita: Battle Angel. And David Fincher is another good craftsman who uses CGI sparingly to advance the plot without overdosing the audience, as in Zodiac.

James Cameron first dabbled with CGI in The Abyss, with the "water tentacle" that formed a face, which became the stand-out scene of that movie. That scene took months to render. The team had to work out animation, surface reflection, and facial data from scratch; the process was far more involved than it would be by today's standards, as CGI was still a relatively new concept. But perhaps most importantly, the actors had to react to it as if it were actually there. Because, in Cameron's own words, "The effects, no matter how good they are, are only believed by the audience if the characters in the scene appear to be believing in the effects."

But it was really Cameron's Terminator 2: Judgment Day that became the breakthrough in computer graphics, taking the art and science of special effects to a whole new level. The concept of a "liquid-metal man" (the T-1000) was an idea Cameron had had roughly ten years before making the film. There was no way to realize it at the time; Claymation was seen as the best option back in the 80s. The Abyss was proof to Cameron that his vision was now achievable. Even so, it was a huge gamble, because CGI was still unproven as a credible filmmaking tool and the team was given a short timeframe to work with. T2 was special when it came out because it was the first film in which CGI and practical effects were combined to work in unison. In fact, T2 only had roughly 47 CG shots.

It was a very delicate balancing act: determining what would be done digitally and what would be done as a real-life effect, and what each team would need from the other to make the shots work. When the T-1000's abilities went beyond what was possible with animatronics, makeup, and prosthetics, ILM was tasked with creating those effects digitally, redefining the standards for CGI. It began simply as a way of coming up with new effects.

If ILM had not done The Abyss and T2, they wouldn’t have done Jurassic Park.

Speaking of which…

In The Making of Jurassic Park, Spielberg explains that he wanted to bridge the CGI dinosaurs with the mechanical ones so that the film would completely immerse the audience. Industrial Light & Magic convinced Spielberg that CGI was the way to go on Jurassic Park. This was revolutionary not just for what audiences would come to expect from movies, but also for how it would help filmmakers get their movies made.

When making Jurassic Park, ILM had to break a lot of new ground. The process these filmmakers went through to create the dinosaurs was a milestone in VFX. Prior to this film, the best visual effects were those seen in movies like The Abyss and Terminator 2: Judgment Day. They were impressive, to be sure, but they didn't come anywhere close to an entirely photorealistic animal that behaved and moved like the real deal.

Of course, arguably the biggest leap in VFX in recent memory is James Cameron's Avatar. When making Avatar, the intent was to capture the whole performance of the actor in order to get its subtle nuances. Avatar was the first time real-time performance capture was employed in a direct filmmaking sense. The cast and crew could be very improvisational because they weren't bound by a physical set; Cameron had more flexibility on the virtual production stage than he had during the live-action shoot. He wanted to reproduce full human emotion in a CG character, and Avatar would have failed had they not solved the dilemma of the "dead eye."

And Now?

Following the tremendous success of The Abyss, Terminator 2: Judgment Day, Jurassic Park, and Avatar came a misunderstanding of that success in Hollywood. People were under the impression that if they simply made a film with those kinds of CGI effects, it would automatically be successful, which, as we now know, isn't necessarily the case. The movies that tried to imitate them weren't breaking new ground or using the effects in new ways. They were just empty cash grabs.

Now that Hollywood is all about box office and gross receipts in overseas markets like China (where CGI sells and dialogue is irrelevant because it gets subtitled anyway), you will see less innovative filmmaking and more of this cash-grab variety: a thin story hung between a thousand CGI shots, with storytelling and writing taking a backseat to showy visual effects. Not that I am against CGI when it is used as a story adjunct. But watch 12 Angry Men, or The Verdict, or any compellingly written film, and you will learn that good writing and good dialogue are infinitely more powerful than the most expensive CGI setup.

Then again, the "it's all about CGI now; the story doesn't matter anymore" line is a tired, cherry-picked argument that doesn't really hold water. Of course that doesn't stop people from parroting it, because it "feels true."

What is true is that there have always been studios and filmmakers that love movies for the spectacle rather than the story. That isn't the fault of CGI; we humans tend to project and blame outside entities. The problem, as always, is mediocrity: some filmmakers have the craft and some don't. There are plenty of filmmakers who use CGI to great effect in otherwise solid, awesome movies.

The main thing is that it's easy to do CGI now, but super hard and time-consuming to do it well. Lighting and border effects are the big ones. People complain about noticeable CGI, but what I'm seeing more and more is how much CGI goes unnoticed and therefore draws no complaints. Nearly every major movie in the last decade has used at least some CGI, and I am curious just how many movies apply it so subtly that no one notices.
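One concrete way "border effects" go wrong is alpha handling. CG renders typically come premultiplied by their matte, and pushing premultiplied pixels through the straight-alpha formula applies alpha twice, visibly darkening every soft edge (motion blur, hair, smoke). A toy single-channel sketch (the values are illustrative):

```python
def over_straight(fg, bg, alpha):
    # Correct for straight (unpremultiplied) foregrounds.
    return fg * alpha + bg * (1.0 - alpha)

def over_premult(fg_premult, bg, alpha):
    # Correct for premultiplied foregrounds: fg already carries its alpha.
    return fg_premult + bg * (1.0 - alpha)

alpha = 0.5              # a soft matte edge, e.g. motion blur on a CG element
fg = 1.0                 # the element itself is pure white here
fg_premult = fg * alpha  # what a premultiplied render actually stores: 0.5
bg = 0.0                 # black plate

print(over_premult(fg_premult, bg, alpha))   # 0.5: the intended half-bright edge
print(over_straight(fg_premult, bg, alpha))  # 0.25: alpha applied twice, a dark fringe
```

Getting this right on every element in every shot is part of why polished CGI takes so much time: the math is simple, but one mismatched convention is enough to make an otherwise good composite read as fake.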

Avengers and Star Wars: CGI Done Seamlessly?

Images Courtesy of Marvel Studios, LucasFilm, & Walt Disney Pictures

Some of the most CGI-heavy films in the modern film climate are those released by Lucasfilm and Marvel Studios. Yet, interestingly enough, in essentially every critical review of Star Wars: The Force Awakens, Star Wars: The Last Jedi, Avengers: Infinity War, and Avengers: Endgame, there is not one mention of the blatant use of CGI in any of them. Have these films finally achieved what many others are still struggling with: CGI as a "normal" tool in filmmaking?

The Lion King remake stands in stark contrast: it is debatable whether it can even be considered "live action," as every single character onscreen is a CGI creation. Practically every review of the remake that mentions the VFX praises the effects but heavily criticizes the film for its lack of an engaging plot, or for offering nothing new beyond photorealistic animals (which cannot convey the same intense emotions as their 2D animated counterparts).

This raises the question: is the use of CGI independent of critical reception? The Last Jedi, for example, received heavily mixed reviews, but still no one cites that movie's CGI as a problem; the issues are only with the handling of the characters and story. Some people dislike the Marvel movies for being what they feel is too formulaic, but they don't aim any of their criticism at the CGI those films use, merely at the plots and stories themselves.

Alita: Battle Angel: An Unfair CGI Bias?

Image Courtesy of 20th Century Studios, formerly 20th Century Fox

A little background on my experience with the following movie: in early February of 2019, I saw Alita: Battle Angel without knowing much about it at the time. I had low expectations and was just there to see something for the sake of seeing something, not to specifically see Alita: Battle Angel. But I was genuinely, pleasantly surprised.

In the early advertisements for Alita: Battle Angel, like many other people, I was concerned about the main character's big eyes. My general notion was that this would have the effect of "Superman's Upper Lip": a jarring distraction. More specifically, I was concerned that they were taking the wrong influence from the source material. In manga and anime, big eyes are commonly used to express deeper emotion or thought in a character, to make us sympathize with them and understand them better. This technique is also utilized in many Disney 2D and CG animated films. But I just couldn't see it translating well into "live-action."

And yet in the end, I left the theater feeling that they had pulled it off seamlessly. While I admittedly found Alita's eyes somewhat odd and distracting at first, they grew on me as the movie progressed. The eyes draw attention to how different she is from everyone around her in attitude, capabilities, and, for that matter, origin. I can see now why they made that stylistic choice: it makes her stand out from every other character in the movie, even when she's dressed in casual clothing and playing alongside other teens. In fights where everything happens so fast, her eyes make her expressions clear and readable, which is an important cinematic quality. This CGI "live-action" version of Alita is thus very mesmerizing and highly likeable.

In other words, the thing I expected to turn me off actually drew me closer to the character of Alita. Upon finishing my viewing, it occurred to me that I (and many others) had a predetermined mindset that the movie would be underwhelming, disappointing, or even outright bad, and that we had based so much of this solely on the CGI we saw in the trailer. This forces me to ponder whether we have come to judge CGI somewhat unfairly, purely for being used in a movie at all.

Alita is so lifelike not only because she is one of the most advanced 3D models created to date, but because her every bodily and facial movement is driven by the motion performance of a real actress whose features are merged with the model. The blend of real shots and CGI is vivid and striking, with nods to the original source material (as I would come to learn in the From Manga to Screen featurette).

Having seen the movie in IMAX 3D, I can say it definitely had to be seen on the big screen. Too many 3D movies these days force the 3D aspect of the visuals; Alita was one of the few instances where it genuinely added depth to the overall experience. Clearly, Rodriguez and Cameron wanted to make Alita: Battle Angel as faithful to the source material as possible, and the movie shows it, with the over-the-top aesthetic making the world feel real and bold (especially given Rodriguez's previous experience with CGI in the Machete and Spy Kids films).

Overall, this was a fun viewing experience, and I feel that to justify extensive CGI use in films moving forward, we need more films like this one: films that utilize CGI in a way that feels organic to the story and world, made by directors and writers who care deeply about whatever it is they are making.

Final Consensus

CGI blurs the line between the virtual world and reality. Highly controlled and curated, these digital creations evoke a strong interest among moviegoers; they may not be real, but their impact is. The best thing for the future of the film industry is to strike a happy medium between CGI and practical effects. Take filmmakers like Guillermo del Toro, Christopher Nolan, and J.J. Abrams: all great directors who know how to use both CGI and practical effects to the best of their ability. There is indeed a lot of bad CGI out there, and plenty of filmmakers use it when they don't need to, which is really upsetting. But digital effects can be very helpful when done right, and I honestly think that if more filmmakers learn to properly combine practical and CGI effects, we will get better movies going forward. The success of a movie does not depend solely on computer-generated effects; the story, the preparation, the length of filming, and the director's vision also play a big part. So, is the use of CGI a VFX rise or a VFX plateau? Ultimately, that will be determined by how the majority of filmmakers choose to utilize it.
